Sample records for event recorder utilizing

  1. Utilization of independent component analysis for accurate pathological ripple detection in intracranial EEG recordings recorded extra- and intra-operatively

    PubMed Central

    Shimamoto, Shoichi; Waldman, Zachary J.; Orosz, Iren; Song, Inkyung; Bragin, Anatol; Fried, Itzhak; Engel, Jerome; Staba, Richard; Sharan, Ashwini; Wu, Chengyuan; Sperling, Michael R.; Weiss, Shennan A.

    2018-01-01

    Objective To develop and validate a detector that identifies ripple (80–200 Hz) events in intracranial EEG (iEEG) recordings in a referential montage and utilizes independent component analysis (ICA) to eliminate or reduce high-frequency artifact contamination, and to investigate the correspondence of detected ripples and the seizure onset zone (SOZ). Methods iEEG recordings from 16 patients were first band-pass filtered (80–600 Hz) and Infomax ICA was then applied to derive the first independent component (IC1). IC1 was subsequently pruned, and an artifact index was derived to reduce the identification of high-frequency events introduced by the reference electrode signal. A Hilbert detector identified ripple events in the processed iEEG recordings using amplitude and duration criteria. The identified ripple events were further classified and characterized as true or false ripples on spikes, or ripples on oscillations, by applying a topographical analysis to their time-frequency plots, confirmed by visual inspection. Results The signal-to-noise ratio was improved by pruning IC1. The precision of the detector for ripple events was 91.27 ± 4.3%, and its sensitivity was 79.4 ± 3.0% (N = 16 patients, 5842 ripple events). The sensitivity and precision of the detector were equivalent in iEEG recordings obtained during sleep or intra-operatively. Across all patients, the rates of true ripples on spikes, and also the rates of false ripples on spikes generated by filter ringing, classified the seizure onset zone (SOZ) with an area under the receiver operating characteristic curve (AUROC) of >76%. The magnitude and spectral content of true ripples on spikes generated in the SOZ were distinct from those of ripples generated in the non-seizure onset zone (NSOZ) (p < .001). Conclusions Utilizing ICA to analyze iEEG recordings in a referential montage provides many benefits to the study of high-frequency oscillations. 
The ripple rates and properties defined using this approach may accurately delineate the seizure onset zone. Significance Strategies to improve the spatial resolution of intracranial EEG and reduce artifact can help improve the clinical utility of HFO biomarkers. PMID:29113719
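
As a rough illustration of the Hilbert-detector stage described above (band-pass filtering, analytic-signal envelope, then amplitude and duration criteria), the following numpy-only sketch flags candidate ripple events in a single channel. The threshold of 3 SD above the mean envelope and the 20 ms minimum duration are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def detect_ripples(x, fs, band=(80.0, 200.0), amp_sd=3.0, min_dur_ms=20.0):
    """Flag candidate ripple events in one iEEG channel.

    Pipeline: FFT band-pass -> analytic-signal (Hilbert) envelope ->
    amplitude threshold (mean + amp_sd * SD) -> minimum-duration criterion.
    Thresholds are illustrative, not the paper's settings.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # band-pass by zeroing FFT bins outside the ripple band
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec = np.fft.rfft(x)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0
    filtered = np.fft.irfft(spec, n)
    # Hilbert envelope via the analytic signal (zero negative frequencies)
    full = np.fft.fft(filtered)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    env = np.abs(np.fft.ifft(full * h))
    # amplitude + duration criteria on the envelope
    thresh = env.mean() + amp_sd * env.std()
    min_len = int(min_dur_ms / 1000.0 * fs)
    events, start = [], None
    above = env > thresh
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                events.append((start, i))
            start = None
    if start is not None and n - start >= min_len:
        events.append((start, n))
    return events
```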

  2. Utilization of independent component analysis for accurate pathological ripple detection in intracranial EEG recordings recorded extra- and intra-operatively.

    PubMed

    Shimamoto, Shoichi; Waldman, Zachary J; Orosz, Iren; Song, Inkyung; Bragin, Anatol; Fried, Itzhak; Engel, Jerome; Staba, Richard; Sharan, Ashwini; Wu, Chengyuan; Sperling, Michael R; Weiss, Shennan A

    2018-01-01

    To develop and validate a detector that identifies ripple (80-200 Hz) events in intracranial EEG (iEEG) recordings in a referential montage and utilizes independent component analysis (ICA) to eliminate or reduce high-frequency artifact contamination, and to investigate the correspondence of detected ripples and the seizure onset zone (SOZ). iEEG recordings from 16 patients were first band-pass filtered (80-600 Hz) and Infomax ICA was then applied to derive the first independent component (IC1). IC1 was subsequently pruned, and an artifact index was derived to reduce the identification of high-frequency events introduced by the reference electrode signal. A Hilbert detector identified ripple events in the processed iEEG recordings using amplitude and duration criteria. The identified ripple events were further classified and characterized as true or false ripples on spikes, or ripples on oscillations, by applying a topographical analysis to their time-frequency plots, confirmed by visual inspection. The signal-to-noise ratio was improved by pruning IC1. The precision of the detector for ripple events was 91.27 ± 4.3%, and its sensitivity was 79.4 ± 3.0% (N = 16 patients, 5842 ripple events). The sensitivity and precision of the detector were equivalent in iEEG recordings obtained during sleep or intra-operatively. Across all patients, the rates of true ripples on spikes, and also the rates of false ripples on spikes generated by filter ringing, classified the seizure onset zone (SOZ) with an area under the receiver operating characteristic curve (AUROC) of >76%. The magnitude and spectral content of true ripples on spikes generated in the SOZ were distinct from those of ripples generated in the non-seizure onset zone (NSOZ) (p < .001). Utilizing ICA to analyze iEEG recordings in a referential montage provides many benefits to the study of high-frequency oscillations. The ripple rates and properties defined using this approach may accurately delineate the seizure onset zone. 
Strategies to improve the spatial resolution of intracranial EEG and reduce artifact can help improve the clinical utility of HFO biomarkers. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  3. The Use of Intensity Scales in Exploiting Tsunami Historical Databases

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Scheele, F.

    2015-12-01

    Post-disaster assessments for historical tsunami events (>15 years old) are either scarce or contain limited information. In this study, we assess ways to examine tsunami impacts by utilizing data from old events, but more importantly we examine how to best utilize information contained in tsunami historical databases in order to provide meaningful products that describe the impact of the event. As such, a tsunami intensity scale was applied to two historical events observed in New Zealand (one local and one distant), in order to utilize the largest possible number of observations in our dataset. This is especially important for countries like New Zealand, where the tsunami historical record is short, going back only to the 19th century, and where instrument recordings are only available for the most recent events. We found that despite a number of challenges in using intensities (uncertainties partly due to limitations of historical event data), these data, with the help of GIS tools, can be used to produce hazard maps and offer an alternative way to exploit tsunami historical records. Most importantly, the assignment of intensities at each point of observation allows for utilization of many more observations than if one depends on physical information alone, such as water heights. We hope these results may be used towards developing a well-defined methodology for hazard assessments, and to refine our knowledge of past tsunami events for which the tsunami sources are largely unknown, and for which physical quantities describing the tsunami (e.g. water height, flood depth, run-up) are scarce.

  4. Downhole Microseismic Monitoring at a Carbon Capture, Utilization, and Storage Site, Farnsworth Unit, Ochiltree County, Texas

    NASA Astrophysics Data System (ADS)

    Ziegler, A.; Balch, R. S.; van Wijk, J.

    2015-12-01

    Farnsworth Oil Field in North Texas hosts an ongoing carbon capture, utilization, and storage project. This study focuses on passive seismic monitoring at the carbon injection site to measure, locate, and catalog any induced seismic events. A Geometrics Geode system is being utilized for continuous recording of the passive downhole seismic array in a monitoring well. The array consists of 3-component dual Geospace OMNI-2400 15 Hz geophones with a vertical spacing of 30.5 m. Downhole temperature and pressure are also monitored. Seismic data are recorded continuously at a rate of over 900 GB per month, which must be archived and reviewed. A Short Term Average/Long Term Average (STA/LTA) algorithm was evaluated for its ability to search for events, including identification and quantification of any false positives. It was determined that the algorithm was not appropriate for event detection given the background noise level at the field site and the recording equipment as configured; alternatives are being investigated. The intended outcome of the passive seismic monitoring is to mine the continuous database, develop a catalog of microseismic events and locations, and determine whether there is any relationship to CO2 injection in the field. Identifying the location of any microseismic events will allow for correlation with carbon injection locations and previously characterized geological and structural features such as faults and paleoslopes. Additionally, the borehole array has recorded over 1200 active sources, with three sweeps at each source location, acquired during a nearby 3D VSP. These data were evaluated for their usability and location within an effective radius of the array, stacked to improve the signal-to-noise ratio, and used to calibrate a full-field velocity model to enhance event location accuracy. Funding for this project is provided by the U.S. Department of Energy under Award No. DE-FC26-05NT42591.
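
The STA/LTA scheme evaluated above can be sketched as a trailing-window energy ratio; the window lengths below are illustrative defaults, not the project's configuration.

```python
import numpy as np

def sta_lta(x, fs, sta_win=0.05, lta_win=1.0):
    """Trailing short-term / long-term average ratio on signal energy.

    Window lengths are illustrative, not the project's settings.
    The ratio is defined once both windows are full; earlier samples get 0.
    """
    e = np.asarray(x, dtype=float) ** 2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(e)))
    sta = (csum[ns:] - csum[:-ns]) / ns   # mean energy, trailing short window
    lta = (csum[nl:] - csum[:-nl]) / nl   # mean energy, trailing long window
    ratio = np.zeros(len(e))
    ratio[nl - 1:] = sta[nl - ns:] / np.maximum(lta, 1e-12)
    return ratio
```

A detection is declared wherever the ratio exceeds a chosen trigger level; as the abstract notes, a fixed level performs poorly when the background noise level itself changes.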

  5. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    NASA Astrophysics Data System (ADS)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
    The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement, and initial phase identification compound and propagate into errors in event formation); it provides a formalized framework that uses information from non-detecting stations; it provides a formalized framework that incorporates source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e., it does not rely on a priori information, which is particularly important for nuclear monitoring; and it does not rely on individualized signal detection thresholds - it is the network solution that matters.
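
The Bayesian network-combination idea can be illustrated with a minimal sketch: assuming each station independently supplies a likelihood ratio P(data|event)/P(data|noise), the network-wide event probability follows from Bayes' rule in log-odds form. This is a toy of the fusion step, not the ProbDet algorithm itself, and the prior is an arbitrary illustrative value. Note how stations with no detection (likelihood ratio below 1) still contribute evidence.

```python
import math

def network_posterior(likelihood_ratios, prior=1e-4):
    """Combine per-station likelihood ratios P(data|event)/P(data|noise)
    into a network-wide event probability, assuming conditional
    independence across stations. Illustrative sketch only.
    """
    log_odds = math.log(prior / (1.0 - prior))
    log_odds += sum(math.log(lr) for lr in likelihood_ratios)
    return 1.0 / (1.0 + math.exp(-log_odds))   # logistic of posterior log-odds
```

For example, three stations reporting strong evidence (LR = 50) outweigh two quiet stations (LR = 0.5) and a small prior, while five quiet stations drive the posterior below the prior.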

  6. Methodological issues on the use of administrative data in healthcare research: the case of heart failure hospitalizations in Lombardy region, 2000 to 2012.

    PubMed

    Mazzali, Cristina; Paganoni, Anna Maria; Ieva, Francesca; Masella, Cristina; Maistrello, Mauro; Agostoni, Ornella; Scalvini, Simonetta; Frigerio, Maria

    2016-07-08

    Administrative data are increasingly used in healthcare research. However, in order to avoid biases, their use requires careful study planning. This paper describes the methodological principles and criteria used in a study on epidemiology, outcomes and process of care of patients hospitalized for heart failure (HF) in the largest Italian Region, from 2000 to 2012. Data were extracted from the administrative data warehouse of the healthcare system of Lombardy, Italy. Hospital discharge forms with HF-related diagnosis codes were the basis for identifying HF hospitalizations as clinical events, or episodes. In patients experiencing at least one HF event, hospitalizations for any cause, outpatient services utilization, and drug prescriptions were also analyzed. In total, 701,701 heart failure events involving 371,766 patients were recorded from 2000 to 2012. Once all the healthcare services provided to these patients after the first HF event had been joined together, the study database totalled about 91 million records. Principles, criteria and tips utilized in order to minimize errors and characterize some relevant subgroups are described. The methodology of this study could represent the basis for future research and could be applied in similar studies concerning epidemiology, trend analysis, and healthcare resources utilization.

  7. Utility of Continuous EEG Monitoring in Noncritically Ill Hospitalized Patients.

    PubMed

    Billakota, Santoshi; Sinha, Saurabh R

    2016-10-01

    Continuous EEG (cEEG) monitoring is used in the intensive care unit (ICU) setting to detect seizures, especially nonconvulsive seizures and status epilepticus. The utility and impact of such monitoring in non-ICU patients are largely unknown. Hospitalized patients who were not in an ICU and underwent cEEG monitoring in the first half of 2011 and 2014 were identified. Reason for admission, admitting service (neurologic and nonneurologic), indication for cEEG, comorbid conditions, duration of recording, EEG findings, whether an event/seizure was recorded, and impact of EEG findings on management were reviewed. We evaluated the impact of the year of recording, admitting service, indication for cEEG, and neurologic comorbidity on the yield of recordings based on whether an event was captured and/or a change in antiepileptic drug management occurred. Two hundred forty-nine non-ICU patients had cEEG monitoring during these periods. The indication for cEEG was altered mental status (60.6%), observed seizures (26.5%), or observed spells (12.9%); 63.5% were on neuro-related services. The average duration of recording was 1.8 days. EEG findings included interictal epileptiform discharges (14.9%), periodic lateralized discharges (4%), and generalized periodic discharges (1.6%). Clinical events were recorded in 28.1% and seizures in 16.5%. The cEEG led to a change in antiepileptic drug management in 38.6% of patients. There was no impact of type of admitting service; there was no significant impact of indication for cEEG. In non-ICU patients, cEEG monitoring had a relatively high yield of event/seizures (similar to ICU) and impact on management. Temporal trends, admitting service, and indication for cEEG did not alter this.

  8. Environmental Conditions and Seasonal Variables in American Youth Football Leagues.

    PubMed

    Yeargin, Susan W; Cahoon, Erin; Hosokawa, Yuri; Mensch, James M; Dompier, Thomas P; Kerr, Zachary Y

    2017-11-01

    Our study describes youth football (YFB) environmental conditions and the associated heat index (HI) risk category. An observational research design was utilized. Independent variables included month, time, event, and geographic location. Main outcome variables were frequency of events, average HI, and corresponding risk categorization. The HI was recorded with the day and time for each YFB event across 2 YFB seasons. Nearly half (49.8%) of events were in a high HI risk category and 20.0% should have been cancelled. The hottest HI values were recorded in July and August (83.2 ± 9.4°F to 87.2 ± 10.9°F; 24.0% of YFB events). The 7 to 10 am time frame was cooler (67.7 ± 14.5°F; 6.3% of YFB events) than other time frames (P < .001). Hotter HI values were recorded in practices versus games (75.9 ± 14.1°F vs 70.6 ± 14.6°F; t = -6.426, P < .001). Starting the YFB season in September and holding weekend events in the early morning hours can decrease exposure to environmental heat stress.
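
A risk-categorization step like the one used in the study might look like the following sketch; the cutoff values and category labels are illustrative placeholders, not the scale actually applied by the authors.

```python
def hi_risk_category(heat_index_f):
    """Map a heat index (degrees F) to an activity-risk band.

    Cutoffs are illustrative placeholders, not the study's scale.
    """
    if heat_index_f < 80:
        return "lower risk"
    if heat_index_f < 90:
        return "moderate risk"
    if heat_index_f < 103:
        return "high risk"
    return "cancel/modify activity"
```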

  9. SEISMIC STUDY OF THE AGUA DE PAU GEOTHERMAL PROSPECT, SAO MIGUEL, AZORES.

    USGS Publications Warehouse

    Dawson, Phillip B.; Rodrigues da Silva, Antonio; Iyer, H.M.; Evans, John R.

    1985-01-01

    A 16-station array was operated over the 200 km² central portion of Sao Miguel, utilizing 8 permanent Instituto Nacional de Meteorologia e Geofisica stations and 8 USGS portable stations. Forty-four local events with well-constrained solutions and 15 regional events were located. In addition, hundreds of unlocatable seismic events were recorded. The most interesting seismic activity occurred in a swarm on September 6 and 7, 1983, when over 200 events were recorded in a 16-hour period. The seismic activity around Agua de Pau was centered on the east and northeast slopes of the volcano. The data suggest a boiling hydrothermal system beneath the Agua de Pau volcano, consistent with a variety of other data.

  10. Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise

    NASA Astrophysics Data System (ADS)

    Ziegler, Abra E.

    The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. 
    The performance of the AST algorithm was quantitatively evaluated under a variety of noise conditions, and seismic detections identified using AST were compared to ancillary injection data. During a period of CO2 injection in a well near the monitoring array, 82% of seismic events were accurately detected, 13% of events were missed, and 5% of detections were determined to be false. Additionally, seismic risk was evaluated from the stress field and faulting regime at FWU to determine the likelihood of pressure perturbations triggering slip on previously mapped faults. Faults oriented NW-SE were identified as requiring the smallest pore pressure changes to trigger slip; faults oriented N-S could also potentially be reactivated, although this is less likely.
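
The nearest-neighbor consistency metric that guides AST can be sketched as a simple agreement filter: a station's detection is kept only when enough of its neighbors detect something in the same time window. Station names, the windowing, and the agreement count below are illustrative assumptions, not the algorithm's actual representation.

```python
def neighborhood_confirmed(detections, neighbors, min_agree=2):
    """Keep a station's detection only if at least min_agree of its
    neighbors also detected within the same time window.

    detections: dict station -> set of window indices with a detection
    neighbors:  dict station -> list of neighboring stations
    Returns a dict of the same shape with unconfirmed detections dropped.
    """
    confirmed = {}
    for sta, wins in detections.items():
        confirmed[sta] = {
            w for w in wins
            if sum(w in detections.get(nb, set())
                   for nb in neighbors.get(sta, [])) >= min_agree
        }
    return confirmed
```

In AST itself, per-station sensitivity parameters are then tuned (via reinforcement learning) toward settings that maximize this kind of neighborhood agreement; the filter above only illustrates the consistency criterion.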

  11. Murder and Mayhem. "The Great Gatsby": The Facts Behind the Fiction. Learning Page Lesson Plan.

    ERIC Educational Resources Information Center

    Rohrbach, Margie; Koszoru, Janie

    To appreciate historical fiction, students need to understand the factual context and recognize how popular culture reflects the values, mores, and events of the time period. Since a newspaper records significant events and attitudes representative of a period, students create their own newspapers, utilizing primary source materials from several…

  12. The preliminary development and testing of a global trigger tool to detect error and patient harm in primary-care records.

    PubMed

    de Wet, C; Bowie, P

    2009-04-01

    A multi-method strategy has been proposed to understand and improve the safety of primary care. The trigger tool is a relatively new method that has shown promise in American and secondary healthcare settings. It involves the focused review of a random sample of patient records using a series of "triggers" that alert reviewers to potential errors and previously undetected adverse events. To develop and test a global trigger tool to detect errors and adverse events in primary-care records. Trigger tool development was informed by previous research and content validated by expert opinion. The tool was applied by trained reviewers who worked in pairs to conduct focused audits of 100 randomly selected electronic patient records in each of five urban general practices in central Scotland. Review of 500 records revealed 2251 consultations and 730 triggers. An adverse event was found in 47 records (9.4%), indicating that harm occurred at a rate of one event per 48 consultations. Of these, 27 were judged to be preventable (42%). A further 17 records (3.4%) contained evidence of a potential adverse event. Harm severity was low to moderate for most patients (82.9%). Error and harm rates were higher in those aged ≥60 years, and most were medication-related (59%). The trigger tool was successful in identifying undetected patient harm in primary-care records and may be the most reliable method for achieving this. However, the feasibility of its routine application is open to question. The tool may have greater utility as a research rather than an audit technique. Further testing in larger, representative study samples is required.
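
The headline rates follow directly from the reported counts: 47 adverse events in 500 records is 9.4%, and 2251 consultations divided by 47 events is roughly one event per 48 consultations.

```python
# Reproduce the reported rates from the raw counts in the abstract
records, consultations, adverse_events = 500, 2251, 47
harm_rate_pct = 100.0 * adverse_events / records        # % of records with harm
consults_per_event = consultations / adverse_events     # consultations per event
```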

  13. A WiFi public address system for disaster management.

    PubMed

    Andrade, Nicholas; Palmer, Douglas A; Lenert, Leslie A

    2006-01-01

    The WiFi Bullhorn is designed to assist emergency workers in the event of a disaster by offering a rapidly configurable wireless public address system for disaster sites. The current configuration plays either pre-recorded or custom-recorded messages and utilizes 802.11b networks for communication. Units can be positioned anywhere wireless coverage exists to help manage crowds or to recall first responders from dangerous areas.

  14. A WiFi Public Address System for Disaster Management

    PubMed Central

    Andrade, Nicholas; Palmer, Douglas A.; Lenert, Leslie A.

    2006-01-01

    The WiFi Bullhorn is designed to assist emergency workers in the event of a disaster by offering a rapidly configurable wireless public address system for disaster sites. The current configuration plays either pre-recorded or custom-recorded messages and utilizes 802.11b networks for communication. Units can be positioned anywhere wireless coverage exists to help manage crowds or to recall first responders from dangerous areas. PMID:17238466

  15. Source Characterization and Seismic Hazard Considerations for Hydraulic Fracture Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Bosman, K.; Viegas, G. F.; Baig, A. M.; Urbancic, T.

    2015-12-01

    Large microseismic events (M>0) have been shown to be generated relatively frequently during hydraulic fracture treatments. These events are a concern from both public safety and engineering viewpoints. Recent microseismic monitoring projects in the Horn River Basin have utilized both downhole and surface sensors to record events associated with hydraulic fracturing. The resulting hybrid monitoring system has produced a large dataset with two distinct groups of events: large events recorded by the surface network (0

  16. Paleogeodetic records of seismic and aseismic subduction from central Sumatran microatolls, Indonesia

    USGS Publications Warehouse

    Natawidjaja, D.H.; Sieh, K.; Ward, S.N.; Cheng, H.; Edwards, R. Lawrence; Galetzka, J.; Suwargadi, B.W.

    2004-01-01

    We utilize coral microatolls in western Sumatra to document vertical deformation associated with subduction. Microatolls are very sensitive to fluctuations in sea level and thus act as natural tide gauges. They record not only the magnitude of vertical deformation associated with earthquakes (paleoseismic data), but also continuously track the long-term aseismic deformation that occurs during the intervals between earthquakes (paleogeodetic data). This paper focuses on the twentieth century paleogeodetic history of the equatorial region. Our coral paleogeodetic record of the 1935 event reveals a classical example of deformations produced by seismic rupture of a shallow subduction interface. The site closest to the trench rose 90 cm, whereas sites further east sank by as much as 35 cm. Our model reproduces these paleogeodetic data with a 2.3 m slip event on the interface 88 to 125 km from the trench axis. Our coral paleogeodetic data reveal slow submergence during the decades before and after the event in the areas of coseismic emergence. Likewise, interseismic emergence occurred before and after the 1935 event in areas of coseismic submergence. Among the interesting phenomena we have discovered in the coral record is evidence of a large aseismic slip, or "silent event," in 1962, 27 years after the 1935 event. Paleogeodetic deformation rates in the decades before, after, and between the 1935 and 1962 events have varied both temporally and spatially. During the 25 years following the 1935 event, submergence rates were dramatically greater than in prior decades. During the past four decades, however, rates have been lower than in the preceding decades, but are still higher than they were prior to 1935. These paleogeodetic records enable us to model the kinematics of the subduction interface throughout the twentieth century. Copyright 2004 by the American Geophysical Union.

  17. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
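
A minimal sketch of the patented scheme (periodic re-baselining of the quiescent level, an offset-based threshold, and a qualification-width counter) might look like the following; the class and parameter names are illustrative, and using the median as the quiescent-level estimate is one plausible choice, not necessarily the patent's.

```python
import statistics

class DriftingTrigger:
    """Trigger whose threshold tracks quiescent-level drift and which
    requires several consecutive crossings before firing (sketch only)."""

    def __init__(self, offset, qual_width=3, recal_every=1000):
        self.offset = offset              # threshold = quiescent level + offset
        self.qual_width = qual_width      # consecutive crossings required
        self.recal_every = recal_every    # samples between re-baselines
        self.baseline = 0.0               # current quiescent-level estimate
        self._recent = []
        self._hits = 0

    def step(self, sample):
        """Feed one sample; return True when recording should start."""
        self._recent.append(sample)
        if len(self._recent) >= self.recal_every:
            # re-measure the quiescent level, absorbing slow drift
            self.baseline = statistics.median(self._recent)
            self._recent = []
        if sample > self.baseline + self.offset:
            self._hits += 1                # qualification-width counter
        else:
            self._hits = 0
        return self._hits >= self.qual_width
```

After a drift in the quiescent level, re-baselining restores the original sensitivity margin, so slow drift alone no longer produces false triggers.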

  18. A computer aided treatment event recognition system in radiation therapy.

    PubMed

    Xia, Junyi; Mart, Christopher; Bayouth, John

    2014-01-01

    To develop an automated system to safeguard radiation therapy treatments by analyzing electronic treatment records and reporting treatment events. CATERS (Computer Aided Treatment Event Recognition System) was developed to detect treatment events by retrieving and analyzing electronic treatment records. CATERS is designed to make the treatment monitoring process more efficient by automating the search of the electronic record for possible deviations from the physician's intention, such as logical inconsistencies as well as aberrant treatment parameters (e.g., beam energy, dose, table position, prescription change, treatment overrides, etc.). Over a 5-month period (July 2012-November 2012), physicists were assisted by the CATERS software in conducting normal weekly chart checks with the aims of (a) determining the relative frequency of particular events in the authors' clinic and (b) incorporating these checks into CATERS. During this study period, 491 patients were treated at the University of Iowa Hospitals and Clinics for a total of 7692 fractions. All treatment records from the 5-month analysis period were evaluated using all the checks incorporated into CATERS after the training period. In total, 553 events were detected as exceptions, although none of them had a significant dosimetric impact on patient treatments. These events included every known event type discovered during the trial period. A frequency analysis showed that the top three types of detected events were couch position override (3.2%), extra cone beam imaging (1.85%), and significant couch position deviation (1.31%). A significant couch deviation was defined as a treatment where the couch vertical position exceeded two times the standard deviation of all couch verticals, or the couch lateral/longitudinal position exceeded three times the standard deviation of all couch laterals and longitudinals. 
    On average, the application takes about 1 s per patient when executed on either a desktop computer or a mobile device. CATERS offers an effective tool to detect and report treatment events. Automation and rapid processing enable daily interrogation of the electronic record, alerting the medical physicist to deviations potentially days before the weekly check is performed. The output of CATERS could also be utilized as an important input to failure mode and effects analysis.
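
The couch-deviation rule as described can be sketched directly; interpreting "exceeded two (or three) times the standard deviation" as the absolute deviation from that patient's mean position is an assumption on our part.

```python
import numpy as np

def flag_couch_deviations(vert, lat, lng):
    """Return indices of fractions flagged as significant couch deviations:
    vertical deviates from the per-patient mean by more than 2 SD, or
    lateral/longitudinal deviates by more than 3 SD (rule as described;
    the deviation-from-mean interpretation is an assumption)."""
    vert, lat, lng = map(np.asarray, (vert, lat, lng))
    dv = np.abs(vert - vert.mean()) > 2 * vert.std()
    dl = np.abs(lat - lat.mean()) > 3 * lat.std()
    dg = np.abs(lng - lng.mean()) > 3 * lng.std()
    return np.flatnonzero(dv | dl | dg)
```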

  19. Overrepresentation of extreme events in decision making reflects rational use of cognitive resources.

    PubMed

    Lieder, Falk; Griffiths, Thomas L; Hsu, Ming

    2018-01-01

    People's decisions and judgments are disproportionately swayed by improbable but extreme eventualities, such as terrorism, that come to mind easily. This article explores whether such availability biases can be reconciled with rational information processing by taking into account the fact that decision makers value their time and have limited cognitive resources. Our analysis suggests that to make optimal use of their finite time decision makers should overrepresent the most important potential consequences relative to less important, but potentially more probable, outcomes. To evaluate this account, we derive and test a model we call utility-weighted sampling. Utility-weighted sampling estimates the expected utility of potential actions by simulating their outcomes. Critically, outcomes with more extreme utilities have a higher probability of being simulated. We demonstrate that this model can explain not only people's availability bias in judging the frequency of extreme events but also a wide range of cognitive biases in decisions from experience, decisions from description, and memory recall. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
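
The core of utility-weighted sampling (simulating outcomes with probability proportional to p(o)*|u(o)| and correcting with importance weights) can be sketched in a toy form; this is an illustration of the idea, not the paper's full model.

```python
import random

def uws_estimate(outcomes, probs, n_samples=20000, seed=0):
    """Toy utility-weighted sampling estimate of E[u].

    outcomes: utilities u(o); probs: their true probabilities p(o).
    Outcomes are simulated from q(o) proportional to p(o) * |u(o)|, so
    extreme outcomes are overrepresented; a self-normalized
    importance-sampling correction (weights p/q) recovers E[u].
    """
    rng = random.Random(seed)
    q = [p * abs(u) for u, p in zip(outcomes, probs)]
    z = sum(q)
    q = [x / z for x in q]
    idx = rng.choices(range(len(outcomes)), weights=q, k=n_samples)
    w = [probs[i] / q[i] for i in idx]
    return sum(wi * outcomes[i] for wi, i in zip(w, idx)) / sum(w)
```

With a rare catastrophic outcome (u = -100, p = 0.01) and a common mild one (u = 1, p = 0.99), the sampler draws the catastrophe about half the time, mirroring the overrepresentation of extreme events, yet the weighted estimate still approaches the true expectation of -0.01.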

  20. High speed imaging - An important industrial tool

    NASA Technical Reports Server (NTRS)

    Moore, Alton; Pinelli, Thomas E.

    1986-01-01

    High-speed photography, which is a rapid sequence of photographs that allow an event to be analyzed through the stoppage of motion or the production of slow-motion effects, is examined. In high-speed photography 16, 35, and 70 mm film and framing rates between 64-12,000 frames per second are utilized to measure such factors as angles, velocities, failure points, and deflections. The use of dual timing lamps in high-speed photography and the difficulties encountered with exposure and programming the camera and event are discussed. The application of video cameras to the recording of high-speed events is described.

  1. YALINA-booster subcritical assembly pulsed-neutron experiments: detector dead time and spatial corrections.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Y.; Gohar, Y.; Nuclear Engineering Division

    In almost every detector counting system, a minimal dead time is required to record two successive events as two separate pulses. Due to the random nature of neutron interactions in the subcritical assembly, there is always some probability that a true neutron event will not be recorded because it occurs too close to the preceding event. These losses may become rather severe for counting systems with high counting rates, and should be corrected before any utilization of the experimental data. This report examines the dead time effects for the pulsed neutron experiments of the YALINA-Booster subcritical assembly. The nonparalyzable model is utilized to correct the experimental data for dead time. Overall, the reactivity values are increased by 0.19$ and 0.32$ after the spatial corrections for the YALINA-Booster 36% and 21% configurations, respectively. The differences between the reactivities obtained with He-3 long or short detectors at the same detector channel diminish after the dead time corrections of the experimental data for the 36% YALINA-Booster configuration. In addition, better agreement between reactivities obtained from different experimental data sets is also observed after the dead time corrections for the 21% YALINA-Booster configuration.
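
The nonparalyzable dead-time model referred to above has the standard closed-form correction n = m / (1 - m·τ), where m is the measured count rate, τ is the dead time, and n is the recovered true rate. A minimal sketch:

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Correct a measured count rate for detector dead time using the
    nonparalyzable model: n = m / (1 - m*tau), where m is the measured
    rate (counts/s) and tau the dead time (s). Valid only while m*tau < 1.
    """
    loss = measured_rate * dead_time
    if loss >= 1.0:
        raise ValueError("measured rate saturates the nonparalyzable model")
    return measured_rate / (1.0 - loss)
```

For example, a measured rate of 1e5 counts/s with a 2 microsecond dead time implies a 20% fractional loss and a true rate of 1.25e5 counts/s.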

  2. Diagnostic Utility of Wireless Video-Electroencephalography in Unsedated Dogs.

    PubMed

    James, F M K; Cortez, M A; Monteith, G; Jokinen, T S; Sanders, S; Wielaender, F; Fischer, A; Lohi, H

    2017-09-01

    Poor agreement between observers on whether an unusual event is a seizure drives the need for the specific diagnostic tool provided by video-electroencephalography (video-EEG) in human pediatric epileptology. We hypothesized that successful classification of events would be positively associated with increasing EEG recording length and higher event frequency reported before video-EEG evaluation, and that a novel wireless video-EEG technique would clarify whether unusual behavioral events were seizures in unsedated dogs. Eighty-one client-owned dogs of various breeds undergoing investigation of unusual behavioral events at 4 institutions. Retrospective case series: evaluation of wireless video-EEG recordings in unsedated dogs performed at 4 institutions. Electroencephalography achieved or excluded a diagnosis of epilepsy in 58 dogs (72%): 25 dogs had epileptic seizures confirmed on the basis of ictal/interictal epileptiform discharges, and 33 dogs had no EEG abnormalities associated with their target events. As the reported frequency of the target events decreased (seconds, minutes, hourly, daily, weekly, monthly, annually), EEG was less likely to achieve a diagnosis (P < 0.001). Every increase in event frequency increased the odds of achieving a diagnosis by 2.315 (95% confidence interval: 1.36-4.34). EEG recording length (mean = 3.69 hours, range: 0.17-22.5) was not associated (P = 0.2) with the likelihood of achieving a diagnosis. Wireless video-EEG in unsedated dogs had a high success rate for the diagnosis of unusual behavioral events. This technique offers a reliable clinical tool to investigate the epileptic origin of behavioral events in dogs. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  3. Earthquake recording at the Stanford DAS Array with fibers in existing telecomm conduits

    NASA Astrophysics Data System (ADS)

    Biondi, B. C.; Martin, E. R.; Yuan, S.; Cole, S.; Karrenbach, M. H.

    2017-12-01

    The Stanford Distributed Acoustic Sensing Array (SDASA-1) has been continuously recording seismic data since September 2016 on 2.5 km of single-mode fiber optics in existing telecommunications conduits under Stanford's campus. The array is figure-eight shaped and roughly 600 m along its widest side, with a channel spacing of roughly 8 m. This array is easy to maintain and is nonintrusive, making it well suited to urban environments, but it sacrifices some cable-to-ground coupling compared to more traditional seismometers. We have been testing its utility for earthquake recording, active-source seismic surveys, and ambient noise interferometry. This talk will focus on earthquake observations. We will show comparisons between the strain rates measured throughout the DAS array and the particle velocities measured at the nearby Jasper Ridge Seismic Station (JRSC). In some of these events, we will point out directionality features specific to DAS that can require slight modifications in data processing. We also compare the repeatability of DAS and JRSC recordings of blasts from a nearby quarry. Using existing earthquake databases, we have created a small catalog of DAS earthquake observations by pulling records of over 700 Northern California events spanning Sep. 2016 to Jul. 2017 from both the DAS data and JRSC. On these events we have tested common array methods for earthquake detection and location, including beamforming and STA/LTA analysis in time and frequency. We have analyzed these events to approximate thresholds on what distances and magnitudes are clearly detectable by the DAS array. Further analysis should be done on detectability with methods tailored to small events (for example, template matching). In creating this catalog, we have developed open-source software, available for free download, that can manage large sets of continuous seismic data files (both existing files and files as they stream in). 
This software can both interface with existing earthquake networks and efficiently extract earthquake recordings from the many continuous recordings saved on the user's machines.
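
The STA/LTA analysis mentioned above compares a short-term average of signal amplitude against a long-term average and flags samples where the ratio is large. A minimal (non-recursive) sketch, with illustrative window lengths:

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Classic STA/LTA characteristic function on a 1-D trace: the ratio of
    the short-term average to the long-term average of |amplitude| (a common
    variant uses the squared trace instead). Values well above 1 indicate a
    likely event onset; window lengths nsta << nlta are chosen by the analyst.
    """
    x = np.abs(np.asarray(trace, dtype=float))
    sta = np.convolve(x, np.ones(nsta) / nsta, mode='same')
    lta = np.convolve(x, np.ones(nlta) / nlta, mode='same')
    lta[lta == 0] = np.finfo(float).eps  # avoid division by zero
    return sta / lta
```

A detection is then declared wherever the characteristic function exceeds a trigger threshold; array studies such as this one additionally require the trigger to fire on many channels within a short window.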

  4. The effect of bioturbation in pelagic sediments: Lessons from radioactive tracers and planktonic foraminifera in the Gulf of Aqaba, Red Sea

    NASA Astrophysics Data System (ADS)

    Steiner, Zvi; Lazar, Boaz; Levi, Shani; Tsroya, Shimon; Pelled, Omer; Bookman, Revital; Erez, Jonathan

    2016-12-01

    Studies of recent environmental perturbations often rely on data derived from marine sedimentary records. These records are known to imperfectly inscribe the true sequence of events, yet there is large uncertainty regarding the corrections that should be employed to accurately describe the sedimentary history. Here we show, in recent records from the Gulf of Aqaba, Red Sea, how events such as the abrupt disappearance of the planktonic foraminifer Globigerinoides sacculifer and the episodic deposition of the artificial radionuclide 137Cs are significantly altered in the sedimentary record compared to their known past timing. Instead of the abrupt disappearance of the foraminifera, we observe a prolonged decline beginning at a core depth equivalent to ∼30 y prior to its actual disappearance and continuing for decades past the event. We further observe asymmetric smoothing of the radionuclide peak. Utilization of advection-diffusion-reaction models to reconstruct the original fluxes, based on the known absolute timing of the events, reveals that it is imperative to use a continuous function to describe bioturbation. Discretization of bioturbation into mixed and unmixed layers significantly shifts the location of the modeled event. When bioturbation is described as a continuously decreasing function of depth, the peak of a very short-term event smears asymmetrically but remains at the right depth. When sudden events repeat while the first spike is still being mixed with the upper sediment layer, bioturbation unifies adjacent peaks. The unified peak appears at an intermediate depth that does not necessarily correlate with the timing of the individual events. In a third case, a long-lasting sedimentary event affected by bioturbation, the resulting peak is rather weak compared to the actual event and appears deeper in the sediment column than expected based on the termination of the event. 
The model clearly shows that abrupt changes can endure in the record only if a thick sediment layer settled on the sediment-water interface at once or if bioturbation rates decreased to very low values for a prolonged period of time. In any other case, smearing by bioturbation makes an abrupt event appear to have started shortly before its real timing and to have ended long after its true termination.
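
The qualitative behavior described above can be illustrated with a toy 1-D model in which the mixing coefficient decreases continuously with depth. All parameter values below are illustrative and not taken from the study:

```python
import numpy as np

def diffuse_spike(n=100, dz=0.5, dt=0.01, steps=5000, d0=1.0, zscale=5.0):
    """Explicit finite-difference solution of dC/dt = d/dz(D(z) dC/dz) with a
    mixing coefficient that decays continuously with depth,
    D(z) = d0 * exp(-z / zscale). Schematically illustrates how a tracer
    spike (e.g., a 137Cs pulse) smears asymmetrically when bioturbation is a
    continuous function of depth. Illustrative parameters only.
    """
    z = np.arange(n) * dz
    D = d0 * np.exp(-z / zscale)
    c = np.zeros(n)
    c[20] = 1.0  # initial tracer spike
    Dh = 0.5 * (D[:-1] + D[1:])  # D at cell interfaces
    for _ in range(steps):
        flux = Dh * np.diff(c) / dz   # Fickian flux between adjacent cells
        c[:-1] += dt / dz * flux      # conservative explicit update,
        c[1:] -= dt / dz * flux       # no-flux walls at both ends
    return z, c
```

Because D is larger on the shallow side of the spike, the tracer spreads preferentially upward, giving the asymmetric smoothing the abstract reports for the radionuclide peak.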

  5. Paleoecology, Biostratigraphy, and Response of Calcareous Nannoplankton Communities to Climate Fluctuations during the Late Oligocene in the Tropics

    NASA Astrophysics Data System (ADS)

    Aljahdali, M. H.; Wise, S. W.

    2015-12-01

    The earliest Oligocene is considered the time when the Cenozoic icehouse world was initiated, as the Antarctic continental ice sheet first reached sea level. Subsequently, Oligocene climate fluctuated between glacial (Oi) and warming events, as recorded by stable isotopes. Relatively little is known about the paleoecological response of calcareous nannoplankton at low latitudes during these climate deteriorations. Here we investigate the biotic response along with the stable-isotope (δ18O and δ13C) record and multivariate analyses from four ODP and IODP sites cored in three oceans along the tropical belt through strata 24-30 Ma in age. Within this time frame, two major climatic shifts occurred: the Oi-2b glacial event and the Late Oligocene Warming Event (LOWE). During the Oi events (26.5-30 Ma), temperate-water taxa associated with eutrophic taxa dominated the overall assemblage, suggesting that relatively cooler water rich in nutrients invaded the tropical region. In contrast, during the LOWE (24-26.5 Ma), a major turnover between temperate-water taxa and warm-water taxa occurred when the surface waters became warm and oligotrophic in nature. Additionally, several increases in both abundance and size were recorded through the upper Oligocene, including increased abundance of Sphenolithus predistentus, a major biostratigraphic marker in the upper Oligocene, and increased size of S. moriformis. Moreover, an additional major biostratigraphic event in the upper Oligocene was recorded: Crassidiscus backmanii shows a very short range at low latitudes. These paleoecological responses can be utilized to construct a detailed global late Oligocene biostratigraphy throughout the tropics.

  6. Nursing Student Experiences Regarding Safe Use of Electronic Health Records: A Pilot Study of the Safety and Assurance Factors for EHR Resilience Guides.

    PubMed

    Whitt, Karen J; Eden, Lacey; Merrill, Katreena Collette; Hughes, Mckenna

    2017-01-01

    Previous research has linked improper electronic health record configuration and use with adverse patient events. In response to this problem, the US Office of the National Coordinator for Health Information Technology developed the Safety and Assurance Factors for EHR Resilience guides to evaluate electronic health records for optimal use and safety features. During the course of their education, nursing students are exposed to a variety of clinical practice settings and electronic health records. This descriptive study evaluated 108 undergraduate and 51 graduate nursing students' ratings of electronic health record features and safe practices, as well as what they learned from utilizing the computerized provider order entry and clinician communication Safety and Assurance Factors for EHR Resilience guide checklists. More than 80% of the undergraduate and 70% of the graduate students reported that they experienced user problems with electronic health records in the past. More than 50% of the students felt that electronic health records contribute to adverse patient outcomes. Students reported that many of the features assessed were not fully implemented in their electronic health record. These findings highlight areas where electronic health records can be improved to optimize patient safety. The majority of students reported that utilizing the Safety and Assurance Factors for EHR Resilience guides increased their understanding of electronic health record features.

  7. Clinopyroxene Diffusion Chronometry of the Scaup Lake Rhyolite, Yellowstone Caldera, WY

    NASA Astrophysics Data System (ADS)

    Brugman, K. K.; Till, C. B.; Bose, M.

    2016-12-01

    Eruption of the Scaup Lake flow (SCL) ended 220,000 years of dormancy and began the youngest sequence of eruptions at Yellowstone caldera [Christiansen et al., USGS, 2007]. Quantification of the time intervals between the magmatic events and eruption recorded in SCL is critical to interpreting signs of unrest at modern-day Yellowstone. SCL rhyolite includes zoned phenocrysts and accessory phases that indicate multiple rejuvenation events occurred shortly before eruption; previous studies focused on feldspar and zircon crystal records [e.g. Bindeman et al., J.Pet, 2008; Till et al., Geology, 2015]. Here we exploit zoned clinopyroxene (cpx), one of the earliest-crystallized minerals in SCL as indicated by petrographic relationships, as a diffusion dating tool, and utilize elements with different diffusivities to more precisely resolve rejuvenation-eruption timescales. Using NanoSIMS concentration profiles with 300-900 nanometer spacing, we employ the slower-diffusing REE Ce as a proxy for the initial profile shape of faster-diffusing Fe to calculate diffusive timescales. The outermost resolvable zone boundary in SCL cpx yields a rejuvenation-eruption timescale of 166 ± 80 yrs (1 SD). In comparison, modeling relaxation of Fe from a step-function initial condition at the same temperature (920°C) yields a less precise timescale of 488 (+9000/-300) yrs. Examination of our results, in concert with observed petrographic relationships, indicates SCL cpx may record an older, separate rejuvenation event from those recorded in feldspar rims at < 10 months and 10-40 years prior to eruption [Till et al., Geology, 2015]. The difference in the youngest recorded event between feldspar and cpx may be due to different crystallization intervals for these phases and/or slower crystal growth rates for cpx relative to feldspar. 
Our diffusion modeling results reinforce that intracrystalline zoning timescales modeled using a step function initial condition should be considered maxima, especially in viscous rhyolitic magmas, and that different phases may not record the same series of pre-eruptive events due to differences in crystallization behavior.

  8. Identification of incident poisoning, fracture and burn events using linked primary care, secondary care and mortality data from England: implications for research and surveillance.

    PubMed

    Baker, Ruth; Tata, Laila J; Kendrick, Denise; Orton, Elizabeth

    2016-02-01

    English national injury data collection systems are restricted to hospitalisations and deaths. With recent linkage of a large primary care database, the Clinical Practice Research Datalink (CPRD), with secondary care and mortality data, we aimed to assess the utility of linked data for injury research and surveillance by examining recording patterns and comparing incidence of common injuries across data sources. The incidence of poisonings, fractures and burns was estimated for a cohort of 2 147 853 0-24 year olds using CPRD linked to Hospital Episode Statistics (HES) and Office for National Statistics (ONS) mortality data between 1997 and 2012. Time-based algorithms were developed to identify incident events, distinguishing between repeat follow-up records for the same injury and those for a new event. We identified 42 985 poisoning, 185 517 fracture and 36 719 burn events in linked CPRD-HES-ONS data; incidence rates were 41.9 per 10 000 person-years (95% CI 41.4 to 42.4), 180.8 (179.8-181.7) and 35.8 (35.4-36.1), respectively. Of the injuries, 22 628 (53%) poisonings, 139 662 (75%) fractures and 33 462 (91%) burns were only recorded within CPRD. Only 16% of deaths from poisoning (n=106) or fracture (n=58) recorded in ONS were recorded within CPRD and/or HES records. None of the 10 deaths from burns were recorded in CPRD or HES records. It is essential to use linked primary care, hospitalisation and deaths data to estimate injury burden, as many injury events are only captured within a single data source. Linked routinely collected data offer an immediate and affordable mechanism for injury surveillance and analyses of population-based injury epidemiology in England. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  9. Quantifying human-environment interactions using videography in the context of infectious disease transmission.

    PubMed

    Julian, Timothy R; Bustos, Carla; Kwong, Laura H; Badilla, Alejandro D; Lee, Julia; Bischel, Heather N; Canales, Robert A

    2018-05-08

    Quantitative data on human-environment interactions are needed to fully understand infectious disease transmission processes and conduct accurate risk assessments. Interaction events occur during an individual's movement through, and contact with, the environment, and can be quantified using diverse methodologies. Methods that utilize videography, coupled with specialized software, can provide a permanent record of events, collect detailed interactions in high resolution, be reviewed for accuracy, capture events difficult to observe in real time, and gather multiple concurrent phenomena. In the accompanying video, the use of specialized software to capture human-environment interactions for studies of human exposure and disease transmission is highlighted. Use of videography, combined with specialized software, allows for the collection of accurate quantitative representations of human-environment interactions in high resolution. Two such specialized programs are the Virtual Timing Device for the Personal Computer, which collects sequential microlevel activity time series of contact events and interactions, and LiveTrak, which is optimized to facilitate annotation of events in real time. Opportunities to annotate behaviors at high resolution using these tools are promising, permitting detailed records that can be summarized to gain information on infectious disease transmission and incorporated into more complex models of human exposure and risk.

  10. Database for earthquake strong motion studies in Italy

    USGS Publications Warehouse

    Scasserra, G.; Stewart, J.P.; Kayen, R.E.; Lanzo, G.

    2009-01-01

    We describe an Italian database of strong ground motion recordings and databanks delineating conditions at the instrument sites and characteristics of the seismic sources. The strong motion database consists of 247 corrected recordings from 89 earthquakes and 101 recording stations. Uncorrected recordings were drawn from public web sites and processed on a record-by-record basis using a procedure utilized in the Next-Generation Attenuation (NGA) project to remove instrument resonances, minimize noise effects through low- and high-pass filtering, and apply baseline corrections. The number of available uncorrected recordings was reduced by 52% (mostly because of s-triggers) to arrive at the 247 recordings in the database. The site databank includes for every recording site the surface geology, a measurement or estimate of average shear wave velocity in the upper 30 m (Vs30), and information on instrument housing. Of the 89 sites, 39 have on-site velocity measurements (17 of which were performed as part of this study using SASW techniques). For the remaining sites, we estimate Vs30 based on measurements made in similar geologic conditions where available. Where no local velocity measurements are available, correlations with surface geology are used. Source parameters are drawn from databanks maintained (and recently updated) by Istituto Nazionale di Geofisica e Vulcanologia and include hypocenter location and magnitude for small events (M ≲ 5.5) and finite source parameters for larger events. © 2009 A.S. Elnashai & N.N. Ambraseys.

  11. A retrospective study on the incidences of adverse drug events and analysis of the contributing trigger factors

    PubMed Central

    Sam, Aaseer Thamby; Lian Jessica, Looi Li; Parasuraman, Subramani

    2015-01-01

    Objectives: To retrospectively determine the extent and types of adverse drug events (ADEs) from patient case sheets and identify the contributing factors of medication errors, and to assess causality and severity using the World Health Organization (WHO) probability scale and Hartwig's scale, respectively. Methods: One hundred patient case sheets were randomly selected; a modified version of the Institute for Healthcare Improvement (IHI) Global Trigger Tool was utilized to identify the ADEs, and causality and severity were assessed utilizing the WHO probability scale and Hartwig's severity assessment scale, respectively. Results: In total, 153 adverse events (AEs) were identified using the IHI Global Trigger Tool. The majority of the AEs were due to medication errors (46.41%), followed by 60 adverse drug reactions (ADRs), 15 therapeutic failure incidents, and 7 overdose cases. Of the 153 AEs, the 60 ADRs comprised events such as rashes, nausea, and vomiting. Therapeutic failure contributed 9.80% of the AEs, while overdose contributed 4.58% of the total 153 AEs. Using the trigger tools, we were able to detect 45 positive triggers in 36 patient records; among these, 19 AEs were identified in 15 patient records. The rate of AEs was 17 per 100 patients, and the average rate of ADEs was 2.03 per 1000 doses (calculated). Conclusion: The IHI Global Trigger Tool is an effective method to help provisionally registered pharmacists identify ADEs more quickly. PMID:25767366

  12. Utilization of Satellite Data to Identify and Monitor Changes in Frequency of Meteorological Events

    NASA Astrophysics Data System (ADS)

    Mast, J. C.; Dessler, A. E.

    2017-12-01

    Increases in temperature and climate variability due to human-induced climate change are increasing the frequency and magnitude of extreme heat events (i.e., heatwaves). This will have a detrimental impact on the health of human populations and the habitability of certain land locations. Here we seek to utilize satellite data records to identify and monitor extreme heat events. We analyze satellite data sets (MODIS and AIRS land surface temperatures (LST) and water vapor profiles (WV)) because of their global coverage and stable calibration. Heat waves are identified based on the frequency of maximum daily temperatures above a threshold, determined as follows. Land surface temperatures are gridded into uniform latitude/longitude bins. Maximum daily temperatures per bin are determined, and probability density functions (PDF) of these maxima are constructed monthly and seasonally. For each bin, a threshold is calculated at the 95th percentile of the PDF of maximum temperatures. For each bin, an extreme heat event is defined based on the frequency of monthly and seasonal days exceeding the threshold. To account for the decreased ability of the human body to thermoregulate with increasing moisture, and to assess the lethality of the heat events, we determine the wet-bulb temperature at the locations of extreme heat events. Preliminary results will be presented.
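
The per-bin thresholding step described above (take the distribution of daily maximum temperatures in each grid bin, set the threshold at its 95th percentile, then count exceedance days) can be sketched as:

```python
import numpy as np

def heat_event_days(daily_tmax, percentile=95.0):
    """Per-bin extreme-heat threshold and exceedance count, following the
    approach described in the abstract: for each grid bin, threshold the
    distribution of daily maximum temperatures at the given percentile and
    count the days exceeding it.

    `daily_tmax` has shape (n_days, n_bins); returns (thresholds, counts),
    each of shape (n_bins,).
    """
    t = np.asarray(daily_tmax, dtype=float)
    thresholds = np.percentile(t, percentile, axis=0)
    counts = (t > thresholds).sum(axis=0)
    return thresholds, counts
```

In the study the counts would be accumulated monthly and seasonally per bin; the sketch above collapses that bookkeeping into a single exceedance count.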

  13. Key design elements of a data utility for national biosurveillance: event-driven architecture, caching, and Web service model.

    PubMed

    Tsui, Fu-Chiang; Espino, Jeremy U; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M

    2005-01-01

    The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include an event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks, and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance: systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions.

  14. Development of an algorithm for automatic detection and rating of squeak and rattle events

    NASA Astrophysics Data System (ADS)

    Chandrika, Unnikrishnan Kuttan; Kim, Jay H.

    2010-10-01

    A new algorithm for automatic detection and rating of squeak and rattle (S&R) events was developed. The algorithm utilizes the perceived transient loudness (PTL), which approximates the human perception of a transient noise. First, instantaneous specific loudness time histories are calculated over the 1-24 bark range by applying the analytic wavelet transform and Zwicker loudness transform to the recorded noise. Transient specific loudness time histories are then obtained by removing estimated contributions of the background noise from the instantaneous specific loudness time histories. These transient specific loudness time histories are summed to obtain the transient loudness time history. Finally, the PTL time history is obtained by applying Glasberg and Moore temporal integration to the transient loudness time history. Detection of S&R events utilizes the PTL time history obtained by summing only the 18-24 bark components, to take advantage of the high signal-to-noise ratio in the high frequency range. An S&R event is identified when the value of the PTL time history exceeds a detection threshold pre-determined by a jury test. The maximum value of the PTL time history is used for rating of S&R events. Another jury test showed that the method performs much better if the PTL time history obtained by summing all frequency components is used. Therefore, rating of S&R events utilizes this modified PTL time history. Two additional jury tests were conducted to validate the developed detection and rating methods. The algorithm developed in this work will enable automatic detection and rating of S&R events with good accuracy and minimum possibility of false alarm.

  15. Quality of Green's Functions Improved by Automatic Detection and Removal of Coherent Anthropogenic Noise

    NASA Astrophysics Data System (ADS)

    Williams, E. F.; Martin, E. R.; Biondi, B. C.; Lindsey, N.; Ajo Franklin, J. B.; Wagner, A. M.; Bjella, K.; Daley, T. M.; Dou, S.; Freifeld, B. M.; Robertson, M.; Ulrich, C.

    2016-12-01

    We analyze the impact of identifying and removing coherent anthropogenic noise on synthetic Green's functions extracted from ambient noise recorded on a dense linear distributed acoustic sensing (DAS) array. Low-cost, low-impact urban seismic surveys are possible with DAS, which uses dynamic strain sensing to record seismic waves incident to a buried fiber optic cable. However, interferometry and tomography of ambient noise data recorded in urban areas include coherent noise from near-field infrastructure such as cars and trains passing the array, in some cases causing artifacts in estimated Green's functions and potentially incorrect surface wave velocities. Based on our comparison of several methods, we propose an automated, real-time data processing workflow to detect and reduce the impact of these events on data from a dense array in an urban environment. We utilize a recursive STA/LTA (short-term average/long-term average) algorithm on each channel to identify sharp amplitude changes typically associated with an event arrival. In order to distinguish between optical noise and physical events, an event is cataloged only if STA/LTA is triggered on enough channels across the array in a short time window. For each event in the catalog, a conventional semblance analysis is performed across a straight segment of the array to determine whether the event has a coherent velocity signature. Events that demonstrate a semblance peak at low apparent velocities (5-50 m/s) are assumed to represent coherent transportation-related noise and are down-weighted in the time domain before cross-correlation. We show the impact of removing such noise on estimated Green's functions from ambient noise data recorded in Richmond, CA in December 2014. 
This method has been developed for use on a continuous time-lapse ambient noise survey collected with DAS near Fairbanks, AK, and an upcoming ambient noise survey on the Stanford University campus using DAS with a re-purposed telecommunications fiber optic cable.
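
The semblance analysis used above to classify low-apparent-velocity events can be sketched for a straight array segment as follows. The channel spacing, sample interval, and the scalar coherence definition here are illustrative:

```python
import numpy as np

def semblance(traces, dx, dt, velocity):
    """Linear-array semblance for one trial apparent velocity: shift each
    channel by its predicted moveout t = offset / velocity and measure the
    coherence of the stack (1 = perfectly coherent across channels,
    about 1/N = incoherent). A minimal sketch of the semblance scan used to
    identify coherent low-apparent-velocity (e.g., 5-50 m/s) noise.

    `traces` has shape (n_channels, n_samples); dx is channel spacing (m)
    and dt the sample interval (s).
    """
    traces = np.asarray(traces, dtype=float)
    nch, nt = traces.shape
    shifts = np.rint(np.arange(nch) * dx / velocity / dt).astype(int)
    aligned = np.array([np.roll(tr, -s) for tr, s in zip(traces, shifts)])
    num = (aligned.sum(axis=0) ** 2).sum()
    den = nch * (aligned ** 2).sum()
    return num / den
```

Scanning `velocity` over a slowness range and picking the peak gives the apparent velocity of the event; events peaking at transportation-like velocities would then be down-weighted before cross-correlation, as described above.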

  16. Improving the Statistical Modeling of the TRMM Extreme Precipitation Monitoring System

    NASA Astrophysics Data System (ADS)

    Demirdjian, L.; Zhou, Y.; Huffman, G. J.

    2016-12-01

    This project improves upon an existing extreme precipitation monitoring system based on the Tropical Rainfall Measuring Mission (TRMM) daily product (3B42) using new statistical models. The proposed system utilizes a regional modeling approach, where data from similar grid locations are pooled to increase the quality and stability of the resulting model parameter estimates to compensate for the short data record. The regional frequency analysis is divided into two stages. In the first stage, the region defined by the TRMM measurements is partitioned into approximately 27,000 non-overlapping clusters using a recursive k-means clustering scheme. In the second stage, a statistical model is used to characterize the extreme precipitation events occurring in each cluster. Instead of utilizing the block-maxima approach used in the existing system, where annual maxima are fit to the Generalized Extreme Value (GEV) probability distribution at each cluster separately, the present work adopts the peak-over-threshold (POT) method of classifying points as extreme if they exceed a pre-specified threshold. Theoretical considerations motivate the use of the Generalized-Pareto (GP) distribution for fitting threshold exceedances. The fitted parameters can be used to construct simple and intuitive average recurrence interval (ARI) maps which reveal how rare a particular precipitation event is given its spatial location. The new methodology eliminates much of the random noise that was produced by the existing models due to a short data record, producing more reasonable ARI maps when compared with NOAA's long-term Climate Prediction Center (CPC) ground based observations. The resulting ARI maps can be useful for disaster preparation, warning, and management, as well as increased public awareness of the severity of precipitation events. Furthermore, the proposed methodology can be applied to various other extreme climate records.
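
Given already-fitted Generalized Pareto (GP) parameters for the threshold exceedances, the average recurrence interval (ARI) for a level x follows from the exceedance rate and the GP survival function. The sketch below assumes the fitting is done elsewhere and uses illustrative values:

```python
import numpy as np

def ari_years(x, data, threshold, shape, scale, years):
    """Average recurrence interval (in years) for level x under a
    peaks-over-threshold model: ARI = 1 / (lambda * S(x - u)), where lambda
    is the mean number of threshold exceedances per year and S is the
    Generalized Pareto survival function with the given shape and scale.
    The shape/scale are assumed already fitted; this sketches only the
    ARI computation.
    """
    data = np.asarray(data, dtype=float)
    n_exc = np.count_nonzero(data > threshold)
    lam = n_exc / years                       # exceedances per year
    y = x - threshold
    if shape == 0.0:
        surv = np.exp(-y / scale)             # exponential limit of the GP
    else:
        surv = (1.0 + shape * y / scale) ** (-1.0 / shape)
    return 1.0 / (lam * surv)
```

Evaluating this on a grid of x values per cluster yields the ARI maps described above, which express how rare a given precipitation amount is at each location.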

  17. Surge Driven Return Flow Results in Deposition of Coarse Grain Horizons Archiving a 4000 Year Record of Extreme Storm Events, Cape Cod, Massachusetts

    NASA Astrophysics Data System (ADS)

    Maio, C. V.; Donnelly, J. P.; Sullivan, R.; Weidman, C. R.; Sheremet, V.

    2014-12-01

    The brevity of the instrumental record and lack of detailed historical accounts is a limiting factor in our understanding of the relationship between climate change and the frequency and intensity of extreme storm events. This study applied paleotempestologic and hydrographic methods to identify the mechanisms of storm-induced coarse grain deposition and reconstruct a late Holocene storm record within Waquoit Bay, Massachusetts. Three sediment cores (6.0 m, 8.4 m, and 8.2 m) were collected in 3 m of water using a vibracore system. Grain sizes were measured along core to identify coarse grain anomalies that serve as a proxy for past storm events. A historical age model (1620-2011 AD) was developed based on Pb pollution chronomarkers derived from X-ray fluorescence bulk Pb data, equating to a sedimentation rate of 8-8.3 mm/yr (R2 = 0.99). A long-term (4000 to 275 years before present) sedimentation rate of 1.1-1.4 mm/yr (R2 = 0.89) was calculated based on twenty-four continuous flow atomic mass spectrometry 14C ages of marine bivalves. To determine hydrographic conditions within the embayment during storm events, current meters and tide gauges were deployed during Hurricane Irene (2011), which measured a storm surge of 88 cm above mean sea level. The buildup of storm water against the landward shoreline resulted in a measured 10 cm/s seaward-moving bottom current capable of transporting coarse sand eroded from the adjacent shoreface into the coring site. Modeled surges for eleven modern and historic storm events ranged in height from 0.37 m (2011) to 3.72 m (1635) above mean high water. The WAQ1, WAQ2, and WAQ3 cores recorded a total of 89, 139, and 137 positive anomalies exceeding the lower threshold, and 15, 34, and 12 exceeding the upper threshold, respectively. Events recorded during the historic period coincide with documented storm events. 
Applying the lower threshold, the mean frequency within the three cores was 2.6 events per century; applying the upper threshold, it was 0.44 events per century. The study has identified a previously understudied transport mechanism for the formation of storm-induced coarse grain horizons and highlighted some of the challenges of utilizing shallow water embayments as sites for storm reconstructions.
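    The two reported sedimentation regimes imply a simple piecewise-linear age model. The boundary depth and the specific rates below are illustrative values chosen within the reported ranges, not the authors' actual age model:

```python
def depth_to_age_yr(depth_mm, boundary_mm=3200.0,
                    recent_rate_mm_yr=8.0, longterm_rate_mm_yr=1.2):
    """Piecewise-linear age model: a fast historical sedimentation rate
    above the boundary depth and a slow long-term rate below it."""
    if depth_mm <= boundary_mm:
        return depth_mm / recent_rate_mm_yr
    # age at the boundary, plus slow accumulation below it
    return boundary_mm / recent_rate_mm_yr + \
        (depth_mm - boundary_mm) / longterm_rate_mm_yr
```

    With these assumed values, 80 mm of core depth corresponds to only a decade of the historical record, while each additional centimeter below the boundary spans nearly a decade on its own.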

  18. Data-Driven Information Extraction from Chinese Electronic Medical Records

    PubMed Central

    Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q.

    2015-01-01

    Objective This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Materials and Methods Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. Results The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. Of the recorded medical events, 98.5% were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. Discussion In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). Conclusions The framework is data-driven, weakly supervised, and robust against the variation and noise that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica. PMID:26295801
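    The NGD feature rests on a closed-form formula over document-frequency counts. A minimal sketch (the count sources and corpus size are hypothetical; this is the standard NGD formula, not the authors' SVM pipeline):

```python
import math

def ngd(fx, fy, fxy, n):
    """Normalized Google Distance between two terms, given the number of
    documents containing x (fx), y (fy), both (fxy), out of n total.
    Smaller values indicate more closely related terms; 0 means the
    terms always co-occur."""
    if fxy == 0:
        return float('inf')          # the terms never co-occur
    lx, ly, lxy, ln = (math.log(v) for v in (fx, fy, fxy, n))
    return (max(lx, ly) - lxy) / (ln - min(lx, ly))
```

    In a matching setting such as the one described, the distance between a candidate medical event and a description could be fed to the classifier as one feature among others.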

  19. Data-Driven Information Extraction from Chinese Electronic Medical Records.

    PubMed

    Xu, Dong; Zhang, Meizhuo; Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q

    2015-01-01

    This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. Of the recorded medical events, 98.5% were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). The framework is data-driven, weakly supervised, and robust against the variation and noise that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica.

  20. A wireless recording system that utilizes Bluetooth technology to transmit neural activity in freely moving animals

    PubMed Central

    Hampson, Robert E.; Collins, Vernell; Deadwyler, Sam A.

    2009-01-01

    A new wireless transceiver is described for recording individual neuron firing from behaving rats utilizing Bluetooth transmission technology and a processor onboard for discrimination of neuronal waveforms and associated time stamps. This universal brain activity transmitter (UBAT) is attached to rodents via a backpack and amplifier headstage and can transmit 16 channels of captured neuronal firing data via a Bluetooth transceiver chip over very large and unconstrained distances. The onboard microprocessor of the UBAT allows flexible online control over waveform isolation criteria via transceiver instruction and the two-way communication capacity allows for closed-loop applications between neural events and behavioral or physiological processes which can be modified by transceiver instructions. A detailed description of the multiplexer processing of channel data as well as examples of neuronal recordings in different behavioral testing contexts is provided to demonstrate the capacity for robust transmission within almost any laboratory environment. A major advantage of the UBAT is the long transmission range and lack of object-based line of sight interference afforded by Bluetooth technology, allowing flexible recording capabilities within multiple experimental paradigms without interruption. Continuous recordings over very large distance separations from the monitor station are demonstrated providing experimenters with recording advantages not previously available with other telemetry devices. PMID:19524612

  1. A wireless recording system that utilizes Bluetooth technology to transmit neural activity in freely moving animals.

    PubMed

    Hampson, Robert E; Collins, Vernell; Deadwyler, Sam A

    2009-09-15

    A new wireless transceiver is described for recording individual neuron firing from behaving rats utilizing Bluetooth transmission technology and a processor onboard for discrimination of neuronal waveforms and associated time stamps. This universal brain activity transmitter (UBAT) is attached to rodents via a backpack and amplifier headstage and can transmit 16 channels of captured neuronal firing data via a Bluetooth transceiver chip over very large and unconstrained distances. The onboard microprocessor of the UBAT allows flexible online control over waveform isolation criteria via transceiver instruction and the two-way communication capacity allows for closed-loop applications between neural events and behavioral or physiological processes which can be modified by transceiver instructions. A detailed description of the multiplexer processing of channel data as well as examples of neuronal recordings in different behavioral testing contexts is provided to demonstrate the capacity for robust transmission within almost any laboratory environment. A major advantage of the UBAT is the long transmission range and lack of object-based line of sight interference afforded by Bluetooth technology, allowing flexible recording capabilities within multiple experimental paradigms without interruption. Continuous recordings over very large distance separations from the monitor station are demonstrated providing experimenters with recording advantages not previously available with other telemetry devices.

  2. Uranium isotope evidence for two episodes of deoxygenation during Oceanic Anoxic Event 2

    NASA Astrophysics Data System (ADS)

    Clarkson, Matthew O.; Stirling, Claudine H.; Jenkyns, Hugh C.; Dickson, Alexander J.; Porcelli, Don; Moy, Christopher M.; Pogge von Strandmann, Philip A. E.; Cooke, Ilsa R.; Lenton, Timothy M.

    2018-03-01

    Oceanic Anoxic Event 2 (OAE 2), occurring ˜94 million years ago, was one of the most extreme carbon cycle and climatic perturbations of the Phanerozoic Eon. It was typified by a rapid rise in atmospheric CO2, global warming, and marine anoxia, leading to the widespread devastation of marine ecosystems. However, the precise timing and extent to which oceanic anoxic conditions expanded during OAE 2 remain unresolved. We present a record of global ocean redox changes during OAE 2 using a combined geochemical and carbon cycle modeling approach. We utilize a continuous, high-resolution record of uranium isotopes in pelagic and platform carbonate sediments to quantify the global extent of seafloor anoxia during OAE 2. This dataset is then compared with a dynamic model of the coupled global carbon, phosphorus, and uranium cycles to test hypotheses for OAE 2 initiation. This unique approach highlights an intra-OAE complexity that has previously been underconstrained, characterized by two expansions of anoxia separated by an episode of globally significant reoxygenation coincident with the “Plenus Cold Event.” Each anoxic expansion event was likely driven by rapid atmospheric CO2 injections from multiphase Large Igneous Province activity.

  3. Holocene turbidite and onshore paleoseismic record of great earthquakes on the Cascadia Subduction Zone: relevance for the Sumatra 2004 Great Earthquake

    NASA Astrophysics Data System (ADS)

    Gutierrez-Pastor, J.; Nelson, C. H.; Goldfinger, C.; Johnson, J.

    2005-05-01

    Marine turbidite stratigraphy, onshore paleoseismic records of tsunami sand beds and co-seismic subsidence (Atwater and Hemphill-Haley, 1997; Kelsey et al., 2002; Witter et al., 2003), and tsunami sands of Japan (Satake et al., 1996) all show evidence for great earthquakes (M ~ 9) on the Cascadia Subduction Zone. When a great earthquake shakes 1000 kilometers of the Cascadia margin, sediment failures occur in all tributary canyons, and the resulting turbidity currents travel down the canyon systems and deposit synchronous turbidites in abyssal seafloor channels. These turbidite records provide a deepwater paleoseismic record of great earthquakes. An onshore paleoseismic record develops from rapid coseismic subsidence, resulting in buried marshes and drowned forests, and subsequent tsunami sand layer deposition. The Cascadia Basin provides the longest paleoseismic record of great earthquakes presently available for a subduction zone. A total of 17 synchronous turbidites have been deposited along ~700 km of the Cascadia margin during the Holocene time of ~10,000 cal yr. Because the youngest paleoseismic event in all turbidite and onshore records dates to ~300 years before present (the 1700 AD earthquake), the average recurrence interval of great earthquakes is ~600 yr. At least 6 smaller events have also ruptured shorter margin segments. Linkage of the rupture length of these events comes from relative dating tools such as the "confluence test" of Adams (1990), radiocarbon ages of onshore and offshore events, and physical property correlation of individual event "signatures". We use both 14C ages and analysis of hemipelagic sediment thickness (H) between turbidites, where H/sedimentation rate = time between turbidite events, to develop two recurrence histories. Utilizing the most reliable 14C and hemipelagic data sets from turbidites for the past ~5000 yr, the minimum recurrence time is ~300 yr and the maximum is ~1300 yr. 
There is also a recurrence pattern through the entire Holocene, consisting of a long time interval followed by 2 to 5 short intervals, that is apparent in the land records as well. This pattern has repeated five times in the Holocene. Both onshore paleoseismic records and turbidite synchroneity over hundreds of kilometers suggest that the Holocene turbidite record of the Cascadia Subduction Zone is caused dominantly by triggering from great earthquakes similar in rupture length to the Sumatra 2004 earthquake. The recent Sumatra subduction zone great earthquake of 2004 and the 1700 AD Cascadia tsunami sand of 3 m height preserved in Japan (Satake et al., 1996) show that ocean-basin-wide tsunami deposits result from these great earthquakes, which rupture the seafloor for hundreds of kilometers. Cascadia and Sumatra share many geological and physiographic similarities that favor the deposition of turbidites from great earthquakes and tend to filter non-earthquake turbidites from the record. Thus the paleoseismic methods developed in Cascadia could be applied to the Sumatran Subduction Zone, and we expect that the turbidite record would yield a similar record ~10,000 yr in length. In Sumatra, the dearth of such records led to the lack of widespread recognition of the hazard, particularly for the northern Sumatra and Andaman-Nicobar region, where geodetic data suggested weak plate locking. Evidence from satellite imagery of a tsunami similar to the 2004 event suggests the previous event was in the recent past.
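    The hemipelagic dating rule quoted in the abstract (H/sedimentation rate = time between turbidite events) is a one-line computation. A sketch with illustrative thicknesses and rate, not values from the study:

```python
def interevent_times_yr(hemipelagic_cm, sed_rate_cm_per_kyr):
    """Convert hemipelagic thicknesses (cm) deposited between successive
    turbidites into inter-event times (years), via time = H / rate."""
    return [1000.0 * h / sed_rate_cm_per_kyr for h in hemipelagic_cm]
```

    Averaging the resulting intervals gives a recurrence estimate directly comparable to the 14C-based history.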

  4. Hyperchromatic lens for recording time-resolved phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frayer, Daniel K.

    A method and apparatus for the capture of a high number of quasi-continuous effective frames of 2-D data from an event at very short time scales (from less than 10^-12 to more than 10^-8 seconds) is disclosed, allowing for short recording windows and a high effective number of frames. Active illumination from a chirped laser pulse directed at the event creates a reflection in which wavelength depends on time and spatial position, and this dependence is utilized to encode temporal phenomena onto wavelength. A hyperchromatic lens system receives the reflection and maps wavelength onto axial position. An image capture device, such as a holography or plenoptic imaging device, captures the resultant focal stack from the hyperchromatic lens system in both spatial (imaging) and longitudinal (temporal) axes. The hyperchromatic lens system incorporates a combination of diffractive and refractive components to maximally separate focal position as a function of wavelength.

  5. Events at blood collection area due to nonconforming blood bags and plateletpheresis kits: need for timely corrective and preventive actions.

    PubMed

    Verma, Anupam; Sachan, Deepti; Elhence, Priti; Pandey, Hem; Dubey, Anju

    2012-07-01

    Good blood banking practice requires that every effort should be made to detect any deviation or defect in blood bank products and to identify any potential risk to blood donor or recipient(s). We report the findings of an exercise that provides insight into why feedback from the user side is crucial. Various events involving blood bags and plateletpheresis kits and the corresponding appropriate actions instituted for remedial measures were recorded. These scattered events were recorded for 6 months following the use of a new batch of improved blood bags with add-on features. Several events related to plateletpheresis kits from three different manufacturers were also recorded for 1 year. The affected blood bags were utilized with no untoward incident. The complaint was closed following satisfactory response from the blood bag manufacturing company, which acted in a timely manner in addressing the root causes of the problems. However, corrective and preventive actions (CAPA) could not be implemented for plateletpheresis kits. The rate of undesirable events was higher with plateletpheresis kits as compared with whole blood bags (1.75% vs. 0.06%). As defects or deviations that trigger the need for CAPA can stem from numerous sources, it is important to clearly identify and document the problems and level of risk so that appropriate investigations can be instituted and remedial actions can be taken in a timely manner. This study demonstrates the usefulness of a quality initiative to collate and analyze blood product faults in conjunction with blood product manufacturers. © 2012 American Association of Blood Banks.

  6. Key Design Elements of a Data Utility for National Biosurveillance: Event-driven Architecture, Caching, and Web Service Model

    PubMed Central

    Tsui, Fu-Chiang; Espino, Jeremy U.; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M.

    2005-01-01

    The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance—systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions. PMID:16779138
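    The event-driven, pre-aggregated caching idea described above can be sketched as follows. This is a toy illustration, not the NRDM codebase, and the record schema is hypothetical:

```python
from collections import defaultdict

class SalesAggregator:
    """Event-driven cache of daily sales aggregates: each incoming
    record updates the cached totals, so queries are answered from the
    cache rather than by rescanning raw records."""

    def __init__(self):
        self._totals = defaultdict(int)   # (date, category) -> units sold

    def on_sale_record(self, date, category, units):
        # push-based update triggered by the record's arrival event,
        # keeping the aggregate current in near real time
        self._totals[(date, category)] += units

    def query(self, date, category):
        # served from the pre-aggregated cache in O(1)
        return self._totals[(date, category)]
```

    The same pattern generalizes to the multiple aggregation dimensions (time series by region, product category, etc.) that the abstract describes.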

  7. Multiple channel coincidence detector and controller for microseismic data analysis

    DOEpatents

    Fasching, George E.

    1976-11-16

    A multiple channel coincidence detector circuit is provided for analyzing data either in real time or recorded data on a magnetic tape during an experiment for determining location and progression of fractures in an oil field or the like while water is being injected at high pressure in wells located in the field. The circuit is based upon the utilization of a set of parity generator trees combined with monostable multivibrators to detect the occurrence of two events at any pair of channel input terminals that are within a preselected time frame and have an amplitude above a preselected magnitude. The parity generators perform an exclusive OR function in a timing circuit composed of monostable multivibrators that serve to yield an output when two events are present in the preselected time frame. Any coincidences falling outside this time frame are considered either noise or not otherwise useful in the analysis of the recorded data. Input pulses of absolute magnitude below the low-level threshold setting of a bipolar low-level threshold detector are unwanted and therefore rejected. A control output is provided for a utilization device from a coincidence hold circuit that may be used to halt a tape search unit at the time of coincidence or perform other useful control functions.
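    A software analogue of the described coincidence logic (a bipolar amplitude threshold plus a pre-selected time frame, with pairs outside the frame rejected) might look like the sketch below; the tuple format and units are assumptions, and this mimics the circuit's behavior rather than its parity-tree implementation:

```python
def coincidences(pulses, window, threshold):
    """Find pairs of above-threshold pulses on *different* channels
    occurring within `window` seconds of each other.
    pulses: iterable of (channel, time, amplitude) tuples."""
    # bipolar threshold: reject pulses of small absolute magnitude
    valid = sorted((p for p in pulses if abs(p[2]) >= threshold),
                   key=lambda p: p[1])
    pairs = []
    for i, a in enumerate(valid):
        for b in valid[i + 1:]:
            if b[1] - a[1] > window:
                break                 # sorted by time: no later match possible
            if b[0] != a[0]:          # coincidence requires distinct channels
                pairs.append((a, b))
    return pairs
```

    Anything falling outside the window, or on a single channel, is dropped as noise, mirroring the circuit's rejection rules.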

  8. Berkeley Seismological Laboratory Seismic Moment Tensor Report for the August 6, 2007 M3.9 Seismic event in central Utah

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S; Dreger, D; Hellweg, P

    2007-08-08

    We have performed a complete moment tensor analysis of the seismic event, which occurred on Monday August 6, 2007 at 08:48:40 UTC 21 km from Mt. Pleasant, Utah. In our analysis we utilized complete three-component seismic records recorded by the USArray, University of Utah, and EarthScope seismic arrays. The seismic waveform data was integrated to displacement and filtered between 0.02 to 0.10 Hz following instrument removal. We used the Song et al. (1996) velocity model to compute Green's functions used in the moment tensor inversion. A map of the stations we used and the location of the event is shown in Figure 1. In our moment tensor analysis we assumed a shallow source depth of 1 km consistent with the shallow depth reported for this event. As shown in Figure 2 the results point to a source mechanism with negligible double-couple radiation and is composed of dominant CLVD and implosive isotropic components. The total scalar seismic moment is 2.12e22 dyne cm corresponding to a moment magnitude (Mw) of 4.2. The long-period records are very well matched by the model (Figure 2) with a variance reduction of 73.4%. An all dilational (down) first motion radiation pattern is predicted by the moment tensor solution, and observations of first motions are in agreement.
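    The reported moment-to-magnitude conversion is consistent with the standard Hanks-Kanamori relation; as a sketch (assuming that relation is the one used here):

```python
import math

def moment_magnitude(m0_dyne_cm):
    """Hanks & Kanamori (1979) moment magnitude:
    Mw = (2/3) * log10(M0) - 10.7, with M0 in dyne-cm."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7
```

    Plugging in the reported scalar moment of 2.12e22 dyne cm recovers the stated Mw of 4.2 to within rounding.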

  9. Sediment Relative Paleointensity Record With Slow-sedimentation Rates: Implication For a Chronological Tool In The Slow-sedimentation Sequence

    NASA Astrophysics Data System (ADS)

    Kanamatsu, T.

    2006-12-01

    The usefulness of paleointensity records from high-sedimentation-rate sequences for stratigraphic correlation has been demonstrated (e.g. Stoner et al., 1998; Laj et al., 2000; Stoner et al., 2000), because sediment geomagnetic paleointensity data enable fine time correlation between cores older than the range of AMS 14C. As a further application of sediment paleointensity as a chronological tool, we examined the paleointensity record of a much slower sedimentation-rate sequence. The paleointensity record of a slower sedimentation sequence is expected to be smoothed by the filtering effect of post-depositional remanent magnetization, yielding a distinct pattern that depends on the sedimentation rate (e.g. Guyodo and Channell, 2002). We studied the record of cores obtained from the West Philippine Sea Basin (water depth ca. 5000 to 6000 m). Analyses of paleomagnetic direction showed that the cores contain the Jaramillo and Olduvai events. The sedimentation rates of the cores estimated from magnetostratigraphy are less than 1 cm/kyr (0.4-0.6 cm/kyr). A paleointensity proxy (NRM20mT/ARM20mT) applied to the cores reveals that the variations in the records are dominated by a ca. 100-kyr cycle. Comparison with other published paleointensity records makes clear that the record retains the ca. 100-kyr cycle despite the slower sedimentation rates, although higher-frequency features were not identified. This suggests that geomagnetic events lasting a few to several kyr are recordable in the sediment. Paleointensity in slow-sedimentation records is thus still useful for age control utilizing the lower-frequency signal, especially for investigating sequences with little other age information, such as deep-sea sediment below the CCD, but not for fine correlation based on high-frequency features.

  10. 49 CFR 229.135 - Event recorders.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... an event recorder with a certified crashworthy event recorder memory module that meets the... certified crashworthy event recorder memory module that meets the requirements of Appendix D of this part. The certified event recorder memory module shall be mounted for its maximum protection. (Although...

  11. Event Reconstruction Techniques in NOvA

    NASA Astrophysics Data System (ADS)

    Baird, M.; Bian, J.; Messier, M.; Niner, E.; Rocco, D.; Sachdev, K.

    2015-12-01

    The NOvA experiment is a long-baseline neutrino oscillation experiment utilizing the NuMI beam generated at Fermilab. The experiment will measure the oscillations within a muon neutrino beam in a 300 ton Near Detector located underground at Fermilab and a functionally-identical 14 kiloton Far Detector placed 810 km away. The detectors are liquid scintillator tracking calorimeters with a fine-grained cellular structure that provides a wealth of information for separating the different particle track and shower topologies. Each detector has its own challenges, with the Near Detector seeing multiple overlapping neutrino interactions in each event and the Far Detector having a large background of cosmic rays due to being located on the surface. A series of pattern-recognition techniques have been developed to go from event records, to spatially and temporally separated individual interactions, to vertexing and tracking, and finally to particle identification. This combination of methods to achieve the full event reconstruction will be discussed.

  12. Characterizing the variability of benthic foraminifera in the northeastern Gulf of Mexico following the Deepwater Horizon event (2010-2012).

    PubMed

    Schwing, P T; O'Malley, B J; Romero, I C; Martínez-Colón, M; Hastings, D W; Glabach, M A; Hladky, E M; Greco, A; Hollander, D J

    2017-01-01

    Following the Deepwater Horizon (DWH) event in 2010, subsurface hydrocarbon intrusions (1000-1300 m) and an order-of-magnitude increase in flocculent hydrocarbon deposition caused increased concentrations of hydrocarbons in continental slope sediments. This study sought to characterize the variability [density, Fisher's alpha (S), equitability (E), Shannon (H)] of benthic foraminifera following the DWH event. A series of sediment cores were collected at two sites in the northeastern Gulf of Mexico from 2010 to 2012. At each site, three cores were utilized for benthic faunal analysis, organic geochemistry, and redox metal chemistry, respectively. The surface intervals (∼0-10 mm) of the sedimentary records collected in December 2010 at DSH08 and February 2011 at PCB06 were characterized by significant decreases in foraminiferal density, S, E, and H, relative to the down-core intervals as well as previous surveys. Non-metric multidimensional scaling (nMDS) analysis suggested that a 3-fold increase in polycyclic aromatic hydrocarbon (PAH) concentration in the surface interval, relative to the down-core interval, was the environmental driver of benthic foraminiferal variability. These records suggest that the benthic foraminiferal recovery time, following an event such as the DWH, was on the order of 1-2 years.
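    The diversity metrics named above have standard closed forms. A sketch, assuming natural logarithms for H and Pielou's formulation of equitability (Fisher's alpha needs a small root-finding step, shown here with Newton iteration):

```python
import math

def shannon_h(counts):
    """Shannon diversity H' (natural log) from raw species counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def equitability(counts):
    """Pielou's evenness: H' normalized by ln(species richness)."""
    s = sum(1 for c in counts if c > 0)
    return shannon_h(counts) / math.log(s) if s > 1 else 0.0

def fishers_alpha(s, n, tol=1e-10):
    """Solve S = alpha * ln(1 + N/alpha) for alpha (Newton iteration),
    where S is species richness and N is total individuals."""
    a = 1.0
    for _ in range(200):
        f = a * math.log(1.0 + n / a) - s
        d = math.log(1.0 + n / a) - n / (a + n)   # df/da
        step = f / d
        a = max(a - step, a / 2.0)                # keep alpha positive
        if abs(step) < tol:
            return a
    return a
```

    With equal counts across species, H' equals ln(richness) and equitability is exactly 1, which is a convenient sanity check.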

  13. Signatures of cosmic-ray increase attributed to exceptional solar storms inferred from multiple cosmogenic radionuclide records

    NASA Astrophysics Data System (ADS)

    Mekhaldi, Florian; Muscheler, Raimund; Adolphi, Florian; Svensson, Anders; Aldahan, Ala; Possnert, Göran; McConnell, Joseph R.; Sigl, Michael; Welten, Kees C.; Woodruff, Thomas E.

    2014-05-01

    Miyake et al. (2012, 2013) discovered rapid increases of 14C content in tree rings dated to AD 774-5 and AD 993-4 which they have attributed to cosmic-ray events. These extreme particle events have no counterparts in the instrumental record and have been tentatively associated with solar proton events, supernovae and short gamma-ray bursts, which have very different energy spectra. Cosmogenic radionuclides such as 14C, 10Be and 36Cl arise from the interaction of cosmic rays with atmospheric nitrogen, oxygen and argon. These radio-isotopes are produced through different reaction pathways and vary with different energy dependencies of the production rate cross section. Owing to this, yield functions can be used to determine the energy level of incident particles. However, only 14C has been measured at high resolution to quantify the energy and thus the origin of the outbursts. We present an annually resolved record of 10Be from the NGRIP ice core for the two events. In addition, we also utilized the GRIP ice core 36Cl record in our analysis. Our results show that the differential production of cosmogenic 14C, 10Be and 36Cl is consistent with a solar energy spectrum. Considering the notable increase in radionuclides, the solar storms would have had to be substantially greater than the largest recorded geomagnetic storm, the so-called Carrington event. This challenges our understanding of the sun's dynamics. Furthermore, the events could possibly be of interest for the investigation of potential cosmic ray-cloud linkages (Svensmark & Friis-Christensen, 1997). Alternatively, such outbursts of energetic particles have the potential to deplete atmospheric ozone and alter atmospheric circulation. Ultimately, the magnitude of such particle events draws attention to the perhaps underestimated potential of the sun to cause great damage to modern technologies. References Miyake, F., Masuda, K. & Nakamura, T. Another rapid event in the carbon-14 content of tree rings. 
Nature Communications 4:1748, DOI: 10.1038/ncomms2783 (2013). Miyake, F., Nagaya, K., Masuda, K. & Nakamura, T. A signature of cosmic-ray increase in AD 774-775 from tree rings in Japan. Nature 486, 240-242, DOI: 10.1038/nature11123 (2012). Svensmark, H., & Friis-Christensen, E. Variation of cosmic ray flux and global cloud coverage - A missing link in solar-climate relationships. J. Atmos. Sol. Terr. Phys., 59, 1225-1232 (1997).

  14. Calibrating a Method for Reconstructing ENSO Variance in the Eastern Tropical Pacific Using Mg/Ca in Individual Planktic Foraminifera

    NASA Astrophysics Data System (ADS)

    Rongstad, B.; Marchitto, T. M., Jr.; Koutavas, A.; Mekik, F.

    2017-12-01

    El Niño Southern Oscillation (ENSO) is Earth's dominant mode of interannual climate variability, and is responsible for widespread climatic, ecological and societal impacts, such as reduced upwelling and fishery collapse in the eastern equatorial Pacific during El Niño events. While corals offer high resolution records of paleo-ENSO, continuous and gap-free records for the tropical Pacific are rare. Individual foraminifera analyses provide an opportunity to create continuous down-core records of ENSO through the construction and comparison of species-specific sea surface temperature (SST) distributions at different time periods; however, there has been little focus on calibrating this technique to modern ENSO conditions. Here, we present data from a core-top calibration of individual Mg/Ca measurements in planktic foraminifera in the eastern tropical Pacific, using surface dweller G. ruber and thermocline dweller N. dutertrei. We convert the individual Mg/Ca measurements to inferred temperature distributions for each species, and then compare the distributions to modern day temperature characteristics including vertical structure, annual mean, seasonality, and interannual variability. ENSO variance is theoretically inferred from the tails of the distributions: El Niño events affect the warm tail and La Niña events affect the cool tail. Finally, we discuss the utility of individual measurements of Mg/Ca in planktic foraminifera to reconstruct ENSO in down-core sections.
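    Converting individual Mg/Ca measurements to inferred temperatures typically uses an exponential calibration. The constants below follow the widely used Anand et al. (2003) form, Mg/Ca = 0.38 exp(0.09 T), as an assumption for illustration, not necessarily the calibration applied in this study:

```python
import math

def mgca_to_temp(mgca_mmol_mol, a=0.09, b=0.38):
    """Invert an exponential Mg/Ca calibration, Mg/Ca = b * exp(a * T),
    to an inferred calcification temperature in degrees C.
    (a, b are illustrative, commonly cited calibration constants.)"""
    return math.log(mgca_mmol_mol / b) / a

def temp_distribution(mgca_values):
    """Map individual-foram Mg/Ca measurements to a sorted temperature
    distribution; warm/cool tails would reflect El Nino / La Nina extremes."""
    return sorted(mgca_to_temp(x) for x in mgca_values)
```

    The spread and tails of the resulting per-species distribution are what get compared against modern seasonality and interannual variability.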

  15. DAS Microseismic and Strain Monitoring During Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Kahn, D.; Karrenbach, M. H.; Cole, S.; Boone, K.; Ridge, A.; Rich, J.; Langton, D.; Silver, K.

    2017-12-01

Hydraulic fracturing operations in unconventional subsurface reservoirs are typically monitored using geophones located either at the surface or in adjacent wellbores. A novel approach to recording hydraulic stimulations utilizes fiber-optic Distributed Acoustic Sensing (DAS). A fiber-optic cable was installed in a treatment well in a subsurface reservoir (Meramec formation). DAS data were recorded during fluid injection into the fiber-equipped well itself and also during injection into a nearby treatment well at a distance of 350 m. For both scenarios the DAS sensing array consisted of approximately 1000 channels with fine spatial and temporal sampling and a large sensing aperture; thus, the full strain wavefield is measured along the entire length of the borehole. A variety of physical effects, such as temperature, low-frequency strain, and microseismicity, were measured and correlated with the treatment program during hydraulic fracturing of the wells. These physical effects occur at various frequency scales and produce complementary measurements. Microseismic events in the magnitude range of -2.0 to -0.5, at a maximum distance of 500 m, were observed and analyzed for recordings from both the fiber-equipped treatment well and the neighboring treatment well. The analysis of this DAS data set demonstrates that current fiber-optic sensing technology provides enough sensitivity to detect a significant number of microseismic events, and that these events can be integrated with temperature and strain measurements for an improved description of the subsurface reservoir.

  16. Gift-giving and network structure in rural China: utilizing long-term spontaneous gift records.

    PubMed

    Chen, Xi

    2014-01-01

The tradition of keeping written records of gifts received during household ceremonies in many countries offers researchers an underutilized means of data collection for social network analysis. This paper first summarizes unique features of gift record data that circumvent five prevailing sampling and measurement issues in the literature, and discusses their advantages over existing studies based on previous data sources, at both the individual level and the dyadic link level. We then document our research project in rural China, which implements a multiple-wave census-type household survey and a long-term gift record collection. The pattern of gift-giving in major household social events, and its recent escalation, is analyzed. There are significantly positive correlations between gift network centrality and various forms of informal insurance. Finally, economic inequality and a competitive marriage market are among the main demographic and socioeconomic determinants of the observed gift network structure.

  17. Gift-Giving and Network Structure in Rural China: Utilizing Long-Term Spontaneous Gift Records

    PubMed Central

    Chen, Xi

    2014-01-01

The tradition of keeping written records of gifts received during household ceremonies in many countries offers researchers an underutilized means of data collection for social network analysis. This paper first summarizes unique features of gift record data that circumvent five prevailing sampling and measurement issues in the literature, and discusses their advantages over existing studies based on previous data sources, at both the individual level and the dyadic link level. We then document our research project in rural China, which implements a multiple-wave census-type household survey and a long-term gift record collection. The pattern of gift-giving in major household social events, and its recent escalation, is analyzed. There are significantly positive correlations between gift network centrality and various forms of informal insurance. Finally, economic inequality and a competitive marriage market are among the main demographic and socioeconomic determinants of the observed gift network structure. PMID:25111696

  18. Recognizing explosion sites with a self-organizing network for unsupervised learning

    NASA Astrophysics Data System (ADS)

    Tarvainen, Matti

    1999-06-01

A self-organizing neural network model has been developed for identifying mining explosion locations in different environments in Finland and adjacent areas. The main advantage of the method is its ability to automatically find a suitable network structure and to correctly identify explosions as such. Explosion site recognition was done using waveform attributes extracted from various kinds of event records from the small-aperture array FINESS in Finland. P-S phase arrival-time differences and rough azimuth estimates provided a first robust epicentre location; this, in turn, led to identification of the correct mining district, where more detailed tuning was performed using different phase-amplitude and signal-to-noise attributes. The explosions studied here originated in mines and quarries located in Finland, on the coast of Estonia, and in the St. Petersburg area, Russia. Although the Helsinki bulletins in 1995 and 1996 listed 1649 events in these areas, analysis was restricted to the 380 events (ML≥2) that also appeared in the reviewed event bulletins (REB) of the CTBTO/UN prototype international data centre (pIDC) in Arlington, VA, USA. These 380 events, with their different attributes, were selected for the learning stage. Because no `ground-truth' information was available, the mining `code' coordinates used earlier to compile the Helsinki bulletins were utilized instead. The novel self-organizing method was tested on 18 new event recordings from the same area in January-February 1997, of which 15 were connected to the correct mines. The three misconnected events were those that did not have all matching attributes in the self-organizing map (SOM) network.
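The core of a self-organizing map — units on a fixed grid pulled toward the data, with a shrinking neighborhood — can be sketched in a few lines. Everything below (the attribute vectors, grid size, learning schedule) is invented for the sketch and has nothing to do with the FINESS attributes or the network actually used in this study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "waveform attribute" vectors from three well-separated sources
# (illustrative clusters, not real array data).
centers = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
X = np.vstack([c + 0.3 * rng.standard_normal((40, 2)) for c in centers])

# Minimal 1-D self-organizing map: a line of units adapts to the data.
n_units = 6
W = rng.standard_normal((n_units, 2))
for t in range(2000):
    x = X[rng.integers(len(X))]
    bmu = int(np.argmin(((W - x) ** 2).sum(axis=1)))  # best-matching unit
    lr = 0.5 * (1 - t / 2000)                         # decaying learning rate
    sigma = max(1.0 * (1 - t / 2000), 0.1)            # shrinking neighborhood
    d = np.abs(np.arange(n_units) - bmu)              # grid distance to BMU
    h = np.exp(-(d ** 2) / (2 * sigma ** 2))          # neighborhood weights
    W += lr * h[:, None] * (x - W)                    # pull units toward x

def classify(x):
    """Assign an event to its best-matching unit (a stand-in for a mining district)."""
    return int(np.argmin(((W - x) ** 2).sum(axis=1)))
```

After training, events with similar attributes land on the same or neighboring units, which is the property the recognition scheme relies on.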

  19. HIGH SPEED KERR CELL FRAMING CAMERA

    DOEpatents

    Goss, W.C.; Gilley, L.F.

    1964-01-01

The present invention relates to a high speed camera utilizing a Kerr cell shutter and a novel optical delay system having no moving parts. The camera can selectively photograph at least 6 frames within 9 x 10^-8 seconds during any such time interval of an occurring event. The invention utilizes in particular an optical system which views and transmits 6 images of an event to a multi-channeled optical delay relay system. The delay relay system has optical paths of successively increased length, in whole multiples of the first channel's optical path length, into which the 6 images are transmitted. The successively delayed images are accepted from the exit of the delay relay system by an optical image focusing means, which in turn directs the images into a Kerr cell shutter disposed to intercept the image paths. A camera is disposed to simultaneously view and record the 6 images during a single exposure of the Kerr cell shutter. (AEC)

  20. Super instrumental El Niño events recorded by a Porites coral from the South China Sea

    NASA Astrophysics Data System (ADS)

    Wang, Xijie; Deng, Wenfeng; Liu, Xi; Wei, Gangjian; Chen, Xuefei; Zhao, Jian-xin; Cai, Guanqiang; Zeng, Ti

    2018-03-01

The 2-7-year periodicities recorded in fossil coral records have been widely used to identify paleo-El Niño events. However, the reliability of this approach in the South China Sea (SCS) has not been assessed in detail. Therefore, this paper presents monthly resolution geochemical records covering the period 1978-2015, obtained from a Porites coral recovered from the SCS, to test the reliability of this method. The results suggest that the SCS coral reliably recorded local seawater conditions and the super El Niño events that occurred over the past 3 decades, but does not appear to have been sensitive enough to record all the other El Niños. In detail, the Sr/Ca series distinctly documents only the two super El Niños of 1997-1998 and 2014-2016 as obvious low values, and does not match the Oceanic Niño Index well. The super El Niño of 1982-1983 was identified by the growth hiatus caused by coral bleaching and the subsequent death of the coral. Three distinct stepwise variations occur in the δ13C series that are coincident with the three super El Niños; these may be related to a substantial decline in endosymbiotic zooxanthellae density caused by the increase in temperature during an El Niño, or to the selective utilization of the different zooxanthellae required to survive in the extreme environment. The increase in rainfall and temperatures over the SCS during El Niños counteracts the effects on seawater δ18O (δ18Osw) and salinity; consequently, coral Δδ18O series can be used as a proxy for δ18Osw and salinity, but are not appropriate for identifying El Niño activity. The findings presented here suggest that identifying paleo-El Niño activity from the 2-7-year periodicities preserved in SCS coral records might not be reliable, because the SCS lies on the edge of El Niño anomalies, at a great distance from the central equatorial Pacific, and the imprints of weak and medium strength El Niño events may not be recorded by the corals there.

  1. A piezoelectric film-based intrasplint detection method for bruxism.

    PubMed

    Takeuchi, H; Ikeda, T; Clark, G T

    2001-08-01

    An accurate, easy-to-use, long-term method other than EMG is needed to monitor bruxism. This article presents pilot data on the reproducibility, validity, and utility of an intrasplint piezoelectric film method. Simulated bruxism behaviors (steady-state and rhythmic clenching, grinding, and tapping) in 5 subjects were recorded with the use of both masseter EMG and an intrasplint piezoelectric film method. Correlation coefficients calculated for simulated bruxism event duration with the use of a masseter EMG or an intrasplint piezoelectric film method were 0.99 for tapping and steady-state clenching, 0.96 for rhythmic clenching, and 0.79 for grinding. Piezoelectric film has its limitations and does not faithfully capture sustained force magnitudes. However, for the target behaviors associated with bruxism (tooth grinding, clenching, and tapping), it appears to faithfully reproduce above-baseline events with durations statistically indistinguishable from those recorded with masseter EMG. Masseter EMG was poorest at detecting a simulated side-to-side grinding behavior.
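The reported agreement between the two methods is an ordinary Pearson correlation of per-event durations. A minimal sketch, with made-up duration values standing in for the study's measurements:

```python
import math

# Simulated durations (seconds) of the same simulated-bruxism events as
# measured by the two methods; values are illustrative, not the study's data.
emg_dur  = [1.2, 0.8, 2.5, 1.9, 0.5, 3.1, 1.1]   # masseter EMG
film_dur = [1.3, 0.7, 2.4, 2.0, 0.6, 3.0, 1.0]   # intrasplint piezo film

def pearson_r(x, y):
    """Plain Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(emg_dur, film_dur)
```

A correlation near 1, as reported for tapping and clenching, indicates the film method tracks event duration as faithfully as EMG for those behaviors.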

  2. A marked point process approach for identifying neural correlates of tics in Tourette Syndrome.

    PubMed

    Loza, Carlos A; Shute, Jonathan B; Principe, Jose C; Okun, Michael S; Gunduz, Aysegul

    2017-07-01

We propose a novel interpretation of local field potentials (LFP) based on a marked point process (MPP) framework that models relevant neuromodulations as shifted, weighted versions of prototypical temporal patterns. In particular, the MPP samples are categorized according to the well-known oscillatory rhythms of the brain in an effort to elucidate spectrally specific behavioral correlates. The result is a transient model for LFP. We exploit data-driven techniques to fully estimate the model parameters, with the added feature of exceptional temporal resolution of the resulting events. We utilize the learned features in the alpha and beta bands to assess correlations to tic events in patients with Tourette Syndrome (TS). The final results show stronger coupling between LFP recorded from the centromedian-parafascicular complex of the thalamus and the tic marks than between the tic marks and electrocorticogram (ECoG) recordings from the hand area of the primary motor cortex (M1), in terms of the area under the receiver operating characteristic (ROC) curve (AUC).
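The AUC comparison used to quantify coupling can be computed directly from its probabilistic definition: the chance that a feature value drawn near a tic exceeds one drawn from baseline (the Mann-Whitney formulation). The feature values below are made up for illustration.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a positive sample scores above a negative,
    computed by exhaustive pairwise comparison (Mann-Whitney U / (n_pos*n_neg))."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Illustrative band-limited feature magnitudes around tic marks vs. baseline
# segments (invented numbers, not the study's data).
near_tic = [0.9, 0.8, 0.75, 0.6, 0.95]
baseline = [0.3, 0.5, 0.4, 0.65, 0.2]
auc = roc_auc(near_tic, baseline)
```

An AUC of 0.5 means the feature carries no information about tics; values approaching 1 indicate strong coupling, which is the sense in which thalamic LFP outperformed M1 ECoG here.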

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog’s inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity), where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
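A standard form of the nearest-neighbor proximity for earthquake clustering (Zaliapin-style) combines inter-event time, epicentral distance raised to the fractal dimension, and a magnitude weighting. The toy catalog and the b-value / fractal-dimension parameters below are illustrative assumptions, not the values fitted for Oklahoma.

```python
import math

# Toy catalog: (time in days, x, y in km, magnitude). Values are made up.
catalog = [
    (0.0,  0.0,  0.0, 3.0),
    (1.0,  1.0,  0.5, 2.0),   # close in space and time to event 0
    (50.0, 80.0, 60.0, 2.5),  # far from everything
]

B, DF = 1.0, 1.6  # b-value and fractal dimension (illustrative parameters)

def nn_proximity(catalog):
    """For each event, the proximity eta = t * r**DF * 10**(-B*m) to its
    nearest earlier neighbor, plus that neighbor's index; (None, None) for
    the first event."""
    out = []
    for j, (tj, xj, yj, _) in enumerate(catalog):
        best, parent = None, None
        for i, (ti, xi, yi, mi) in enumerate(catalog[:j]):
            t = tj - ti                      # inter-event time
            r = math.hypot(xj - xi, yj - yi) # epicentral distance
            eta = t * (r ** DF) * 10 ** (-B * mi)
            if best is None or eta < best:
                best, parent = eta, i
        out.append((best, parent))
    return out

links = nn_proximity(catalog)
```

Small proximities identify statistically correlated (clustered) pairs; thresholding eta separates clustered from background events.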

  4. Dynamic Creation of Social Networks for Syndromic Surveillance Using Information Fusion

    NASA Astrophysics Data System (ADS)

    Holsopple, Jared; Yang, Shanchieh; Sudit, Moises; Stotz, Adam

To enhance the effectiveness of health care, many medical institutions have started transitioning to electronic health and medical records and sharing these records between institutions. The large amount of complex and diverse data makes it difficult to identify and track relationships and trends, such as disease outbreaks, from the data points. INFERD: Information Fusion Engine for Real-Time Decision-Making is an information fusion tool that dynamically correlates and tracks event progressions. This paper presents a methodology that utilizes the efficient and flexible structure of INFERD to create social networks representing progressions of disease outbreaks. Individual symptoms are treated as features, allowing multiple hypotheses to be tracked and analyzed for effective and comprehensive syndromic surveillance.

  5. Accessibility assessment of Houston's roadway network during Harvey through integration of observed flood impacts and hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Gidaris, I.; Gori, A.; Panakkal, P.; Padgett, J.; Bedient, P. B.

    2017-12-01

The record-breaking rainfall produced over the Houston region by Hurricane Harvey resulted in catastrophic and unprecedented impacts on the region's infrastructure. Notably, Houston's transportation network was crippled, with almost every major highway flooded during the five-day event. Entire neighborhoods and subdivisions were inundated, rendering them completely inaccessible to rescue crews and emergency services. Harvey has tragically highlighted the vulnerability of major thoroughfares, as well as neighborhood roads, to severe inundation during extreme precipitation events. Furthermore, it has emphasized the need for detailed accessibility characterization of road networks under extreme event scenarios in order to determine which areas of the city are most vulnerable. This analysis assesses and tracks the accessibility of Houston's major highways during Harvey's evolution by utilizing road flood/closure data from the Texas DOT. In the absence of flood/closure data for local roads, a hybrid approach is adopted that utilizes a physics-based hydrologic model to produce high-resolution inundation estimates for selected urban watersheds in the Houston area. In particular, hydrologic output in the form of inundation depths is used to estimate the operability of local roads. Ultimately, integration of hydrologic-based estimation of road conditions with observed data from DOT supports a network accessibility analysis of selected urban neighborhoods. This accessibility analysis can identify operable routes for emergency response (rescue crews, medical services, etc.) during the storm event.
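At its core, accessibility analysis of this kind is graph reachability with flooded segments removed. A minimal sketch with an invented toy road network (the node names, closures, and graph structure are hypothetical, not Houston data):

```python
from collections import deque

# Toy road network: node -> list of adjacent nodes (all names are made up).
roads = {
    "hospital": ["A", "B"],
    "A": ["hospital", "B", "neighborhood1"],
    "B": ["hospital", "A", "neighborhood2"],
    "neighborhood1": ["A"],
    "neighborhood2": ["B"],
}

# Segments reported closed or modeled as inundated above a passability
# threshold (both directions listed).
flooded = {("B", "neighborhood2"), ("neighborhood2", "B")}

def reachable(graph, source, closed_edges):
    """Breadth-first search over non-flooded segments only."""
    seen, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if (u, v) not in closed_edges and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

ok = reachable(roads, "hospital", flooded)
```

Re-running the search as the closure set evolves through the storm tracks which neighborhoods emergency services can still reach at each time step.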

  6. CIFAR10-DVS: An Event-Stream Dataset for Object Classification

    PubMed Central

    Li, Hongmin; Liu, Hanchao; Ji, Xiangyang; Li, Guoqi; Shi, Luping

    2017-01-01

Neuromorphic vision research requires high-quality and appropriately challenging event-stream datasets to support continuous improvement of algorithms and methods. However, creating event-stream datasets is a time-consuming task, since they need to be recorded using neuromorphic cameras, and there are currently few event-stream datasets available. In this work, by utilizing the popular computer vision dataset CIFAR-10, we converted 10,000 frame-based images into 10,000 event streams using a dynamic vision sensor (DVS), providing an event-stream dataset of intermediate difficulty in 10 different classes, named “CIFAR10-DVS.” The conversion to an event-stream dataset was implemented by a repeated closed-loop smooth (RCLS) movement of the frame-based images. Unlike conversions that move the camera, moving the image is more realistic with respect to practical applications. The repeated closed-loop image movement generates rich local intensity changes in continuous time, which are quantized by each pixel of the DVS camera to generate events. Furthermore, a performance benchmark in event-driven object classification is provided based on state-of-the-art classification algorithms. This work provides a large event-stream dataset and an initial benchmark for comparison, which may boost algorithm developments in event-driven pattern recognition and object classification. PMID:28611582

  7. CIFAR10-DVS: An Event-Stream Dataset for Object Classification.

    PubMed

    Li, Hongmin; Liu, Hanchao; Ji, Xiangyang; Li, Guoqi; Shi, Luping

    2017-01-01

Neuromorphic vision research requires high-quality and appropriately challenging event-stream datasets to support continuous improvement of algorithms and methods. However, creating event-stream datasets is a time-consuming task, since they need to be recorded using neuromorphic cameras, and there are currently few event-stream datasets available. In this work, by utilizing the popular computer vision dataset CIFAR-10, we converted 10,000 frame-based images into 10,000 event streams using a dynamic vision sensor (DVS), providing an event-stream dataset of intermediate difficulty in 10 different classes, named "CIFAR10-DVS." The conversion to an event-stream dataset was implemented by a repeated closed-loop smooth (RCLS) movement of the frame-based images. Unlike conversions that move the camera, moving the image is more realistic with respect to practical applications. The repeated closed-loop image movement generates rich local intensity changes in continuous time, which are quantized by each pixel of the DVS camera to generate events. Furthermore, a performance benchmark in event-driven object classification is provided based on state-of-the-art classification algorithms. This work provides a large event-stream dataset and an initial benchmark for comparison, which may boost algorithm developments in event-driven pattern recognition and object classification.
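A common first step when feeding such event streams to frame-based classifiers (a generic technique, not the RCLS conversion described above) is to accumulate events into a 2-D frame by summing signed polarities per pixel. The event tuples below are invented for the sketch.

```python
# Each DVS event is (x, y, timestamp, polarity); values here are made up.
W = H = 4
events = [(0, 0, 10, +1), (0, 0, 12, +1), (3, 2, 15, -1), (1, 1, 20, +1)]

def accumulate(events, w, h):
    """Sum signed event polarities into an h-by-w frame (row = y, col = x)."""
    frame = [[0] * w for _ in range(h)]
    for x, y, _t, p in events:
        frame[y][x] += p
    return frame

frame = accumulate(events, W, H)
```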

  8. The global event system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winans, J.

    The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done by the use of four new record types. These records are customized and are only supported by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with the vanilla EPICS events and the associated event records. They are very different.

  9. Infrasonic Detection of a Large Bolide over South Sulawesi, Indonesia on October 8, 2009: Preliminary Results

    NASA Technical Reports Server (NTRS)

    Silber, E. A.; Brown, P. G.; Le Pinchon, A.

    2011-01-01

In the morning hours of October 8, 2009, a bright object entered Earth's atmosphere over South Sulawesi, Indonesia. This bolide disintegrated above the ground, generating stratospheric infrasound returns that were detected by infrasonic stations of the global International Monitoring System (IMS) Network of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) at distances up to 17 500 km. Here we present instrumental recordings and preliminary results of this extraordinary event. Using the infrasonic period-yield relations originally derived for atmospheric nuclear detonations, we find the most probable source energy for this bolide to be 70+/-20 kt TNT equivalent explosive yield. A unique aspect of this event is the fact that it was apparently detected by infrasound only. Global events of such magnitude are expected only once per decade and can be utilized to calibrate infrasonic location and propagation tools on a global scale, to evaluate energy-yield formulas, and to constrain event timing.

  10. Automated Sensor Tuning for Seismic Event Detection at a Carbon Capture, Utilization, and Storage Site, Farnsworth Unit, Ochiltree County, Texas

    NASA Astrophysics Data System (ADS)

    Ziegler, A.; Balch, R. S.; Knox, H. A.; Van Wijk, J. W.; Draelos, T.; Peterson, M. G.

    2016-12-01

We present results (e.g., seismic detections and STA/LTA detection parameters) from a continuous downhole seismic array in the Farnsworth Field, an oil field in northern Texas that hosts an ongoing carbon capture, utilization, and storage project. Specifically, we evaluate data from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. This detection database is directly compared to ancillary data (i.e., wellbore pressure) to determine if there is any relationship between seismic observables and CO2 injection and pressure maintenance in the field. Of particular interest is detection of relatively low-amplitude signals constituting long-period long-duration (LPLD) events that may be associated with slow shear slip, analogous to low-frequency tectonic tremor. While this category of seismic event provides great insight into the dynamic behavior of the pressurized subsurface, it is inherently difficult to detect. To automatically detect seismic events using effective data processing parameters, an automated sensor tuning (AST) algorithm developed by Sandia National Laboratories is being utilized. AST exploits ideas from neuro-dynamic programming (reinforcement learning) to automatically self-tune and determine optimal detection parameter settings. AST adapts in near real-time to changing conditions and automatically self-tunes a signal detector to identify (detect) only signals from events of interest, leading to a reduction in the number of missed legitimate event detections and the number of false event detections. Funding for this project is provided by the U.S. Department of Energy's (DOE) National Energy Technology Laboratory (NETL) through the Southwest Regional Partnership on Carbon Sequestration (SWP) under Award No. DE-FC26-05NT42591. Additional support has been provided by site operator Chaparral Energy, L.L.C. and Schlumberger Carbon Services.
Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
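The STA/LTA detection parameters mentioned above (window lengths, trigger threshold) are exactly the quantities a tuning scheme like AST would adjust. A minimal classic STA/LTA trigger on a synthetic trace, with all numbers chosen for illustration:

```python
def sta_lta(trace, n_sta, n_lta):
    """Short-term / long-term average ratio of |amplitude| at each sample
    (simple trailing moving averages; zeros until the LTA window is full)."""
    a = [abs(x) for x in trace]
    out = [0.0] * len(a)
    for i in range(n_lta, len(a)):
        sta = sum(a[i - n_sta:i]) / n_sta
        lta = sum(a[i - n_lta:i]) / n_lta
        out[i] = sta / lta if lta > 0 else 0.0
    return out

# Quiet background with a burst (a stand-in "microseismic event") in the middle.
trace = [0.1] * 200 + [2.0] * 20 + [0.1] * 200
ratio = sta_lta(trace, n_sta=5, n_lta=50)

threshold = 3.0   # the kind of parameter a self-tuning detector would optimize
triggers = [i for i, r in enumerate(ratio) if r > threshold]
```

Too low a threshold floods the catalog with false detections; too high misses weak LPLD-type signals, which is why automated tuning of these parameters matters.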

  11. On the Information Content of Program Traces

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

Program traces are used for analysis of program performance, memory utilization, and communications, as well as for program debugging. A trace contains records of execution events generated by monitoring units inserted into the program. The trace size limits the resolution of execution events and restricts the user's ability to analyze the program execution. We present a study of the information content of program traces and develop a coding scheme which reduces the trace size toward the limit given by the trace entropy. We apply the coding to the traces of AIMS-instrumented programs executed on the IBM SP2 and the SGI Power Challenge and compare it with other coding methods. Our technique shows that the size of the trace can be reduced by more than a factor of 5.
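The entropy limit invoked here is the empirical Shannon entropy of the trace's event records: no lossless coding can use fewer bits per record on average. A sketch with an invented symbol trace (the letters stand in for event types; this is not the AIMS format):

```python
import math
from collections import Counter

# Toy trace of execution-event records; symbols are hypothetical event types.
trace = list("ssrcccsrccccsrcccsrc")

def entropy_bits_per_record(symbols):
    """Empirical Shannon entropy H = -sum p*log2(p): the lower bound, in bits
    per record, approachable by any lossless coding of the trace."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

h = entropy_bits_per_record(trace)

# A naive fixed-width code spends ceil(log2(alphabet size)) bits per record.
naive_bits = math.ceil(math.log2(len(set(trace))))
```

The gap between `naive_bits` and `h` (and, for real traces, the much larger gap to their verbose on-disk record format) is the compression headroom an entropy-aware coding scheme exploits.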

  12. Monitoring of the Cowpea Bruchid, Callosobruchus maculatus (Coleoptera: Bruchidae), Feeding Activity in Cowpea Seeds: Advances in Sensing Technologies Reveals New Insights.

    PubMed

    Bittner, James A; Balfe, Susan; Pittendrigh, Barry R; Popovics, John S

    2018-05-28

Cowpea provides a significant source of protein for over 200 million people in Sub-Saharan Africa. The cowpea bruchid, Callosobruchus maculatus (F) (Coleoptera: Bruchidae), is a major pest of cowpea, as the larval stage attacks stored cowpea grains, causing postharvest loss. Cowpea bruchid larvae spend all their time feeding within the cowpea seed. Past research findings, published over 25 yr ago, have shown that the feeding activity of several bruchids within a cowpea seed emits mechanical vibrations in the frequency range 5-75 kHz. This work led to the development of monitoring technologies that are important for both basic research and practical application. Here, we use newer and significantly improved technologies to re-explore the nature of the vibration signals produced by an individual C. maculatus when it feeds in cowpea seeds. Utilizing broadband frequency sensing, the feeding activities (vibration events) of individual fourth-instar bruchid larvae were recorded to identify specific key emission frequencies. Verification of recorded events and their association with actual feeding activities was achieved through mass measurements over 24 h for a series of replicates. The measurements identified variable peak event emission frequencies across the replicate sample set, ranging from 16.4 to 26.5 kHz. A positive correlation between the number of events recorded and the measured mass loss of the cowpea seed was observed. The procedure and verification reported in this work provide an improved basis for laboratory-based monitoring of single larval feeding. From the rich dataset captured, additional analysis can be carried out to identify new key variables of hidden bruchid larval activity.
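Identifying a peak emission frequency amounts to finding the dominant bin of the recorded signal's spectrum. A minimal sketch on a synthetic tone (the sampling rate, tone frequency, and naive O(n²) DFT are all illustrative; real analysis would use an FFT on broadband recordings):

```python
import math

fs = 100_000        # sampling rate (Hz), illustrative
n = 256
f0 = 50 * fs / n    # a tone placed exactly on DFT bin 50 (~19.5 kHz,
                    # inside the 16.4-26.5 kHz band reported above)
sig = [math.sin(2 * math.pi * f0 * t / fs) for t in range(n)]

def peak_frequency(x, fs):
    """Frequency of the largest-magnitude positive DFT bin (naive DFT)."""
    n = len(x)
    best_k, best_mag = 1, -1.0
    for k in range(1, n // 2):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

f_peak = peak_frequency(sig, fs)
```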

  13. Numerical modeling of an intense precipitation event and its associated lightning activity over northern Greece

    NASA Astrophysics Data System (ADS)

    Pytharoulis, I.; Kotsopoulos, S.; Tegoulias, I.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2016-03-01

This study investigates an intense precipitation event, and its associated lightning activity, that affected northern Greece and primarily Thessaloniki on 15 July 2014. The precipitation measurement of 98.5 mm in 15 h at the Aristotle University of Thessaloniki set a new absolute record maximum. The thermodynamic analysis indicated that the event took place in an environment that could support deep thunderstorm activity. The development of this intense event was associated with significant low-level convergence and upper-level divergence, even before its triggering, and a positive vertical gradient of relative vorticity advection. The high-resolution (1.667 km × 1.667 km) non-hydrostatic WRF-ARW numerical weather prediction model was used to simulate this intense precipitation event, while the Lightning Potential Index was utilized to calculate the potential for lightning activity. Sensitivity experiments suggested that although the strong synoptic forcing assumed the primary role in the occurrence of intense precipitation and lightning activity, their spatiotemporal variability was affected by topography. The application of the very fine resolution topography of the NASA Shuttle Radar Topographic Mission improved the simulated precipitation and the calculated lightning potential.

  14. Managing Expectations: Results from Case Studies of US Water Utilities on Preparing for, Coping with, and Adapting to Extreme Events

    NASA Astrophysics Data System (ADS)

    Beller-Simms, N.; Metchis, K.

    2014-12-01

Water utilities, reeling from the increased impacts of successive extreme events such as floods, droughts, and derechos, are taking a more proactive role in preparing for future incursions. A recent study by Federal and water foundation investigators reveals how six US water utilities and their regions prepared for, responded to, and coped with recent extreme weather and climate events, and the lessons they are using to plan future adaptation and resilience activities. Two case studies will be highlighted. (1) Sonoma County, CA, has had alternating floods and severe droughts. In 2009, this area, home to competing water users (agricultural crops, wineries, tourism, and fisheries), faced a three-year drought accompanied at its end by intense frosts. Competing uses of water threatened the grape harvest, endangered the fish industry, and resulted in a series of regulations and court cases. Five years later, new efforts by partners across the entire watershed have identified mutual opportunities for increased basin sustainability in the face of a changing climate. (2) Washington DC had a derecho in late June 2012, which curtailed water, communications, and power delivery during a record heat spell that impacted hundreds of thousands of residents and lasted over the height of the tourist-intensive July 4th holiday. Lessons from this event were applied three months later in anticipation of approaching Superstorm Sandy. This study will help other communities improve their resiliency in the face of future climate extremes. For example, the study revealed that (1) communities are planning for multiple types and occurrences of extreme events, which are becoming more severe and frequent and are impacting communities that are expanding into more vulnerable areas, and (2) decisions by one sector cannot be made in a vacuum and require the scientific, sectoral, and citizen communities to work toward sustainable solutions.

  15. Dark Fiber and Distributed Acoustic Sensing: Applications to Monitoring Seismicity and Near-Surface Properties

    NASA Astrophysics Data System (ADS)

    Ajo Franklin, J. B.; Lindsey, N.; Dou, S.; Freifeld, B. M.; Daley, T. M.; Tracy, C.; Monga, I.

    2017-12-01

    "Dark Fiber" refers to the large number of fiber-optic lines installed for telecommunication purposes but not currently utilized. With the advent of distributed acoustic sensing (DAS), these unused fibers have the potential to become a seismic sensing network with unparalleled spatial extent and density with applications to monitoring both natural seismicity as well as near-surface soil properties. While the utility of DAS for seismic monitoring has now been conclusively shown on built-for-purpose networks, dark fiber deployments have been challenged by the heterogeneity of fiber installation procedures in telecommunication as well as access limitations. However, the potential of telecom networks to augment existing broadband monitoring stations provides a strong incentive to explore their utilization. We present preliminary results demonstrating the application of DAS to seismic monitoring on a 20 km run of "dark" telecommunications fiber between West Sacramento, CA and Woodland CA, part of the Dark Fiber Testbed maintained by the DOE's ESnet user facility. We show a small catalog of local and regional earthquakes detected by the array and evaluate fiber coupling by using variations in recorded frequency content. Considering the low density of broadband stations across much of the Sacramento Basin, such DAS recordings could provide a crucial data source to constrain small-magnitude local events. We also demonstrate the application of ambient noise interferometry using DAS-recorded waveforms to estimate soil properties under selected sections of the dark fiber transect; the success of this test suggests that the network could be utilized for environmental monitoring at the basin scale. The combination of these two examples demonstrates the exciting potential for combining DAS with ubiquitous dark fiber to greatly extend the reach of existing seismic monitoring networks.
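Ambient-noise interferometry rests on a simple operation: cross-correlating noise recorded at two channels recovers the inter-channel travel time. A minimal sketch in which one synthetic channel is just a delayed copy of the other (real DAS noise is far messier; all numbers are illustrative):

```python
import random

random.seed(1)
n, lag = 400, 7
noise = [random.gauss(0, 1) for _ in range(n + lag)]
cha = noise[:n]              # channel A
chb = noise[lag:lag + n]     # channel B: same wavefield, `lag` samples later

def best_lag(a, b, max_lag):
    """Lag (in samples) maximizing the cross-correlation of a with b."""
    best, best_val = 0, float("-inf")
    for L in range(-max_lag, max_lag + 1):
        v = sum(a[i] * b[i - L]
                for i in range(max(L, 0), min(len(a), len(b) + L)))
        if v > best_val:
            best, best_val = L, v
    return best

delay = best_lag(cha, chb, 20)
```

Stacked over long noise records, such correlations approximate the inter-channel Green's function, whose travel times constrain the soil velocity (and hence soil properties) beneath the fiber.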

  16. Estuarine Facies Model Revisited: Conceptual Model of Estuarine Sediment Dynamics During Non-Equilibrium Conditions

    NASA Astrophysics Data System (ADS)

    Elliott, E. A.; Rodriguez, A. B.; McKee, B. A.

    2017-12-01

    Traditional models of estuarine systems show deposition occurs primarily within the central basin. There, accommodation space is high within the deep central valley, which is below regional wave base and where current energy is presumed to reach a relative minimum, promoting direct deposition of cohesive sediment and minimizing erosion. However, these models often reflect long-term (decadal-millennial) timescales, where accumulation rates are in relative equilibrium with the rate of relative sea-level rise, and lack the resolution to capture shorter term changes in sediment deposition and erosion within the central estuary. This work presents a conceptual model for estuarine sedimentation during non-equilibrium conditions, where high-energy inputs to the system reach a relative maximum in the central basin, resulting in temporary deposition and/or remobilization over sub-annual to annual timescales. As an example, we present a case study of Core Sound, NC, a lagoonal estuarine system where the regional base-level has been reached, and sediment deposition, resuspension and bypassing is largely a result of non-equilibrium, high-energy events. Utilizing a 465 cm-long sediment core from a mini-basin located between Core Sound and the continental shelf, a 40-year sub-annual chronology was developed for the system, with sediment accumulation rates (SAR) interpolated to a monthly basis over the 40-year record. This study links erosional processes in the estuary directly with sediment flux to the continental shelf, taking advantage of the highly efficient sediment trapping capability of the mini-basin. The SAR record indicates high variation in the estuarine sediment supply, with peaks in the SAR record at a recurrence interval of 1 year (+/- 0.25). This record has been compared to historical storm influence for the area. Through this multi-decadal record, sediment flushing events occur at a much more frequent interval than previously thought (i.e. 
annual rather than decadal timescales). This non-equilibrium estuarine model highlights the role of moderate-energy events that impact the coast at least every year, in addition to less frequent, high-energy events on decadal to millennial timescales, in modulating sediment and particulate-matter erosion and transport through the estuary and delivery to the continental shelf.

  17. The Incidence, Nature and Consequences of Adverse Events in Iranian Hospitals.

    PubMed

    Akbari Sari, Ali; Doshmangir, Leila; Torabi, Fereshteh; Rashidian, Arash; Sedaghat, Mojtaba; Ghomi, Robabeh; Prasopa-Plaizier, Nittita

    2015-12-01

    Adverse events are relatively common in healthcare, leading to extensive harm to patients and a significant drain on healthcare resources. Identifying the extent, nature and consequences of adverse events is an important step in preventing them, and is the subject of this study. This is a retrospective review of medical records randomly selected from patients admitted to 4 general hospitals, staying more than 24 hours, and discharged between April and September 2012. We randomly selected 1200 records and completed the record review for 1162 of them. Standard forms (RF1 and RF2) were used to review medical records in two stages by nurses and medical doctors. Eighty-five (7.3%) of the 1162 records had an adverse event during the admission; and in 43 (3.7%) of the 1162 records, the patient was admitted to the hospital due to an adverse event that occurred before the admission. Therefore, a total of 128 (11.0%) adverse events occurred in 126 (10.9%) records, as two patients had more than one adverse event. Forty-four (34.3%) of these 128 adverse events were considered preventable. This study confirms that adverse events, particularly adverse drug reactions, post-operative infections, bedsores and hospital-acquired infections, are common and potentially preventable sources of harm to patients in Iranian hospitals.

  18. Evaluating Monitoring Strategies to Detect Precipitation-Induced Microbial Contamination Events in Karstic Springs Used for Drinking Water

    PubMed Central

    Besmer, Michael D.; Hammes, Frederik; Sigrist, Jürg A.; Ort, Christoph

    2017-01-01

    Monitoring of microbial drinking water quality is a key component of ensuring safety and understanding risk, but conventional monitoring strategies are typically based on low sampling frequencies (e.g., quarterly or monthly). This is of concern because many drinking water sources, such as karstic springs, are often subject to changes in bacterial concentrations on much shorter time scales (e.g., hours to days), for example after precipitation events. Microbial contamination events are crucial from a risk assessment perspective and should therefore be targeted by monitoring strategies to establish both the frequency of their occurrence and the magnitude of bacterial peak concentrations. In this study, we used monitoring data from two specific karstic springs. We assessed the performance of conventional monitoring based on historical records and tested a number of alternative strategies based on a high-resolution data set of bacterial concentrations in spring water collected with online flow cytometry (FCM). We quantified the effect of increasing sampling frequency and found that, for the specific case studied, at least bi-weekly sampling would be needed to detect precipitation events with a probability of >90%. We then proposed an optimized monitoring strategy with three targeted samples per event, triggered by precipitation measurements. This approach is more effective and efficient than simply increasing the overall sampling frequency. It would enable the water utility to (1) analyze any relevant event and (2) limit median underestimation of peak concentrations to approximately 10%. We conclude with a generalized perspective on sampling optimization and argue that the assessment of short-term dynamics causing microbial peak loads initially requires increased sampling/analysis effort, but can be optimized subsequently to account for limited resources. 
This offers water utilities and public health authorities systematic ways to evaluate and optimize their current monitoring strategies. PMID:29213255
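
    The trade-off between routine sampling interval and event detectability that this study quantifies can be illustrated with a toy model: a contamination event of fixed duration and fixed-interval sampling with a random phase. This is only an illustrative sketch under those stated assumptions, not the authors' FCM-based analysis; the analytic answer is min(1, duration / interval):

```python
import random

def hit_probability(duration, interval, trials=100_000, seed=1):
    """Monte Carlo estimate of the probability that fixed-interval sampling
    (with a random phase relative to the event) takes at least one sample
    during a contamination event of the given duration (same time units)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Time from event onset to the next scheduled sample
        phase = rng.uniform(0, interval)
        if phase <= duration:
            hits += 1
    return hits / trials

# A 2-day contamination peak versus three routine sampling intervals (days);
# the analytic value is min(1, duration / interval)
for interval in (1, 4, 30):
    print(interval, round(hit_probability(2, interval), 2))
```

    Monthly sampling (interval 30) catches a 2-day event less than 10% of the time in this toy model, which illustrates why low-frequency routine monitoring misses short contamination peaks and why event-triggered sampling is attractive.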

  20. A retrospective review of safety using a nursing driven protocol for autonomic dysreflexia in patients with spinal cord injuries.

    PubMed

    Solinsky, Ryan; Svircev, Jelena N; James, Jennifer J; Burns, Stephen P; Bunnell, Aaron E

    2016-11-01

    Autonomic dysreflexia is a potentially life-threatening condition which afflicts a significant proportion of individuals with spinal cord injuries (SCI). To date, the safety and efficacy of several commonly used interventions for this condition have not been studied. We conducted a retrospective chart review of the safety of a previously implemented nursing-driven inpatient autonomic dysreflexia protocol. Seventy-eight male patients with SCI who experienced autonomic dysreflexia while inpatients at our Veterans Affairs SCI unit over a 3.5-year period were included. The safety of a nursing-driven protocol utilizing conservative measures, nitroglycerin paste, and oral hydralazine was evaluated. Outcome measures were the occurrence of adverse events and relative hypotensive events during all episodes treated with the protocol, and the efficacy of attaining target blood pressure for all episodes with protocol adherence and for the initial episode experienced by each patient. Four hundred forty-five episodes of autonomic dysreflexia were recorded in the study period, with 92% adherence to the protocol. When the protocol was followed, target blood pressure was achieved for 97.6% of all episodes. Twenty-three total adverse events occurred (5.2% of all episodes). All adverse events were due to hypotension, and only 0.9% required interventions beyond clinical monitoring. Of the patients' initial autonomic dysreflexia episodes, 97.3% resolved using the protocol without need for further escalation of care. This inpatient nursing-driven protocol for treating autonomic dysreflexia utilizing conservative measures, nitroglycerin paste and oral hydralazine achieved target blood pressure with a high success rate and a low incidence of adverse events.

  1. Quantifying the utilization of medical devices necessary to detect postmarket safety differences: A case study of implantable cardioverter defibrillators.

    PubMed

    Bates, Jonathan; Parzynski, Craig S; Dhruva, Sanket S; Coppi, Andreas; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Shaw, Richard E; Warner, Frederick; Krumholz, Harlan M; Ross, Joseph S

    2018-06-12

    To estimate the medical device utilization needed to detect safety differences among implantable cardioverter-defibrillator (ICD) generator models and compare these estimates to utilization in practice. We conducted repeated sample size estimates to calculate the medical device utilization needed, systematically varying device-specific safety event rate ratios and significance levels while maintaining 80% power, testing 3 average adverse event rates (3.9, 6.1, and 12.6 events per 100 person-years) estimated from the American College of Cardiology's 2006 to 2010 National Cardiovascular Data Registry of ICDs. We then compared with actual medical device utilization. At significance level 0.05 and 80% power, 34% or fewer ICD models accrued sufficient utilization in practice to detect safety differences for rate ratios <1.15 and an average event rate of 12.6 events per 100 person-years. For average event rates of 3.9 and 12.6 events per 100 person-years, 30% and 50% of ICD models, respectively, accrued sufficient utilization for a rate ratio of 1.25, and 52% and 67% did for a rate ratio of 1.50. Because actual ICD utilization was not uniformly distributed across ICD models, the proportion of individuals receiving any ICD that accrued sufficient utilization in practice was 0% to 21%, 32% to 70%, and 67% to 84% for rate ratios of 1.05, 1.15, and 1.25, respectively, for the range of 3 average adverse event rates. Small safety differences among ICD generator models are unlikely to be detected through routine surveillance given current ICD utilization in practice, but large safety differences can be detected for most patients at anticipated average adverse event rates. Copyright © 2018 John Wiley & Sons, Ltd.
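
    The kind of sample-size estimate described above can be approximated with the textbook normal-approximation formula for comparing two Poisson event rates. This is a generic sketch of that calculation, not the authors' exact repeated-estimation procedure; the equal-exposure assumption and default alpha/power are mine:

```python
from math import log
from statistics import NormalDist

def person_years_per_group(base_rate, rate_ratio, alpha=0.05, power=0.80):
    """Approximate person-years of device exposure needed in EACH of two
    groups to detect a given safety-event rate ratio, using the normal
    approximation Var(log RR_hat) ~ 1/(lam1*T) + 1/(lam2*T) for two
    Poisson event processes observed over equal exposure T."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    lam1 = base_rate
    lam2 = base_rate * rate_ratio
    return (z_alpha + z_beta) ** 2 * (1 / lam1 + 1 / lam2) / log(rate_ratio) ** 2

# Baseline 12.6 events per 100 person-years (0.126 per person-year) and a
# rate ratio of 1.15: thousands of person-years per model are required
T = person_years_per_group(0.126, 1.15)
print(round(T))
```

    The required exposure scales roughly with 1/(log RR)^2, which is why small rate ratios such as 1.05 demand utilization far beyond what most individual device models accrue.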

  2. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    PubMed

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664 ± 619 rad/s). 
The current data indicate that existing wearable sensor technologies may substantially overestimate head impact events. Further, while the wearable sensors always estimated a head impact location, only 48% of the impacts were a result of direct contact to the head as characterized on video. Using wearable sensors and video to verify head impacts may decrease the inclusion of false-positive impacts during game activity in the analysis.

  3. Using immersive simulation for training first responders for mass casualty incidents.

    PubMed

    Wilkerson, William; Avstreih, Dan; Gruppen, Larry; Beier, Klaus-Peter; Woolliscroft, James

    2008-11-01

    A descriptive study was performed to better understand the possible utility of immersive virtual reality simulation for training first responders in a mass casualty event. Utilizing a virtual reality cave automatic virtual environment (CAVE) and high-fidelity human patient simulator (HPS), a group of experts modeled a football stadium that experienced a terrorist explosion during a football game. Avatars (virtual patients) were developed by expert consensus that demonstrated a spectrum of injuries ranging from death to minor lacerations. A group of paramedics was assessed by observation for decisions made and action taken. A critical action checklist was created and used for direct observation and viewing videotaped recordings. Of the 12 participants, only 35.7% identified the type of incident they encountered. None identified a secondary device that was easily visible. All participants were enthusiastic about the simulation and provided valuable comments and insights. Learner feedback and expert performance review suggests that immersive training in a virtual environment has the potential to be a powerful tool to train first responders for high-acuity, low-frequency events, such as a terrorist attack.

  4. Spatio-temporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach: Nearest-neighbor analysis of Oklahoma

    DOE PAGES

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    2017-06-24

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is in stark contrast to California (also known for induced seismicity), where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
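
    A common formulation of the nearest-neighbor approach is the Zaliapin-style space-time-magnitude proximity, eta = t * r^d * 10^(-b * m_parent), minimized over all earlier events. The sketch below uses illustrative defaults for the b-value and fractal dimension d (the "empirical parameters" the abstract refers to), not the study's fitted values:

```python
import math

def nn_proximity(parent, child, b=1.0, d=1.6):
    """Zaliapin-style space-time-magnitude proximity of a later event
    (child) to an earlier event (parent):
        eta = t * r**d * 10**(-b * m_parent)
    Events are (time_days, x_km, y_km, magnitude) tuples; smaller eta
    means a stronger candidate parent-child link."""
    t = child[0] - parent[0]
    if t <= 0:
        return math.inf                 # only earlier events can be parents
    r = math.hypot(child[1] - parent[1], child[2] - parent[2])
    return t * max(r, 1e-3) ** d * 10.0 ** (-b * parent[3])

def nearest_parent(catalog, j, b=1.0, d=1.6):
    """Index of event j's nearest neighbor (most likely parent) among all
    earlier events, or None for the first event in the catalog."""
    if j == 0:
        return None
    return min(range(j), key=lambda i: nn_proximity(catalog[i], catalog[j], b, d))

# Toy catalog: a mainshock, a nearby aftershock, and a distant later event
catalog = [(0.0, 0.0, 0.0, 4.5),
           (1.0, 2.0, 0.0, 2.0),
           (2.0, 150.0, 0.0, 3.0)]
print(nearest_parent(catalog, 1))   # 0: the aftershock links to the mainshock
```

    Thresholding the resulting eta values separates clustered (parent-child) pairs from background seismicity, which is how statistically correlated clusters are identified.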

  5. Spatiotemporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach

    NASA Astrophysics Data System (ADS)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    2017-07-01

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is in stark contrast to California (also known for induced seismicity), where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.

  7. Bruxism force detection by a piezoelectric film-based recording device in sleeping humans.

    PubMed

    Baba, Kazuyoshi; Clark, Glenn T; Watanabe, Tatsutomi; Ohyama, Takashi

    2003-01-01

    To test the reliability and utility of a force-based bruxism detection system (Intra-Splint Force Detector [ISFD]) for multiple-night recordings of forceful tooth-to-splint contacts in sleeping human subjects in their home environment. Bruxism-type forces, i.e., forceful tooth-to-splint contacts, during the night were recorded with this system in 12 subjects (6 bruxers and 6 controls) for 5 nights in their home environment; a laboratory-based nocturnal polysomnogram (NPSG) study was also performed on 1 of these subjects. All 12 subjects were able to use the device without substantial difficulty on a nightly basis. The bruxer group exhibited bruxism events of significantly longer duration than the control group (27 seconds/hour versus 7.4 seconds/hour, P < .01). The NPSG study performed on the 1 subject revealed that, when the masseter muscle electromyogram (EMG) was used as a "gold standard," the ISFD had a sensitivity of 0.89. The correlation coefficient between the duration of events detected by the ISFD and the EMG was also 0.89. These results suggest that the ISFD can be used easily by subjects and has reasonable reliability for detecting bruxism as reflected in forceful tooth-to-splint contacts during sleep.

  8. Sensitivity analysis of the FEMA HAZUS-MH MR4 Earthquake Model using seismic events affecting King County Washington

    NASA Astrophysics Data System (ADS)

    Neighbors, C.; Noriega, G. R.; Caras, Y.; Cochran, E. S.

    2010-12-01

    HAZUS-MH MR4 (HAZards U.S. Multi-Hazard Maintenance Release 4) is risk-estimation software developed by FEMA to calculate potential losses due to natural disasters. Federal, state, regional, and local governments use the HAZUS-MH Earthquake Model for earthquake risk mitigation, preparedness, response, and recovery planning (FEMA, 2003). In this study, we examine several parameters used by the HAZUS-MH Earthquake Model methodology to understand how modifying the user-defined settings affects ground motion analysis, seismic risk assessment and earthquake loss estimates. This analysis focuses on both shallow crustal and deep intraslab events in the American Pacific Northwest. Specifically, the historic 1949 Mw 6.8 Olympia, 1965 Mw 6.6 Seattle-Tacoma and 2001 Mw 6.8 Nisqually normal fault intraslab events and scenario large-magnitude Seattle reverse fault crustal events are modeled. Inputs analyzed include variations of deterministic event scenarios combined with hazard maps and USGS ShakeMaps. This approach utilizes the capacity of the HAZUS-MH Earthquake Model to define landslide- and liquefaction-susceptibility hazards with local groundwater level and slope stability information. Where ShakeMap inputs are not used, events are run in combination with NEHRP soil classifications to determine site amplification effects. The earthquake component of HAZUS-MH applies a series of empirical ground motion attenuation relationships developed from source parameters of both regional and global historical earthquakes to estimate strong ground motion. Ground motion and resulting ground failure due to earthquakes are then used to calculate direct physical damage for general building stock, essential facilities, and lifelines, including transportation and utility systems. Earthquake losses are expressed in structural, economic and social terms. 
Where available, comparisons between recorded earthquake losses and HAZUS-MH earthquake losses are used to determine how region coordinators can most effectively utilize their resources for earthquake risk mitigation. This study is being conducted in collaboration with King County, WA officials to determine the best model inputs necessary to generate robust HAZUS-MH models for the Pacific Northwest.

  9. Source Characterization of Underground Explosions from Combined Regional Moment Tensor and First-Motion Analysis

    DOE PAGES

    Chiang, Andrea; Dreger, Douglas S.; Ford, Sean R.; ...

    2014-07-08

    In this study, we investigate the 14 September 1988 U.S.–Soviet Joint Verification Experiment nuclear test at the Semipalatinsk test site in eastern Kazakhstan and two nuclear explosions conducted less than 10 years later at the Chinese Lop Nor test site. These events were very sparsely recorded by stations located within 1600 km, and in each case only three or four stations were available in the regional distance range. We have utilized a regional distance seismic waveform method, fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long-period waveforms and first-motion observations provides a unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We demonstrate through a series of jackknife tests and sensitivity analyses that the source type of the explosions is well constrained. One event, a 1996 Lop Nor shaft explosion, displays large Love waves and possibly reversed Rayleigh waves at one station, indicative of a large F-factor. We show the combination of long-period waveforms and P-wave first motions is able to discriminate this event as explosion-like and distinct from earthquakes and collapses. We further demonstrate the behavior of network sensitivity solutions for models of tectonic release and spall-based tensile damage over a range of F-factors and K-factors.

  11. Episodic deflation-inflation events at Kīlauea Volcano and implications for the shallow magma system: Chapter 11

    USGS Publications Warehouse

    Anderson, Kyle R.; Poland, Michael; Johnson, Jessica H.; Miklius, Asta; Carey, Rebecca; Cayol, Valérie; Poland, Michael P.; Weis, Dominique

    2015-01-01

    Episodic variations in magma pressures and flow rates at Kīlauea Volcano, defined by a characteristic temporal evolution and termed deflation-inflation (DI) events, have been observed since at least the 1990s. DI events consist of transient, days-long deflations and subsequent reinflations of the summit region, accompanied since 2008 by fluctuations in the surface height of Kīlauea's summit lava lake. After a delay of minutes to hours, these events also often appear along the volcano's East Rift Zone in ground deformation data and as temporary reductions in eruption rate (sometimes followed by brief surges). Notable pauses in DI activity have preceded many eruptive events at Kīlauea. We analyzed more than 500 DI events recorded by borehole tiltmeters at the summit during 2000–2013. Inverse modeling suggests that DI-related ground deformation at the summit is generated by pressure transients in a shallow magma reservoir located beneath the east margin of Halema‘uma‘u Crater and that this reservoir has remained remarkably stable for more than a decade. Utilizing tilt data and variation in the level of the summit lava lake during a large DI event, we estimate a reservoir volume of approximately 1 km3 (0.2–5.5 km3 at 95% confidence).

  12. [Validation of an adverse event reporting system in primary care].

    PubMed

    de Lourdes Rojas-Armadillo, María; Jiménez-Báez, María Valeria; Chávez-Hernández, María Margarita; González-Fondón, Araceli

    2016-01-01

    Patient safety is a priority issue in health systems because of the costs of harm, institutional weakening, loss of credibility, and the frustration of those who committed an error that resulted in an adverse event. There is no standardized instrument for recording, reporting, and analyzing sentinel or adverse events (AE) in primary care. Our aim was to design and validate a surveillance system for recording sentinel events, adverse events and near-miss incidents in primary care. We reviewed systems for recording and reporting adverse events in primary care. We then proposed an instrument to record these events, and to register faults in structure and process, in primary health care units of the Instituto Mexicano del Seguro Social. We presented the VENCER-MF format to 35 subjects. Of these, 100% identified a failure in the care process, 90% recorded a sentinel event, 85% identified the cause of this event, and 75% suggested measures for avoiding the recurrence of adverse events. Internal consistency was acceptable (Cronbach's alpha = 0.6, p = 0.03). The VENCER-MF instrument has good consistency for the identification of adverse events.

  13. Earthquake Monitoring with the MyShake Global Smartphone Seismic Network

    NASA Astrophysics Data System (ADS)

    Inbal, A.; Kong, Q.; Allen, R. M.; Savran, W. H.

    2017-12-01

    Smartphone arrays have the potential to significantly improve seismic monitoring in sparsely instrumented urban areas. This approach benefits from the dense spatial coverage of users, as well as from the communication and computational capabilities built into smartphones, which facilitate big seismic data transfer and analysis. Advantages in data acquisition with smartphones trade off against factors such as the low-quality sensors installed in phones, high noise levels, and strong network heterogeneity, all of which limit effective seismic monitoring. Here we utilize network and array-processing schemes to assess event detectability with the MyShake global smartphone network. We examine the benefits of using this network in either triggered or continuous modes of operation. A global database of ground motions measured on stationary phones triggered by M2-6 events is used to establish detection probabilities. We find that the probability of detecting an M=3 event with a single phone located <10 km from the epicenter exceeds 70%. Due to the sensor's self-noise, smaller magnitude events at short epicentral distances are very difficult to detect. To increase the signal-to-noise ratio, we employ array back-projection techniques on continuous data recorded by thousands of phones. In this class of methods, the array is used as a spatial filter that suppresses signals emitted from shallow noise sources. Filtered traces are stacked to further enhance seismic signals from deep sources. We benchmark our technique against traditional location algorithms using recordings from California, a region with a large MyShake user base. We find that locations derived from back-projection images of M 3 events recorded by >20 nearby phones closely match the regional catalog locations. We use simulated broadband seismic data to examine how location uncertainties vary with user distribution and noise levels. 
To this end, we have developed an empirical noise model for the metropolitan Los-Angeles (LA) area. We find that densities larger than 100 stationary phones/km2 are required to accurately locate M 2 events in the LA basin. Given the projected MyShake user distribution, that condition may be met within the next few years.
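    The back-projection step described above amounts to shift-and-stack: delay each phone's trace by the predicted travel time from a trial source, then average, so coherent arrivals reinforce while noise cancels. A simplified sketch on synthetic data, not the MyShake implementation:

```python
import numpy as np

def shift_and_stack(traces, delays_samples):
    """Align traces by per-station travel-time delays (in samples) and stack.
    Coherent arrivals from the trial source add; incoherent noise averages down."""
    aligned = np.array([np.roll(tr, -d) for tr, d in zip(traces, delays_samples)])
    return aligned.mean(axis=0)

# Synthetic demo: an impulse arriving at each of 4 stations with known delays
delays = [0, 3, 7, 2]
traces = np.zeros((4, 100))
for i, d in enumerate(delays):
    traces[i, 50 + d] = 1.0          # arrival at sample 50 + delay

stack = shift_and_stack(traces, delays)
print(stack[50])                      # -> 1.0 (fully coherent stack)
```

In practice the delays are recomputed for a grid of trial source locations, and the location maximizing stack energy is taken as the back-projection image peak.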

  14. Developing Fluorescence Sensor Systems for Early Detection of Nitrification Events in Chloraminated Drinking Water Distribution Systems

    EPA Science Inventory

    Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events ...

  15. Using weather data to determine dry and wet periods relative to ethnographic records

    NASA Astrophysics Data System (ADS)

    Felzer, B. S.; Jiang, M.; Cheng, R.; Ember, C. R.

    2017-12-01

    Ethnographers record flood or drought events that affect a society's food supply and can be interpreted in terms of a society's ability to adapt to extreme events. Using daily weather station data from the Global Historical Climatology Network for wet events, and monthly gridded climatic data from the Climatic Research Unit for drought events, we determine whether it is possible to relate these measured data to the ethnographic records. We explore several drought and wetness indices based on temperature and precipitation, as well as the Colwell method to determine the predictability, seasonality, and variability of these extreme indices. Initial results indicate that while it is possible to capture the events recorded in the ethnographic records, there are many more "false" captures of events that are not recorded in these records. Although extreme precipitation is a poor indicator of floods due to antecedent moisture conditions, even using streamflow for selected sites produces false captures. Drought indices related to actual food supply, as measured by crop yield, in only half the cases, and then only to minimum crop yield. Further mismatches between extreme precipitation and drought indices and the ethnographic records may reflect the fact that only extreme events that affect food supply are recorded in the ethnographic records, or that not all events are recorded by the ethnographers. We will present new results on how predictability measures relate to the ethnographically recorded disasters. Despite the highlighted technical challenges, our results provide a historical perspective linking environmental stressors with socio-economic impacts, which, in turn, will underpin current efforts at risk assessment in a changing environment.
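    The Colwell method referred to above partitions predictability (P) into constancy (C) and contingency (M) using entropies of a seasons-by-states frequency table. A compact sketch of those formulas, with an illustrative input matrix:

```python
import numpy as np

def colwell(N):
    """Colwell's indices from a (t seasons x s states) count matrix N.
    Returns (P, C, M) with P = C + M."""
    N = np.asarray(N, dtype=float)
    s = N.shape[1]
    Z = N.sum()
    def H(counts):                       # Shannon entropy of a count vector
        p = counts[counts > 0] / Z
        return float(-(p * np.log(p)).sum())
    HX = H(N.sum(axis=1))                # uncertainty over seasons
    HY = H(N.sum(axis=0))                # uncertainty over states
    HXY = H(N.ravel())                   # joint uncertainty
    C = 1 - HY / np.log(s)               # constancy
    M = (HX + HY - HXY) / np.log(s)      # contingency (seasonality)
    P = 1 - (HXY - HX) / np.log(s)       # predictability
    return P, C, M

# Each season always falls in its own state: fully predictable, zero constancy
P, C, M = colwell([[5, 0], [0, 5]])
print(round(P, 3), round(C, 3), round(M, 3))  # -> 1.0 0.0 1.0
```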

  16. Recording event-related activity under hostile magnetic resonance environment: Is multimodal EEG/ERP-MRI recording possible?

    PubMed

    Karakaş, H M; Karakaş, S; Ozkan Ceylan, A; Tali, E T

    2009-08-01

    Event-related potentials (ERPs) have high temporal resolution but insufficient spatial resolution; the converse is true for functional imaging techniques. The purpose of this study was to test the utility of a multimodal EEG/ERP-MRI technique that combines electroencephalography (EEG) and magnetic resonance imaging (MRI) to achieve simultaneously high temporal and spatial resolution. The sample consisted of 32 healthy young adults of both sexes. Auditory stimuli were delivered according to the active and passive oddball paradigms in the MRI environment (MRI-e) and in the standard conditions of the electrophysiology laboratory environment (Lab-e). Tasks were presented in a fixed order. Participants were exposed to the recording environments in a counterbalanced order. EEG data were preprocessed for MRI-related artifacts. Source localization was performed using a current density reconstruction technique. The ERP waveforms for the MRI-e were morphologically similar to those for the Lab-e. The effects of recording environment, experimental paradigm, and electrode location were analyzed using a 2x2x3 analysis of variance for repeated measures. The ERP components in the two environments showed parametric variations and characteristic topographical distributions. The calculated sources were in line with the related literature. The findings indicated effortful cognitive processing in the MRI-e. The study provides preliminary data on the feasibility of the multimodal EEG/ERP-MRI technique. It also indicates lines of research to be pursued for decisive testing of this technique and its implementation in clinical practice.

  17. A novel automated detection system for swallowing sounds during eating and speech under everyday conditions.

    PubMed

    Fukuike, C; Kodama, N; Manda, Y; Hashimoto, Y; Sugimoto, K; Hirata, A; Pan, Q; Maeda, N; Minagi, S

    2015-05-01

    The wave analysis of swallowing sounds has been receiving attention because the recording process is easy and non-invasive. However, until now an expert has been needed to visually examine the entire recorded wave to distinguish swallowing from other sounds. The purpose of this study was to establish a methodology to automatically distinguish the sound of swallowing from sound data recorded during a meal in the presence of everyday ambient sound. Seven healthy participants (mean age: 26.7 ± 1.3 years) took part in this study. A laryngeal microphone and a condenser microphone attached to the nostril were used for simultaneous recording. Recording took place while participants were taking a meal and talking with a conversational partner. Participants were instructed to step on a foot pedal trigger switch when they swallowed, representing self-enumeration of swallowing, and also to perform six additional noise-making tasks during the meal in a randomised manner. The automated analysis system correctly detected 342 out of the 352 self-enumerated swallowing events (sensitivity: 97.2%) and correctly rejected 479 out of the 503 swallow-like ("semblable") wave periods (specificity: 95.2%). In this study, the automated detection system for swallowing sounds using a nostril microphone was able to detect the swallowing event with high sensitivity and specificity even under the conditions of daily life, thus showing potential utility in the diagnosis or screening of dysphagic patients in future studies. © 2014 John Wiley & Sons Ltd.
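    The reported performance figures follow directly from the stated counts; a quick check, assuming sensitivity is detected/actual swallows and specificity is the fraction of swallow-like non-swallow periods correctly rejected:

```python
# Recompute the reported detector metrics from the stated counts.
detected_swallows, actual_swallows = 342, 352
rejected_nonswallows, actual_nonswallows = 479, 503

sensitivity = detected_swallows / actual_swallows          # TP / (TP + FN)
specificity = rejected_nonswallows / actual_nonswallows    # TN / (TN + FP)
print(f"{sensitivity:.1%}, {specificity:.1%}")             # -> 97.2%, 95.2%
```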

  18. Analysis of organic vapors with laser induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nozari, Hadi; Tavassoli, Seyed Hassan; Rezaei, Fatemeh, E-mail: fatemehrezaei@kntu.ac.ir

    2015-09-15

    In this paper, laser induced breakdown spectroscopy (LIBS) is utilized in the study of acetone, ethanol, methanol, cyclohexane, and nonane vapors. Carbon, hydrogen, oxygen, and nitrogen atomic emission spectra have been recorded following laser-induced breakdown of the organic vapors that are mixed with air inside a quartz chamber at atmospheric pressure. The plasma is generated with focused, Q-switched Nd:YAG radiation at the wavelength of 1064 nm. The effects of ignition and vapor pressure are discussed in view of the appearance of the emission spectra. The recorded spectra are proportional to the vapor pressure in air. The hydrogen and oxygen contributions diminish gradually with consecutive laser-plasma events without gas flow. The results show that LIBS can be used to characterize organic vapors.

  19. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development of real-time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor is presented. A linear predictor algorithm is utilized for detection of the heart tone event, and additional processing derives the heart rate. The linear predictor is adaptively 'trained', in a least-mean-square-error sense, on generic fetal heart tones recorded from patients. A real-time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. The comparative data provide favorable indications of the feasibility of the acoustic monitor for clinical use.
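    Adaptive training of a linear predictor in a least-mean-square-error sense is the classic LMS algorithm: predict each sample from recent past samples and nudge the filter weights along the error gradient. A self-contained sketch on a synthetic tone (the order and step size here are illustrative, not the paper's values):

```python
import numpy as np

def lms_predictor(x, order=8, mu=0.01):
    """LMS adaptive linear predictor: returns the per-sample prediction error.
    A tonal (predictable) signal yields small error; transient events stand out."""
    w = np.zeros(order)
    err = np.zeros(len(x))
    for n in range(order, len(x)):
        past = x[n - order:n][::-1]   # most recent sample first
        e = x[n] - w @ past           # prediction error
        w += 2 * mu * e * past        # gradient-descent weight update
        err[n] = e
    return err

# A pure sinusoid is almost perfectly predictable once the filter converges
x = np.sin(0.3 * np.arange(2000))
err = lms_predictor(x)
print(np.mean(err[1500:] ** 2) < 0.01 * np.mean(x ** 2))  # -> True
```

Detection then amounts to flagging samples where the prediction error energy rises above a threshold, since a trained predictor tracks the quasi-periodic heart tones but not novel events.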

  20. Resource utilization and charges of patients with and without diagnosed venous thromboembolism during primary hospitalization and after elective inpatient surgery: a retrospective study.

    PubMed

    Sepassi, Aryana; Chingcuanco, Francine; Gordon, Ronald; Meier, Angela; Divino, Victoria; DeKoven, Mitch; Ben-Joseph, Rami

    2018-06-01

    To assess incremental charges of patients experiencing venous thromboembolisms (VTE) across various types of elective inpatient surgical procedures with administration of general anesthesia in the US. The authors performed a retrospective study utilizing data from a nationwide hospital operational records database from July 2014 through June 2015 to compare a group of inpatients experiencing a VTE event post-operatively to a propensity score matched group of inpatients who did not experience a VTE. Patients included in the analysis had a hospital admission for an elective inpatient surgical procedure with the use of general anesthesia. Procedures of the heart, brain, lungs, and obstetrical procedures were excluded, as these procedures often require a scheduled ICU stay post-operatively. Outcomes examined included VTE events during hospitalization, length of stay, unscheduled ICU transfers, number of days spent in the ICU if transferred, 3- and 30-day re-admissions, and total hospital charges incurred. The study included 17,727 patients undergoing elective inpatient surgical procedures. Of these, 36 patients who experienced a VTE event were matched to 108 patients who did not. VTE events occurred in 0.2% of the study population, with most events occurring for patients undergoing total knee replacement. VTE patients had a mean total hospital charge of $60,814 vs $48,325 for non-VTE patients, resulting in a mean incremental charge of $11,979 (p < .05). Compared to non-VTE patients, VTE patients had longer length of stay (5.9 days vs 3.7 days, p < .001), experienced a higher rate of 3-day re-admissions (3 vs 0 patients) and 30-day re-admissions (7 vs 2 patients). Patients undergoing elective inpatient surgical procedures with general anesthesia who had a VTE event during their primary hospitalization had a significantly longer length of stay and significantly higher total hospital charges than comparable patients without a VTE event.
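    The 36-to-108 matched design is a 1:3 propensity-score match. A greedy nearest-neighbour matcher on precomputed scores can be sketched as follows (hypothetical scores and caliper; the study's exact matching procedure is not described here):

```python
import numpy as np

def greedy_match(ps_treated, ps_control, k=3, caliper=0.05):
    """Match each treated unit to its k nearest unused controls by
    propensity score, discarding candidates beyond the caliper."""
    ps_control = np.asarray(ps_control)
    used, matches = set(), {}
    for i, p in enumerate(ps_treated):
        d = np.abs(ps_control - p)
        order = [int(j) for j in np.argsort(d)
                 if int(j) not in used and d[j] <= caliper]
        matches[i] = order[:k]
        used.update(order[:k])
    return matches

# One treated unit, four controls: only the two within-caliper controls match
print(greedy_match([0.50], [0.10, 0.49, 0.52, 0.90], k=2))  # -> {0: [1, 2]}
```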

  1. A Roadmap for Recovery/Decontamination Plan for Critical Infrastructure after CBRN Event Involving Drinking Water Utilities: Scoping Study

    DTIC Science & Technology

    2014-05-01

    A Roadmap for Recovery/Decontamination Plan for Critical Infrastructure after CBRN Event Involving Drinking Water Utilities: Scoping Study... Drinking Water Utilities was supported by the Canadian Safety and Security Program (CSSP) which is led by Defence Research and Development Canada’s Centre...after CBRN Event Involving Drinking Water Utilities Scoping Study Prepared by: Vladimir Blinov Konstantin Volchek Emergencies Science and

  2. Engagement Skills Trainer: The Commander’s Perspective

    DTIC Science & Technology

    2017-06-09

    recommends using EST as a record of fire for a sustainment training event. This record of fire event can only occur once per year and after a live fire...mandatory part of marksmanship training. The author also recommends using EST as a record of fire for a sustainment training event. This record of fire... event can only occur once per year and after a live fire qualification. v ACKNOWLEDGMENTS The author would like to thank the following persons

  3. Retroperitoneal Hemorrhage After Percutaneous Coronary Intervention: Incidence, Determinants, and Outcomes as Recorded by the British Cardiovascular Intervention Society.

    PubMed

    Kwok, Chun Shing; Kontopantelis, Evangelos; Kinnaird, Tim; Potts, Jessica; Rashid, Muhammad; Shoaib, Ahmad; Nolan, James; Bagur, Rodrigo; de Belder, Mark A; Ludman, Peter; Mamas, Mamas A

    2018-02-01

    Retroperitoneal hemorrhage (RH) is a rare bleeding complication of percutaneous coronary intervention, which can result as a consequence of femoral access or can occur spontaneously. This study aims to evaluate temporal changes in RH, its predictors, and clinical outcomes in a national cohort of patients undergoing percutaneous coronary intervention in the United Kingdom. We analyzed RH events in patients who underwent percutaneous coronary intervention between 2007 and 2014. Multiple logistic regression models were used to identify factors associated with RH and to quantify the association between RH and 30-day mortality and major adverse cardiovascular events. A total of 511,106 participants were included, and 291 in-hospital RH events were recorded (0.06%). Overall, rates of RH declined from 0.09% to 0.03% between 2007 and 2014. The strongest independent predictors of RH events were femoral access (odds ratio [OR], 19.66; 95% confidence interval [CI], 11.22-34.43), glycoprotein IIb/IIIa inhibitor use (OR, 2.63; 95% CI, 1.99-3.47), and warfarin use (OR, 2.53; 95% CI, 1.07-5.99). RH was associated with a significant increase in 30-day mortality (OR, 3.59; 95% CI, 2.19-5.90) and in-hospital major adverse cardiovascular events (OR, 5.76; 95% CI, 3.71-8.95). A legacy effect was not observed; patients with RH who survived 30 days did not have higher 1-year mortality compared with those without this complication (hazard ratio, 0.97; 95% CI, 0.49-1.91). Our results suggest that RH is a rare event that is declining in the United Kingdom, related to the transition to transradial access site utilization, but remains a clinically important event associated with increased 30-day mortality, albeit with no long-term legacy effect. © 2018 American Heart Association, Inc.
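    For a single binary predictor, odds ratios of the kind reported (with Wald 95% confidence intervals) can be computed directly from a 2x2 table; the study's multivariable estimates come from logistic regression, but the arithmetic on the log-odds scale is the same. A minimal sketch with made-up counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald CI from a 2x2 table:
    a = exposed with event, b = exposed without,
    c = unexposed with event, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only -- not the registry's data
or_, lo, hi = odds_ratio_ci(40, 9960, 10, 9990)
print(round(or_, 2))  # -> 4.01
```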

  4. A comparison of selected models for estimating cable icing

    NASA Astrophysics Data System (ADS)

    McComber, Pierre; Druez, Jacques; Laflamme, Jean

    In many cold climate countries, it is becoming increasingly important to monitor transmission line icing. With advance knowledge of localized icing-overload danger, electric utilities can take measures in time to prevent generalized failure of the power transmission network. Recently in Canada, a study was made to compare a few icing models that estimate ice loads for freezing rain events from meteorological data. The models tested used only standard meteorological parameters, i.e. wind speed and direction, temperature, and precipitation rate. That study showed that standard meteorological parameters can achieve only very limited accuracy, especially for longer icing events. However, with the help of an additional instrument monitoring the icing rate intensity, a significant improvement in model prediction might be achieved. The icing rate meter (IRM), which counts icing and de-icing cycles per unit time on a standard probe, can be used to estimate the icing intensity. A cable icing estimate is then made by taking into consideration the accretion size, temperature, wind speed and direction, and precipitation rate. In this paper, a comparison is made between the predictions of two previously tested models (one obtained and the other reconstructed from their description in the public literature) and of a model based on the icing rate meter readings. The models are tested against nineteen events recorded on an icing test line at Mt. Valin, Canada, during the winter season 1991-1992. These events are mostly rime resulting from in-cloud icing; however, freezing rain and wet snow events were also recorded. Results indicate that a significant improvement in the estimation is attained by using the icing rate meter data together with the other standard meteorological parameters.

  5. Leveraging Social Norms to Improve Leak Resolution Outcomes Across Meter Classes:

    NASA Astrophysics Data System (ADS)

    Holleran, W.

    2016-12-01

    Over the past decade, utilities, governments, businesses, and nonprofits have come to realize that more than just financial considerations and information drive behavior. Social and psychological factors also play a significant role in shaping consumers' decisions and behaviors around resource use. Stakeholders have consequently turned their interest to behavioral science, a multidisciplinary field that draws from psychology, sociology, public health, and behavioral economics to explain the complex mechanisms that shape human behavior. When used strategically, behavioral science holds the potential to drive down resource use, drive up profits, and generate measurable gains in conservation and efficiency. WaterSmart will present on how the water sector can employ behavioral science to nudge residential rate-payers to use water more efficiently and help them save money. Utilities can use behavioral science to influence people's reaction to leaks. 5% of Single Family Residential (SFR) metered water use can be attributed to leaks. The share is potentially even higher for MultiFamily (MF) and Commercial accounts, where leakage can be lost in the noise of daily consumption. Existing leak detection algorithms in the market are not sophisticated enough to detect leaks for a MF or Commercial property. Leveraging data from utilities on known leak events at MF and Commercial buildings allowed WaterSmart to train a machine learning model to identify key features in the load shape and accurately detect these types of water use events. The output of the model is a leak amount and confidence level for each irregular usage event. The model also incorporates recorded feedback from users on the type of leak event and the accuracy of the alert. When WaterSmart combines this data model with social norms messaging, we've been able to improve water demand management for MF and Commercial properties. 
Experiences from leak detection and resolution in the SFR space will also be discussed.

  6. The threshold between storm overwash and inundation and the implication to paleo-storm records and climate signatures.

    NASA Astrophysics Data System (ADS)

    Smith, C. G.; Long, J.; Osterman, L. E.; Plant, N. G.; Marot, M. E.; Bernier, J.; Flocks, J. G.; Adams, C. S.

    2014-12-01

    In modern coastal systems, the sensitivity of a coastal site to erosion or deposition during storm conditions depends largely on the geomorphic configuration (e.g. dune or beach height and width) and the storm-induced oceanographic processes (surge and waves). Depending on the magnitude of these variables, coastal systems may be eroded, overwashed, breached, and/or inundated during the storm. To date, there has been no attempt to evaluate how these observable modern differences in storm-impact regimes might be utilized to interpret paleo-storm intensities and frequencies. Time-series of sediment texture, radioisotopic, and foraminiferal data from back-barrier environments along the Chandeleur Islands (Louisiana, USA) document the emplacement of a storm event deposit from Hurricane Isaac, and we use this event to test paleo-storm intensity reconstruction methods. Water levels reconstructed for the event layer using an advection (grain-size) settling model are 2-3 times greater than those measured during the storm. The over-estimation is linked to the reconstruction model's assumption concerning sediment transport during storms (i.e., overwash only), whereas actual processes included inundation as well. These contrasts may result in misidentification (i.e., presence/absence) and/or misclassification (i.e., intensity) of storms in the geologic record (e.g., low geomorphic conditions and high water levels), which would in turn affect the ability to link storm frequency or intensity to climatic drivers.

  7. Age-specific vibrissae growth rates: a tool for determining the timing of ecologically important events in Steller sea lions

    USGS Publications Warehouse

    Rea, L.D.; Christ, A.M.; Hayden, A.B.; Stegall, V.K.; Farley, S.D.; Stricker, Craig A.; Mellish, J.E.; Maniscalco, John M.; Waite, J.N.; Burkanov, V.N.; Pitcher, K.W.

    2015-01-01

    Steller sea lions (SSL; Eumetopias jubatus) grow their vibrissae continually, providing a multiyear record suitable for ecological and physiological studies based on stable isotopes. An accurate age-specific vibrissae growth rate is essential for registering a chronology along the length of the record and for interpreting the timing of ecologically important events. We utilized four methods to estimate the growth rate of vibrissae in fetal, rookery pup, young-of-the-year (YOY), yearling, subadult, and adult SSL. The majority of vibrissae were collected from SSL live-captured in Alaska and Russia between 2000 and 2013 (n = 1,115); however, vibrissae were also collected from six adult SSL found dead on haul-outs and rookeries during field excursions to increase the sample size of this underrepresented age group. Growth rates of vibrissae were generally slower in adult (0.44 ± 0.15 cm/mo) and subadult (0.61 ± 0.10 cm/mo) SSL than in YOY (0.87 ± 0.28 cm/mo) and fetal (0.73 ± 0.05 cm/mo) animals, but there was high individual variability in these growth rates within each age group. Some variability in vibrissae growth rates was attributed to the somatic growth rate of YOY sea lions between capture events (P = 0.014, r2 = 0.206, n = 29).

  8. Storm Event Suspended Sediment-Discharge Hysteresis and Controls in Agricultural Watersheds: Implications for Watershed Scale Sediment Management.

    PubMed

    Sherriff, Sophie C; Rowan, John S; Fenton, Owen; Jordan, Philip; Melland, Alice R; Mellander, Per-Erik; hUallacháin, Daire Ó

    2016-02-16

    Within agricultural watersheds, suspended sediment-discharge hysteresis during storm events is commonly used to indicate dominant sediment sources and pathways. However, limited availability of high-resolution data, reliance on qualitative metrics, short record lengths, and the lack of simultaneous multiwatershed analyses have limited the efficacy of hysteresis as a sediment management tool. This two-year study utilizes a quantitative hysteresis index derived from high-resolution suspended sediment and discharge data to assess fluctuations in sediment source location, delivery mechanisms, and export efficiency in three intensively farmed watersheds during events over time. Flow-weighted event sediment export was further examined using multivariate techniques to delineate rainfall, stream hydrology, and antecedent moisture controls on sediment origins. Watersheds with low permeability (moderately or poorly drained soils) and good surface hydrological connectivity had contrasting hysteresis driven by source location (hillslope versus channel bank). The well-drained watershed with reduced connectivity exported less sediment, but when watershed connectivity was established, the largest event sediment load of all watersheds occurred. Event sediment export was elevated in the arable watersheds when low groundcover was coupled with high connectivity, whereas in the grassland watershed export was attributed to wetter weather only. Hysteresis analysis successfully indicated contrasting seasonality, connectivity, and source availability, and is a useful tool for identifying watershed-specific sediment management practices.
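    A quantitative hysteresis index of the kind described compares normalized concentration on the rising versus falling limb at matched discharge levels; positive values indicate clockwise loops (typically proximal or channel-bank sources) and negative values anticlockwise loops (distal or hillslope sources). A simplified sketch, assuming each limb is monotonic in discharge:

```python
import numpy as np

def hysteresis_index(q, c, n_levels=20):
    """Mean difference between normalized SSC on the rising and falling
    limbs, evaluated at common normalized-discharge levels.
    Positive -> clockwise loop; negative -> anticlockwise."""
    qn = (q - q.min()) / (q.max() - q.min())
    cn = (c - c.min()) / (c.max() - c.min())
    ipk = int(np.argmax(qn))
    levels = np.linspace(0.05, 0.95, n_levels)
    c_rise = np.interp(levels, qn[:ipk + 1], cn[:ipk + 1])
    c_fall = np.interp(levels, qn[ipk:][::-1], cn[ipk:][::-1])
    return float(np.mean(c_rise - c_fall))

# Synthetic clockwise event: concentration peaks before discharge does
t = np.linspace(0.0, 1.0, 101)
q = 1.0 - 2.0 * np.abs(t - 0.5)                      # discharge peaks at t = 0.5
c = np.interp(t, [0.0, 0.3, 1.0], [0.0, 1.0, 0.0])   # SSC peaks at t = 0.3
print(hysteresis_index(q, c) > 0)                    # -> True
```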

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgardt, D.R.; Carter, S.; Maxson, M.

    The objective of this project is to design and develop an Intelligent Event Identification System, or ISEIS, which will be a prototype for routine event identification of small explosions and earthquakes and to serve as a tool for discrimination research. The first part of this study gives an overview of the system design and the results of a preliminary evaluation of the system on events in Scandinavia and the Soviet Union. The system was designed to be highly modular to allow the easy incorporation of new discriminants and/or discrimination processes. Because the main objective of the system is the identification of small events, most of the initial ISEIS prototype discriminants utilize regional seismic data recorded by the regional arrays, NORESS and ARCESS. However, ISEIS can easily process other regional array data (e.g., from GERESS and FINESA), as well as data from three-component single stations, as more of this data becomes available. The second part of this study is entitled Intelligent Event Identification System: User's Manual, and gives a detailed description of all the processing interfaces of ISEIS. The third part of this study is entitled Intelligent Event Identification System: Software Maintenance Manual, which describes the ISEIS software from the programmer's perspective and provides information for maintenance and modification of the software modules in the system.

  10. Music Genre as a Predictor of Resource Utilization at Outdoor Music Concerts.

    PubMed

    Westrol, Michael S; Koneru, Susmith; McIntyre, Norah; Caruso, Andrew T; Arshad, Faizan H; Merlin, Mark A

    2017-06-01

    The aim of this study was to examine the various modern music genres and their effect on the utilization of medical resources, with analysis and adjustment for potential confounders. A retrospective review of patient logs from an open-air, contemporary amphitheater over a period of 10 years was performed. Variables recorded by the medical personnel for each concert included the attendance, a description of the weather, and a patient log in which the nature and outcome of each presentation were recorded. The primary outcomes were associations of genres with the medical usage rate (MUR). Secondary outcomes investigated were the association of confounders and the influences on the level of care provided, the transport rate, and the nature of the medical complaint. A total of 2,399,864 people attended the 403 concerts, of whom 4,546 presented as patients to venue Emergency Medical Services (EMS), an average of 11.4 patients per concert (annual range 7.1-17.4). Of the potential confounders, only a heat index ≥90°F (32.2°C) and whether the event was a festival were significant (P=.027 and .001, respectively). After adjustment, the genres with significantly increased MUR, in decreasing order, were: alternative rock, hip-hop/rap, modern rock, heavy metal/hard rock, and country music (P<.05). Medical complaints were significantly increased with alternative rock or when the heat index was ≥90°F (32.2°C; P<.001). Traumatic injuries were most significantly increased with alternative rock (P<.001). Alcohol or drug intoxication was significantly more common with hip-hop/rap (P<.001). Transport rates were highest among alcohol/drug intoxicated patients (P<.001), lowest with traumatic injuries (P=.004), and negatively affected by heat index ≥90°F (32.2°C; P=.008), alternative rock (P=.017), and country music (P=.033). Alternative rock, hip-hop/rap, modern rock, heavy metal/hard rock, and country music concerts had higher levels of medical resource utilization. 
High heat indices and music festivals also increase the MUR. This information can assist event planners with preparation and resource utilization. Future research should focus on prospective validation of the regression equation. Westrol MS, Koneru S, McIntyre N, Caruso AT, Arshad FH, Merlin MA. Music genre as a predictor of resource utilization at outdoor music concerts. Prehosp Disaster Med. 2017;32(3):289-296.

  11. Is detection of adverse events affected by record review methodology? an evaluation of the "Harvard Medical Practice Study" method and the "Global Trigger Tool".

    PubMed

    Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin

    2013-04-15

    There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost-efficient, and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process on a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by the registered nurses. Records containing a potential adverse event were forwarded to the physicians for review in stage 2. The physicians made an independent review regarding, for example, healthcare causation, preventability, and severity. In the third review stage, all adverse events found with the two methods together were compared and all discrepancies remaining after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records, compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively. 
More adverse events were identified using the "Harvard Medical Practice Study" method than using the "Global Trigger Tool". Differences in review methodology, perception of less severe adverse events and context knowledge may explain the observed difference between two expert review teams in the detection of adverse events.
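The detection percentages quoted above are simple proportions with confidence intervals. A minimal sketch of that arithmetic, using the abstract's counts (155 of 160 events for the "Harvard Medical Practice Study" method) and a normal-approximation interval, which may differ slightly from the exact interval the authors report:

```python
# Proportion and approximate 95% CI for an event-detection rate.
# Counts are taken from the abstract; the CI uses a normal approximation,
# not necessarily the method used in the study.

def proportion_with_ci(k, n, z=1.96):
    """Return the proportion k/n and a normal-approximation confidence interval."""
    p = k / n
    half = z * (p * (1 - p) / n) ** 0.5
    return p, (p - half, p + half)

p, (lo, hi) = proportion_with_ci(155, 160)
print(f"{100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f})")  # 96.9% (95% CI 94.2-99.6)
```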

  12. Borehole Strainmeters and the monitoring of the North Anatolian Fault in the Marmara Sea.

    NASA Astrophysics Data System (ADS)

    Johnson, W.; Mencin, D.; Bilham, R. G.; Gottlieb, M. H.; Van Boskirk, E.; Hodgkinson, K. M.; Mattioli, G. S.; Acarel, D.; Bulut, F.; Bohnhoff, M.; Ergintav, S.; Bal, O.; Ozener, H.

    2016-12-01

Twice in the past 1000 years a sequence of large earthquakes has propagated from east to west along the North Anatolian fault (NAF) in Turkey towards Istanbul, with the final earthquake in the sequence destroying the city. This occurred most recently in 1509. The population of greater Istanbul is 20 million and the next large earthquake of the current sequence is considered imminent. The most likely location for a major earthquake on the NAF is considered the Marmara-Sea/Princes-Island segment south and southeast of Istanbul [Bohnhoff et al., 2013]. Insights into the nucleation and future behavior of this segment of the NAF are anticipated from measuring deformation near the fault, and in particular possible aseismic slip processes on the fault that may precede as well as accompany any future rupture. Aseismic slip processes near the western end of the Izmit rupture, near where it passes offshore beneath the Sea of Marmara near Izmit, have been successfully monitored using InSAR, GPS, and creepmeters. A 1 mm amplitude, 24 h creep event was recorded by our creepmeter near Izmit in 2015. These instruments and methods are of limited utility in monitoring the submarine portion of the NAF. Data from numerous borehole strainmeters (BSM) along the San Andreas Fault, including those that were installed and maintained as part of the EarthScope Plate Boundary Observatory (PBO), demonstrate that the characteristics of creep propagation events with sub-cm slip amplitudes can be quantified for slip events at 10 km source-to-sensor distances. Such distances are comparable to those between the mainland and the submarine NAF, with some islands allowing installations within 3 km of the fault. 
In a collaborative program (GeoGONAF) between the National Science Foundation, GeoForschungsZentrum, Turkish Disaster and Emergency Management Authority, and the Kandilli Observatory, we installed an array of six PBO-type BSM systems, which include strainmeters and seismometers, around the eastern end of the Sea of Marmara. The sensors are installed at depths of 100 m and record at a rate of 100 Hz. During the installation phase (2014-16), the partially complete array successfully recorded seiches in the Sea of Marmara and a number of teleseismic events. The ESNK station, located west of Yalova, is recording signals indicative of creep events.

  13. Using Derivative Contracts to Mitigate Water Utility Financial Risks

    NASA Astrophysics Data System (ADS)

    Characklis, G. W.; Zeff, H.

    2012-12-01

    As developing new supply capacity has become increasingly expensive and difficult to permit, utilities have become more reliant on temporary demand management programs, such as outdoor water use restrictions, for ensuring reliability during drought. However, a significant fraction of water utility income is often derived from the volumetric sale of water, and such restrictions can lead to substantial revenue losses. Given that many utilities set prices at levels commensurate with recovering costs, these revenue losses can leave them financially vulnerable to budgetary shortfalls during drought. This work explores approaches for mitigating drought-related revenue losses through the use of third-party financial insurance contracts based on weather derivatives. Two different types of contracts are developed, and their efficacy is compared against two more traditional forms of financial hedging used by water utilities: drought surcharges and contingency funds (i.e. self insurance). Strategies involving each of these approaches, as well as their use in combination, are applied under conditions facing the water utility serving Durham, North Carolina. A multi-reservoir model provides information on the scale and timing of droughts, with the financial effects of these events simulated using detailed data derived from utility billing records. Results suggest that third-party derivative contracts, either independently or in combination with more traditional hedging tools (i.e. surcharges, contingency funds), can provide an effective means of reducing a utility's financial vulnerability to drought.

  14. Managing water utility financial risks through third-party index insurance contracts

    NASA Astrophysics Data System (ADS)

    Zeff, Harrison B.; Characklis, Gregory W.

    2013-08-01

    As developing new supply capacity has become increasingly expensive and difficult to permit (i.e., regulatory approval), utilities have become more reliant on temporary demand management programs, such as outdoor water use restrictions, for ensuring reliability during drought. However, a significant fraction of water utility income is often derived from the volumetric sale of water, and such restrictions can lead to substantial revenue losses. Given that many utilities set prices at levels commensurate with recovering costs, these revenue losses can leave them financially vulnerable to budgetary shortfalls. This work explores approaches for mitigating drought-related revenue losses through the use of third-party financial insurance contracts based on streamflow indices. Two different types of contracts are developed, and their efficacy is compared against two more traditional forms of financial hedging used by water utilities: Drought surcharges and contingency funds (i.e., self-insurance). Strategies involving each of these approaches, as well as their use in combination, are applied under conditions facing the water utility serving Durham, North Carolina. A multireservoir model provides information on the scale and timing of droughts, and the financial effects of these events are simulated using detailed data derived from utility billing records. Results suggest that third-party index insurance contracts, either independently or in combination with more traditional hedging tools, can provide an effective means of reducing a utility's financial vulnerability to drought.
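The payout structure of a streamflow-index insurance contract can be sketched in a few lines. This is a hypothetical linear contract for illustration only, not the authors' specification: the utility receives a payout when the annual streamflow index falls below a contracted strike level, scaled by a dollars-per-unit rate and optionally capped.

```python
# Hypothetical linear index-insurance payout (illustrative, not the
# authors' contract design): payout grows with the shortfall of the
# streamflow index below a contracted strike level.

def index_payout(index_value, strike, dollars_per_unit, cap=None):
    """Payout is linear in the shortfall below the strike, optionally capped."""
    shortfall = max(0.0, strike - index_value)
    payout = shortfall * dollars_per_unit
    if cap is not None:
        payout = min(payout, cap)
    return payout

# Example: strike of 500 flow units, $10,000 per unit of shortfall
print(index_payout(430.0, 500.0, 10_000.0))           # 700000.0
print(index_payout(520.0, 500.0, 10_000.0))           # 0.0 (index above strike)
print(index_payout(300.0, 500.0, 10_000.0, cap=1e6))  # 1000000.0 (capped)
```

Because the payout depends only on an observable index rather than the utility's actual losses, such contracts avoid moral hazard but carry basis risk when the index and the revenue loss diverge.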

  15. Impact of an electronic medication administration record on medication administration efficiency and errors.

    PubMed

    McComas, Jeffery; Riingen, Michelle; Chae Kim, Son

    2014-12-01

The study aims were to evaluate the impact of electronic medication administration record implementation on medication administration efficiency and occurrence of medication errors as well as to identify the predictors of medication administration efficiency in an acute care setting. A prospective, observational study utilizing a time-and-motion technique was conducted before and after electronic medication administration record implementation in November 2011. A total of 156 cases of medication administration activities (78 pre- and 78 post-electronic medication administration record) involving 38 nurses were observed at the point of care. A separate retrospective review of the hospital Midas+ medication error database was also performed to collect the rates and origin of medication errors for 6 months before and after electronic medication administration record implementation. The mean medication administration time actually increased from 11.3 to 14.4 minutes post-electronic medication administration record (P = .039). In a multivariate analysis, electronic medication administration record was not a predictor of medication administration time, but the distractions/interruptions during medication administration process were significant predictors. The mean hospital-wide medication error rate significantly decreased from 11.0 to 5.3 events per month post-electronic medication administration record (P = .034). Although no improvement in medication administration efficiency was observed, electronic medication administration record improved the quality of care with a significant decrease in medication errors.

  16. Detection of cough signals in continuous audio recordings using hidden Markov models.

    PubMed

    Matos, Sergio; Birring, Surinder S; Pavord, Ian D; Evans, David H

    2006-06-01

    Cough is a common symptom of many respiratory diseases. The evaluation of its intensity and frequency of occurrence could provide valuable clinical information in the assessment of patients with chronic cough. In this paper we propose the use of hidden Markov models (HMMs) to automatically detect cough sounds from continuous ambulatory recordings. The recording system consists of a digital sound recorder and a microphone attached to the patient's chest. The recognition algorithm follows a keyword-spotting approach, with cough sounds representing the keywords. It was trained on 821 min selected from 10 ambulatory recordings, including 2473 manually labeled cough events, and tested on a database of nine recordings from separate patients with a total recording time of 3060 min and comprising 2155 cough events. The average detection rate was 82% at a false alarm rate of seven events/h, when considering only events above an energy threshold relative to each recording's average energy. These results suggest that HMMs can be applied to the detection of cough sounds from ambulatory patients. A postprocessing stage to perform a more detailed analysis on the detected events is under development, and could allow the rejection of some of the incorrectly detected events.
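The abstract's final screening step, keeping only events above an energy threshold relative to the recording's average energy, can be illustrated with a simple frame-energy detector. This is a simplified sketch with made-up parameters, not the authors' HMM keyword spotter:

```python
# Simplified energy-threshold screening (illustrative; the study used
# HMM-based keyword spotting, which this sketch does not implement):
# flag frames whose short-term energy exceeds a multiple of the
# recording's average energy, then merge consecutive frames into events.

def detect_energy_events(samples, frame_len=256, ratio=4.0):
    frames = [samples[i:i + frame_len]
              for i in range(0, len(samples) - frame_len + 1, frame_len)]
    energies = [sum(x * x for x in f) / frame_len for f in frames]
    mean_energy = sum(energies) / len(energies)
    above = [e > ratio * mean_energy for e in energies]
    # Merge runs of above-threshold frames into (start_frame, end_frame) pairs
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            events.append((start, i - 1))
            start = None
    if start is not None:
        events.append((start, len(above) - 1))
    return events
```

A real system would pass the surviving events to a classifier (such as the HMMs described above) for a more detailed accept/reject decision.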

  17. Financial impact of inaccurate Adverse Event recording post Hip Fracture surgery: Addendum to 'Adverse event recording post hip fracture surgery'.

    PubMed

    Lee, Matthew J; Doody, Kevin; Mohamed, Khalid M S; Butler, Audrey; Street, John; Lenehan, Brian

    2018-02-15

A 2011 study (Doody et al., Ir Med J 106(10):300-302, 2013) compared inpatient adverse events recorded prospectively at the point of care with adverse events recorded by the national Hospital In-Patient Enquiry (HIPE) system. In the study, a single-centre university hospital in Ireland treating acute hip fractures in an orthopaedic unit recorded 39 patients over a 2-month (August-September 2011) period, with 55 adverse events recorded prospectively in contrast to the HIPE record of 13 (23.6%) adverse events. With the recent change in the Irish hospital funding model from block grant to an 'activity-based funding' on the basis of case load and case complexity, the hospital financial allocation is dependent on accurate case complexity coding. A retrospective assessment of the financial implications of the two methods of adverse incident recording was carried out. A total of €39,899 in 'missed funding' for 2 months was calculated when the ward-based, prospectively collected data was compared to the national HIPE data. Accurate data collection is paramount in facilitating activity-based funding, to improve patient care and ensure the appropriate allocation of resources.

  18. Misinterpretation of lateral acoustic variations on high-resolution seismic reflection profiles as fault offsets of Holocene bay mud beneath the southern part of San Francisco Bay, California

    USGS Publications Warehouse

    Marlow, M. S.; Hart, P.E.; Carlson, P.R.; Childs, J. R.; Mann, D. M.; Anima, R.J.; Kayen, R.E.

    1996-01-01

We collected high-resolution seismic reflection profiles in the southern part of San Francisco Bay in 1992 and 1993 to investigate possible Holocene faulting along postulated transbay bedrock fault zones. The initial analog records show apparent offsets of reflection packages along sharp vertical boundaries. These records were originally interpreted as showing a complex series of faults along closely spaced, sharp vertical boundaries in the upper 10 m (0.013 s two-way travel time) of Holocene bay mud. A subsequent survey in 1994 was run with a different seismic reflection system, which utilized a higher power source. This second system generated records with deeper penetration (max. 20 m, 0.026 s two-way travel time) and demonstrated that the reflections originally interpreted as offsets caused by faulting were actually laterally continuous reflection horizons. The pitfall in the original interpretations was caused by lateral variations in the amplitude brightness of reflection events, coupled with a long (greater than 15 ms) source signature of the low-power system. These effects combined to show apparent offsets of reflection packages along sharp vertical boundaries. These boundaries, as shown by the second system, in fact occur where the reflection amplitude diminishes abruptly on laterally continuous reflection events. This striking lateral variation in reflection amplitude is attributable to the localized presence of biogenic(?) gas.

  19. Verification of OpenSSL version via hardware performance counters

    NASA Astrophysics Data System (ADS)

    Bruska, James; Blasingame, Zander; Liu, Chen

    2017-05-01

Many forms of malware and security breaches exist today. One type of breach downgrades a cryptographic program by employing a man-in-the-middle attack. In this work, we explore the utilization of hardware events in conjunction with machine learning algorithms to detect which version of OpenSSL is being run during the encryption process. This allows for the immediate detection of any unknown downgrade attacks in real time. Our experimental results indicated this detection method is both feasible and practical. When trained with normal TLS and SSL data, our classifier was able to detect which protocol was being used with 99.995% accuracy. After the scope of the hardware event recording was enlarged, the accuracy diminished greatly, to 53.244%. Upon removal of TLS 1.1 from the data set, the accuracy returned to 99.905%.
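A toy version of counter-based protocol classification might look like the following. The counter values and the nearest-centroid rule are purely illustrative (the abstract does not specify the features or learning algorithm used); real counts would come from a profiling tool such as perf.

```python
# Illustrative sketch (not the paper's classifier): distinguish protocol
# versions from hardware performance counter vectors with a nearest-centroid
# rule. All counter values below are synthetic.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labeled):
    """labeled: {label: [feature vectors]} -> {label: centroid}"""
    return {label: centroid(vs) for label, vs in labeled.items()}

def classify(model, v):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], v))

# Synthetic (instructions, branch-misses, cache-misses) counts per run
training = {
    "TLS 1.2": [[900, 40, 12], [920, 42, 11], [880, 39, 13]],
    "SSL 3.0": [[600, 70, 30], [620, 72, 28], [590, 68, 31]],
}
model = train(training)
print(classify(model, [905, 41, 12]))  # "TLS 1.2"
```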

  20. Desktop microsimulation: a tool to improve efficiency in the medical office practice.

    PubMed

    Montgomery, James B; Linville, Beth A; Slonim, Anthony D

    2013-01-01

    Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
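The core of discrete-event simulation can be sketched in a few lines with a priority queue of future events. The arrival and service parameters below are hypothetical, and this is not the commercial simulation package or clinic data used in the study:

```python
# Minimal discrete-event sketch (hypothetical parameters, not the study's
# model): patients arrive at known times and are served by a limited pool
# of exam rooms; we record how long each patient waits for a room.

import heapq

def simulate(arrivals, service_time, n_rooms):
    """arrivals: sorted arrival times; returns per-patient waiting times."""
    free_at = [0.0] * n_rooms          # min-heap of times each room frees up
    heapq.heapify(free_at)
    waits = []
    for t in arrivals:
        room_free = heapq.heappop(free_at)
        start = max(t, room_free)      # patient waits if no room is free yet
        waits.append(start - t)
        heapq.heappush(free_at, start + service_time)
    return waits

print(simulate([0, 5, 10, 15, 20], service_time=12, n_rooms=2))  # [0, 0, 2, 2, 4]
```

Scenario comparison then amounts to rerunning the model with different room counts or process changes and comparing queue and utilization statistics.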

  1. Prevalence of gender identity disorder and suicide risk among transgender veterans utilizing veterans health administration care.

    PubMed

    Blosnich, John R; Brown, George R; Shipherd Phd, Jillian C; Kauth, Michael; Piegari, Rebecca I; Bossarte, Robert M

    2013-10-01

    We estimated the prevalence and incidence of gender identity disorder (GID) diagnoses among veterans in the Veterans Health Administration (VHA) health care system and examined suicide risk among veterans with a GID diagnosis. We examined VHA electronic medical records from 2000 through 2011 for 2 official ICD-9 diagnosis codes that indicate transgender status. We generated annual period prevalence estimates and calculated incidence using the prevalence of GID at 2000 as the baseline year. We cross-referenced GID cases with available data (2009-2011) of suicide-related events among all VHA users to examine suicide risk. GID prevalence in the VHA is higher (22.9/100 000 persons) than are previous estimates of GID in the general US population (4.3/100 000 persons). The rate of suicide-related events among GID-diagnosed VHA veterans was more than 20 times higher than were rates for the general VHA population. The prevalence of GID diagnosis nearly doubled over 10 years among VHA veterans. Research is needed to examine suicide risk among transgender veterans and how their VHA utilization may be enhanced by new VA initiatives on transgender care.
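The period-prevalence arithmetic behind the figures above is straightforward. The case and population counts in this sketch are hypothetical; only the resulting rate of 22.9 per 100,000 comes from the abstract:

```python
# Prevalence per 100,000 persons (illustrative counts; the real numerator
# and denominator are not given in the abstract).

def prevalence_per_100k(cases, population):
    return 100_000 * cases / population

# e.g. ~1,172 diagnosed cases among ~5.12 million VHA users
rate = prevalence_per_100k(1172, 5_120_000)
print(round(rate, 1))  # 22.9
```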

  2. Prevalence of Gender Identity Disorder and Suicide Risk Among Transgender Veterans Utilizing Veterans Health Administration Care

    PubMed Central

    Brown, George R.; Shipherd, PhD, Jillian C.; Kauth, Michael; Piegari, Rebecca I.; Bossarte, Robert M.

    2013-01-01

    Objectives. We estimated the prevalence and incidence of gender identity disorder (GID) diagnoses among veterans in the Veterans Health Administration (VHA) health care system and examined suicide risk among veterans with a GID diagnosis. Methods. We examined VHA electronic medical records from 2000 through 2011 for 2 official ICD-9 diagnosis codes that indicate transgender status. We generated annual period prevalence estimates and calculated incidence using the prevalence of GID at 2000 as the baseline year. We cross-referenced GID cases with available data (2009–2011) of suicide-related events among all VHA users to examine suicide risk. Results. GID prevalence in the VHA is higher (22.9/100 000 persons) than are previous estimates of GID in the general US population (4.3/100 000 persons). The rate of suicide-related events among GID-diagnosed VHA veterans was more than 20 times higher than were rates for the general VHA population. Conclusions. The prevalence of GID diagnosis nearly doubled over 10 years among VHA veterans. Research is needed to examine suicide risk among transgender veterans and how their VHA utilization may be enhanced by new VA initiatives on transgender care. PMID:23947310

  3. Hyperinsulin therapy for calcium channel antagonist poisoning: a seven-year retrospective study.

    PubMed

    Espinoza, Tamara R; Bryant, Sean M; Aks, Steve E

    2013-01-01

The use of hyperinsulin therapy (HIT) in severe calcium channel antagonist (CCA) poisoning has become a more common therapy within the last decade. The objective of this study is to report 7 years of experience recommending HIT. This was a retrospective chart review utilizing our regional poison center (RPC) data from January 1, 2002, through December 31, 2008. All cases of CCA poisoning receiving HIT were searched. Endpoints included the number of CCA cases utilizing HIT, insulin dose, time of initiation of HIT, patient outcome, adverse events, age, glucose concentration, and lowest systolic blood pressure recorded. Forty-six cases of CCA poisoning were managed with HIT over 7 years. All the patients received standard antidotal therapy (i.e., intravenous fluids, calcium salts, glucagon, and pressors). HIT administration followed our RPC recommendation 23 times (50%), and no hypoglycemic events occurred. Means (age, highest glucose measured, and lowest systolic blood pressure measured) were 51 years, 282 mg/dL, and 74 mm Hg, respectively. Our RPC recommendations for HIT were followed 50% of the time over the last 7 years. In light of the lack of hypoglycemia associated with HIT in our study population, we recommend HIT as an early and safe antidote in significant CCA poisoning.

  4. Tropical and high latitude forcing of enhanced megadroughts in Northern China during the last four terminations

    NASA Astrophysics Data System (ADS)

    Tang, Changyan; Yang, Huan; Pancost, Richard D.; Griffiths, Michael L.; Xiao, Guoqiao; Dang, Xinyue; Xie, Shucheng

    2017-12-01

Understanding the origin and evolutionary history of drought events is of great significance, providing critical insight into future hydrological conditions under the changing climate. Due to the scarcity of drought proxies from northern China, the occurrence and underlying mechanisms of the drought events remain enigmatic on longer timescales. Here we utilize microbial lipid proxies to reconstruct significant drought events over the last four ice age terminations in the southernmost section (Weinan section) of the Chinese Loess Plateau. The abundance of archaeal isoprenoid GDGTs (glycerol dialkyl glycerol tetraethers) relative to bacterial branched GDGTs, measured by Ri/b and BIT indices, is diagnostic of enhanced drought conditions. The Ri/b (and BIT) indices are stable and low (high) throughout most of the loess section spanning the last 350 thousand years, but they do exhibit sharp transient peaks (valleys) during the intervals associated with the four ice age terminations, and especially Terminations II and IV. These enhanced drought events are, non-intuitively, associated with a significant decrease in the relative abundance of C4 plants, inferred from a decrease in the carbon isotope composition of bulk organic matter. Although the microbial records show some consistency with the Weinan grain size profiles, indicative of Eastern Asian winter monsoon variability, they also show some apparent differences. In fact, some features of the microbial records exhibit strong similarities with marine sediment planktonic foraminiferal δ13C records from the western Pacific warm pool, which reflect ENSO-like changes during glacial terminations. Therefore, enhanced droughts immediately before the interglacial warming in northern China could be explained, at least in part, by teleconnections in tropical ocean-atmosphere circulation via shifts in the Intertropical Convergence Zone (ITCZ) and associated Jet Stream over the Asian continent. 
According to our microbial biomarker data, these enhanced megadroughts are apparently different, both in terms of severity and causal mechanism, from the more commonly discussed dry conditions observed during glacial periods.

  5. Dual patch voltage clamp study of low membrane resistance astrocytes in situ.

    PubMed

    Ma, Baofeng; Xu, Guangjin; Wang, Wei; Enyeart, John J; Zhou, Min

    2014-03-17

    Whole-cell patch clamp recording has been successfully used in identifying the voltage-dependent gating and conductance properties of ion channels in a variety of cells. However, this powerful technique is of limited value in studying low membrane resistance cells, such as astrocytes in situ, because of the inability to control or accurately measure the real amplitude of command voltages. To facilitate the study of ionic conductances of astrocytes, we have developed a dual patch recording method which permits membrane current and membrane potential to be simultaneously recorded from astrocytes in spite of their extraordinarily low membrane resistance. The utility of this technique is demonstrated by measuring the voltage-dependent activation of the inwardly rectifying K+ current abundantly expressed in astrocytes and multiple ionic events associated with astrocytic GABAA receptor activation. This protocol can be performed routinely in the study of astrocytes. This method will be valuable for identifying and characterizing the individual ion channels that orchestrate the electrical activity of low membrane resistance cells.

  6. Extraction of AE events to estimate their b values under a triaxial compressive condition Examination using continuous broadband records

    NASA Astrophysics Data System (ADS)

    Yoneda, N.; Kawakata, H.; Hirano, S.; Yoshimitsu, N.; Takahashi, N.

    2017-12-01

Seismic b values estimated in previous laboratory compressive tests have been utilized in natural earthquake studies. A sufficiently large number of randomly sampled events over a wide magnitude range is essential for accurate b value estimation. In former triaxial tests, PZTs had sensitivity only in a narrow frequency range. In addition, the recording system could not extract all signals because of mask times or threshold settings. Recently, Yoshimitsu et al. (2014) enabled the use of broadband transducers under triaxial conditions and acquired waveforms continuously over several hours. With such a system, they estimated the seismic moments of AEs at very small magnitudes. We expected that their continuous broadband recording system would make it possible to record many more AEs over a wider magnitude range for credible b value estimation in the laboratory. In this study, we performed a compressive test under a higher confining pressure as an update of the experiment of Yoshimitsu et al. (2014) and extracted a sufficient number of AEs. We prepared an intact cylindrical Westerly Granite sample, 100 mm long by 50 mm in diameter. We conducted a triaxial compressive test under a confining pressure of 50 MPa, at room temperature and under dry conditions. Seven broadband transducers (sensitive range: 100 kHz - 1,000 kHz) were mounted at different heights. In addition, a PZT was mounted to transmit elastic waves for velocity estimation during the experiment. We first increased the confining pressure and then started the loading, switching the load control method from axial load control to circumferential displacement control. After the peak stress was exceeded, the compressive stress was rapidly unloaded and the sample was recovered. A potential fault was observed on the recovered sample surface. Waveform recording continued throughout the test for more than 200 minutes. 
Extracting signals with an STA/LTA ratio method from the waveforms recorded by each transducer, we detected at most about 2,170,000 signals and at least about 450,000. The recorded waveforms may also include elastic waves from the PZT and electrical noise. To find combinations of signals derived from the same event, we used the largest differences in travel times for all transducer pairs. Finally, we obtained about 450,000 combinations.
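The STA/LTA screening step mentioned above can be sketched as follows. Window lengths and the trigger ratio here are illustrative, not the study's actual parameters:

```python
# Minimal STA/LTA trigger sketch (illustrative parameters): flag samples
# where the short-term average of |x| exceeds the long-term average by a
# threshold ratio, indicating a sudden amplitude increase.

def sta_lta_triggers(x, n_sta=5, n_lta=50, ratio=4.0):
    triggers = []
    for i in range(n_lta, len(x)):
        sta = sum(abs(v) for v in x[i - n_sta:i]) / n_sta
        lta = sum(abs(v) for v in x[i - n_lta:i]) / n_lta
        if lta > 0 and sta / lta > ratio:
            triggers.append(i)
    return triggers
```

Production detectors (e.g. ObsPy's recursive STA/LTA) compute the averages incrementally for speed, but the trigger logic is the same.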

  7. Attribution of extreme weather and climate-related events.

    PubMed

    Stott, Peter A; Christidis, Nikolaos; Otto, Friederike E L; Sun, Ying; Vanderlinden, Jean-Paul; van Oldenborgh, Geert Jan; Vautard, Robert; von Storch, Hans; Walton, Peter; Yiou, Pascal; Zwiers, Francis W

    2016-01-01

    Extreme weather and climate-related events occur in a particular place, by definition, infrequently. It is therefore challenging to detect systematic changes in their occurrence given the relative shortness of observational records. However, there is a clear interest from outside the climate science community in the extent to which recent damaging extreme events can be linked to human-induced climate change or natural climate variability. Event attribution studies seek to determine to what extent anthropogenic climate change has altered the probability or magnitude of particular events. They have shown clear evidence for human influence having increased the probability of many extremely warm seasonal temperatures and reduced the probability of extremely cold seasonal temperatures in many parts of the world. The evidence for human influence on the probability of extreme precipitation events, droughts, and storms is more mixed. Although the science of event attribution has developed rapidly in recent years, geographical coverage of events remains patchy and based on the interests and capabilities of individual research groups. The development of operational event attribution would allow a more timely and methodical production of attribution assessments than currently obtained on an ad hoc basis. For event attribution assessments to be most useful, remaining scientific uncertainties need to be robustly assessed and the results clearly communicated. This requires the continuing development of methodologies to assess the reliability of event attribution results and further work to understand the potential utility of event attribution for stakeholder groups and decision makers. WIREs Clim Change 2016, 7:23-41. doi: 10.1002/wcc.380 For further resources related to this article, please visit the WIREs website.
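The attribution question, how much a human influence altered the probability of an event, is commonly quantified with the probability ratio and the fraction of attributable risk. These are standard metrics in the event-attribution literature, shown here as general formulas rather than results from the review:

```python
# Standard event-attribution metrics: p1 is the event probability in the
# factual (human-influenced) climate, p0 in the counterfactual climate.
# PR = p1 / p0; FAR = 1 - p0 / p1. Example probabilities are made up.

def probability_ratio(p1, p0):
    return p1 / p0

def fraction_attributable_risk(p1, p0):
    return 1.0 - p0 / p1

# Example: an extreme season is three times more likely with forcing
print(probability_ratio(0.03, 0.01))            # ~3.0
print(fraction_attributable_risk(0.03, 0.01))   # ~0.667
```

A FAR of 0.667 would be read as "two thirds of the event's probability is attributable to the forcing", subject to the uncertainties the review discusses.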

  8. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.

Precise analysis of both (S)TEM images and video are time and labor intensive processes. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
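One simple way to turn a per-frame encoder statistic into an event detector is outlier flagging. This is a hedged sketch of the general idea, not the authors' software, and the per-frame bit counts below are synthetic:

```python
# Illustrative anomaly flagging on one per-frame compression statistic
# (e.g. intra-texture bits): frames whose value deviates strongly from
# the series mean are flagged as candidate events. Not the authors' tool.

def flag_events(stat, z_thresh=2.5):
    n = len(stat)
    mean = sum(stat) / n
    std = (sum((v - mean) ** 2 for v in stat) / n) ** 0.5
    return [i for i, v in enumerate(stat)
            if std > 0 and abs(v - mean) / std > z_thresh]

# Synthetic per-frame bit counts with one abrupt change at frame 6
bits = [100, 102, 98, 101, 99, 100, 400, 101, 100, 99]
print(flag_events(bits))  # [6]
```

Combining several of the 15 logged statistics, as the abstract describes, would simply mean flagging frames that are anomalous in more than one series.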

  9. Hydroclimate variability and regional atmospheric circulation over the past 1,350 years reconstructed from Lake Ohau, New Zealand

    NASA Astrophysics Data System (ADS)

    Roop, H. A.; Levy, R. H.; Vandergoes, M.; Dunbar, G. B.; Howarth, J. D.; Lorrey, A.; Phipps, S. J.

    2016-12-01

    Comprehensive understanding of natural climate-system dynamics requires high-resolution paleoclimate records extending beyond the instrumental period. This is particularly the case for the sparsely-instrumented Southern Hemisphere mid-latitudes, where the timing and amplitude of regional and hemispheric-scale climatic events are poorly constrained. Here we present a 1,350-year record of hydroclimatic variability and regional circulation derived from an annually laminated sediment record from Lake Ohau, South Island, New Zealand (44.23°S, 169.85°E). The climate of New Zealand is influenced by climatological patterns originating in both the tropics (e.g. El-Niño-Southern Oscillation, Interdecadal Pacific Oscillation) and the Antarctic (Southern Annular Mode, SAM). Utilizing the annually resolved Lake Ohau hydroclimate record in combination with a tree-ring record of summer temperature from Oroko Swamp, New Zealand (Cook et al., 2002), we generate a circulation index for the Western South Island of New Zealand. This index utilizes the temperature and precipitation anomalies defined by the Regional Climate Regime Classification scheme for New Zealand to assign synoptic scale circulation patterns to 25-year intervals from 900-2000 AD. This circulation index shows significant periods of change, most notably 835 - 985 AD when northerly airflow dominated and from 1385 - 1710 AD when strong southerly airflow persisted. Comparisons with regional SAM and ENSO reconstructions show that dry, warm conditions at Lake Ohau are consistently associated with strengthened tropical teleconnections to New Zealand and a positive SAM, while cold and wet conditions are driven by increased southerly airflow and negative phase SAM. A persistent negative SAM dominates the Little Ice Age (LIA; 1385-1710 AD) interval in the Western South Island. This same period coincides with the Northern Hemisphere LIA.

  10. Utilizing Chinese Admission Records for MACE Prediction of Acute Coronary Syndrome

    PubMed Central

    Hu, Danqing; Huang, Zhengxing; Chan, Tak-Ming; Dong, Wei; Lu, Xudong; Duan, Huilong

    2016-01-01

    Background: Clinical major adverse cardiovascular event (MACE) prediction for acute coronary syndrome (ACS) is important for a number of applications, including physician decision support, quality-of-care assessment, and efficient healthcare service delivery for ACS patients. Admission records, as the typical media containing clinical information on patients at the early stage of their hospitalizations, offer significant potential for proactive MACE prediction. Methods: We propose a hybrid approach for MACE prediction that utilizes a large volume of admission records. First, both a rule-based medical language processing method and a machine learning method (Conditional Random Fields, CRFs) are developed to extract essential patient features from unstructured admission records. After that, state-of-the-art supervised machine learning algorithms are applied to construct MACE prediction models from the data. Results: We comparatively evaluate the performance of the proposed approach on a real clinical dataset consisting of 2930 ACS patient samples collected from a Chinese hospital. Our best model achieved 72% AUC in MACE prediction. Compared with two well-known ACS risk score tools, GRACE and TIMI, our learned models performed better by a significant margin. Conclusions: Experimental results reveal that our approach obtains competitive performance in MACE prediction. The comparison of classifiers indicates that the proposed approach generalizes well across datasets extracted by different feature extraction methods. Furthermore, our MACE prediction model obtained a significant improvement over both GRACE and TIMI. This indicates that admission records can effectively support MACE prediction for ACS patients at the early stage of their hospitalizations. PMID:27649220
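
    The two-stage pipeline described above (feature extraction from free-text admission notes, then supervised risk scoring evaluated by AUC) can be sketched in miniature. This is an illustrative toy, not the authors' implementation: the keywords, weights, and sample notes are invented, and a simple rule-based extractor plus a fixed linear score stand in for the CRF extractor and trained classifiers.

```python
# Hypothetical sketch of the two-stage MACE-prediction pipeline:
# (1) extract structured features from free-text admission records with
#     simple rules (the paper also uses CRFs for this step), then
# (2) score risk with a supervised model and evaluate by AUC.
# All feature names, keywords, weights, and records are invented examples.

import re

def extract_features(note: str) -> dict:
    """Rule-based extraction of a few binary risk features from text."""
    text = note.lower()
    return {
        "diabetes": int("diabetes" in text),
        "prior_mi": int(re.search(r"prior (mi|myocardial infarction)", text) is not None),
        "st_elevation": int("st elevation" in text),
    }

def risk_score(features: dict) -> float:
    """Toy linear risk model standing in for the trained classifier."""
    weights = {"diabetes": 0.8, "prior_mi": 1.2, "st_elevation": 1.5}
    return sum(weights[k] * v for k, v in features.items())

def auc(scores, labels):
    """Rank-based AUC: P(score of a random positive > score of a random negative)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

notes = [
    ("Patient with diabetes and prior MI, chest pain on arrival.", 1),
    ("ST elevation noted in leads V1-V4; diabetes mellitus type 2.", 1),
    ("Atypical chest discomfort, no cardiac history.", 0),
    ("Mild dyspnea, normal ECG on admission.", 0),
]
scores = [risk_score(extract_features(n)) for n, _ in notes]
labels = [y for _, y in notes]
print(round(auc(scores, labels), 2))  # 1.0 on this perfectly separable toy sample
```

    On real data the extractor and classifier would of course be learned and validated; the sketch only shows how text-derived features feed a ranking score that an AUC can evaluate.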

  11. The timing, two-pulsed nature, and variable climatic expression of the 4.2 ka event: A review and new high-resolution stalagmite data from Namibia

    NASA Astrophysics Data System (ADS)

    Railsback, L. Bruce; Liang, Fuyuan; Brook, G. A.; Voarintsoa, Ny Riavo G.; Sletten, Hillary R.; Marais, Eugene; Hardt, Ben; Cheng, Hai; Edwards, R. Lawrence

    2018-04-01

    The climatic event between 4.2 and 3.9 ka BP known as the "4.2 ka event" is commonly considered to be a synchronous global drought that happened as one pulse. However, careful comparison of records from around the world shows that synchrony is possible only if the published chronologies of the various records are shifted to the extent allowed by the uncertainties of their age data, that several records suggest a two-pulsed event, and that some records suggest a wet rather than dry event. The radiometric ages constraining those records have uncertainties of several decades if not hundreds of years, and in some records the event is represented by only one or two analyses. This paper reports a new record from Stalagmite DP1 from northeastern Namibia in which high 230Th/232Th activity ratios allow small age uncertainties ranging between only 10 and 28 years, and the event is documented by more than 35 isotopic analyses and by petrographic observation of a surface of dissolution. The ages from Stalagmite DP1 combine with results from 11 other records from around the world to suggest an event centered at about 4.07 ka BP with bracketing ages of 4.15 to 3.93 ka BP. The isotopic and petrographic results suggest a two-pulsed wet event in northeastern Namibia, which is in the Southern Hemisphere's summer rainfall zone where more rain presumably fell with southward migration of the Inter-Tropical Convergence Zone as the result of cooling in the Northern Hemisphere. Comparison with other records from outside the region of dryness from the Mediterranean to eastern Asia suggests that multiple climatic zones similarly moved southward during the event, in some cases bringing wetter conditions that contradict the notion of global drought.

  12. Insertable cardiac event recorder in detection of atrial fibrillation after cryptogenic stroke: an audit report.

    PubMed

    Etgen, Thorleif; Hochreiter, Manfred; Mundel, Markus; Freudenberger, Thomas

    2013-07-01

    Atrial fibrillation (AF) is the most frequent risk factor for ischemic stroke but often remains undetected. We analyzed the value of an insertable cardiac event recorder for the detection of AF in a 1-year cohort of patients with cryptogenic ischemic stroke. All patients with cryptogenic stroke who were eligible for oral anticoagulation were offered the insertion of a cardiac event recorder. Regular follow-up over 1 year recorded the incidence of AF. Of the 393 patients with ischemic stroke, 65 (16.5%) had a cryptogenic stroke, and in 22 eligible patients an event recorder was inserted. After 1 year, AF was detected in 6 of 22 patients (27.3%). These preliminary data show that insertion of a cardiac event recorder was feasible in approximately one third of patients with cryptogenic stroke and detected new AF in approximately one quarter of those patients.

  13. Tephra studies and the reconstruction of Middle-to-Upper Paleolithic cultural trajectories

    NASA Astrophysics Data System (ADS)

    d'Errico, Francesco; Banks, William E.

    2015-06-01

    This study describes an approach which combines tephra records with archaeological and contextual data in order to propose best fit scenarios for past cultural changes and population events. With this goal in mind, we critically examine the environmental, archaeological, anthropological, and chronometric records of the Middle-to-Upper Paleolithic (MUP) Transition (45-35 ka) in Europe and identify a number of shortcomings that make it difficult to correlate and interpret current evidence with respect to historical processes. The utility and limitations of tephra records are highlighted and a heuristic strategy, designed to merge evidence from tephra and other proxies, is described. Such a strategy is used to explore the stratigraphic and chronological relationship between the Campanian Ignimbrite (CI) eruption and the cultural changes that occurred during the MUP Transition in Southern Europe. Uncertainties pertaining to the timing of this volcanic event are discussed before summarizing the stratigraphic and cultural sequences of the eleven archaeological sites (Haua Fteah, Kozarnica, Franchthi Cave, Klissoura, Golema Pesht, Cavallo, Serino, Castelcivita, Tabula Traiana, Temnata, Kostenki 14) where the CI tephra has been reliably identified, along with three sites (Uluzzo, Uluzzo C, Bernardini) where such an identification remains tentative. We conclude that if one discards as inconclusive the recent attribution of the Uluzzian to modern humans, the best fit historical scenario that stems from a critical reading of the evidence identifies the Uluzzian as the result of in-situ cultural evolution of late Mousterian populations in this region of Southern Europe. Such evolution, which entails the independent development of cultural innovations typically found in subsequent cultures of the Upper Paleolithic, would have been truncated, before the CI event, by the arrival of modern or Neanderthal-modern hybrid populations bearing the Proto-Aurignacian material culture.

  14. High density event-related potential data acquisition in cognitive neuroscience.

    PubMed

    Slotnick, Scott D

    2010-04-16

    Functional magnetic resonance imaging (fMRI) is currently the standard method of evaluating brain function in the field of Cognitive Neuroscience, in part because fMRI data acquisition and analysis techniques are readily available. Because fMRI has excellent spatial resolution but poor temporal resolution, this method can only be used to identify the spatial location of brain activity associated with a given cognitive process (and reveals virtually nothing about the time course of brain activity). By contrast, event-related potential (ERP) recording, a method that is used much less frequently than fMRI, has excellent temporal resolution and thus can track rapid temporal modulations in neural activity. Unfortunately, ERPs are underutilized in Cognitive Neuroscience because data acquisition techniques are not readily available and low density ERP recording has poor spatial resolution. In an effort to foster the increased use of ERPs in Cognitive Neuroscience, the present article details key techniques involved in high density ERP data acquisition. Critically, high density ERPs offer the promise of excellent temporal resolution and good spatial resolution (or excellent spatial resolution if coupled with fMRI), which is necessary to capture the spatiotemporal dynamics of human brain function.

  15. Characteristics of Recent Tsunamis

    NASA Astrophysics Data System (ADS)

    Sweeney, A. D.; Eble, M. C.; Mungov, G.

    2017-12-01

    How long do tsunamis impact a coast? How often is the largest tsunami wave the first to arrive? How do measurements in the far field differ from those made close to the source? Extending the study of Eblé et al. (2015), who showed the prevalence of a leading negative phase, we assimilate and summarize characteristics of known tsunami events recorded on bottom pressure and coastal water level stations throughout the world oceans to answer these and other questions. An extensive repository of data from the National Centers for Environmental Information (NCEI) archive for tsunami-ready U.S. tide gauge stations, housing more than 200 sites going back 10 years, is utilized, as are some of the more than 3000 marigrams (analog or paper tide gauge records) for tsunami events. The focus of our study is on five tsunamis generated by earthquakes: 2010 Chile (Maule), 2011 East Japan (Tohoku), 2012 Haida Gwaii, 2014 Chile (Iquique), and 2015 Central Chile, and one meteorologically generated tsunami in June 2013 along the U.S. East Coast and Caribbean. Reference: Eblé, M., Mungov, G. & Rabinovich, A. On the Leading Negative Phase of Major 2010-2014 Tsunamis. Pure Appl. Geophys. (2015) 172: 3493. https://doi.org/10.1007/s00024-015-1127-5

  16. Millimeter-scale epileptiform spike propagation patterns and their relationship to seizures

    PubMed Central

    Vanleer, Ann C; Blanco, Justin A; Wagenaar, Joost B; Viventi, Jonathan; Contreras, Diego; Litt, Brian

    2016-01-01

    Objective Current mapping of epileptic networks in patients prior to epilepsy surgery utilizes electrode arrays with sparse spatial sampling (∼1.0 cm inter-electrode spacing). Recent research demonstrates that sub-millimeter, cortical-column-scale domains have a role in seizure generation that may be clinically significant. We use high-resolution, active, flexible surface electrode arrays with 500 μm inter-electrode spacing to explore epileptiform local field potential spike propagation patterns in two dimensions recorded from subdural micro-electrocorticographic signals in vivo in cat. In this study, we aimed to develop methods to quantitatively characterize the spatiotemporal dynamics of epileptiform activity at high resolution. Approach We topically administered a GABA-antagonist, picrotoxin, to induce acute neocortical epileptiform activity leading up to discrete electrographic seizures. We extracted features from local field potential spikes to characterize spatiotemporal patterns in these events. We then tested the hypothesis that two-dimensional spike patterns during seizures were different from those between seizures. Main results We showed that spatially correlated events can be used to distinguish ictal versus interictal spikes. Significance We conclude that sub-millimeter-scale spatiotemporal spike patterns reveal network dynamics that are invisible to standard clinical recordings and contain information related to seizure-state. PMID:26859260

  17. Millimeter-scale epileptiform spike propagation patterns and their relationship to seizures

    NASA Astrophysics Data System (ADS)

    Vanleer, Ann C.; Blanco, Justin A.; Wagenaar, Joost B.; Viventi, Jonathan; Contreras, Diego; Litt, Brian

    2016-04-01

    Objective. Current mapping of epileptic networks in patients prior to epilepsy surgery utilizes electrode arrays with sparse spatial sampling (∼1.0 cm inter-electrode spacing). Recent research demonstrates that sub-millimeter, cortical-column-scale domains have a role in seizure generation that may be clinically significant. We use high-resolution, active, flexible surface electrode arrays with 500 μm inter-electrode spacing to explore epileptiform local field potential (LFP) spike propagation patterns in two dimensions recorded from subdural micro-electrocorticographic signals in vivo in cat. In this study, we aimed to develop methods to quantitatively characterize the spatiotemporal dynamics of epileptiform activity at high resolution. Approach. We topically administered a GABA-antagonist, picrotoxin, to induce acute neocortical epileptiform activity leading up to discrete electrographic seizures. We extracted features from LFP spikes to characterize spatiotemporal patterns in these events. We then tested the hypothesis that two-dimensional spike patterns during seizures were different from those between seizures. Main results. We showed that spatially correlated events can be used to distinguish ictal versus interictal spikes. Significance. We conclude that sub-millimeter-scale spatiotemporal spike patterns reveal network dynamics that are invisible to standard clinical recordings and contain information related to seizure-state.

  18. An Intracratonic Record of North American Tectonics

    NASA Astrophysics Data System (ADS)

    Lovell, Thomas Rudolph

    Investigating how continents change throughout geologic time provides insight into the underlying plate tectonic process that shapes our world. Researchers aiming to understand plate tectonics typically investigate records exposed at plate margins, as these areas contain direct structural and stratigraphic information relating to tectonic plate interaction. However, these margins are also susceptible to destruction, as orogenic processes tend to punctuate records of plate tectonics. In contrast, intracratonic basins are long-lived depressions located inside cratons, shielded from the destructive forces associated with the plate tectonic process. The ability of cratonic basins to preserve sedimentological records for extended periods of geologic time makes them candidates for recording long term changes in continents driven by tectonics and eustacy. This research utilizes an intracratonic basin to better understand how the North American continent has changed throughout Phanerozoic time. This research resolves geochronologic, thermochronologic, and sedimentologic changes throughout Phanerozoic time (>500 Ma) within the intracratonic Illinois Basin detrital record. Core and outcrop sampling provide the bulk of material upon which detrital zircon geochronologic, detrital apatite thermochronologic, and thin section petrographic analyses were performed. Geochronologic evidence presented in Chapters 2 and 3 reveal the Precambrian - Cretaceous strata of the intracratonic Illinois Basin yield three detrital zircon U-Pb age assemblages. Lower Paleozoic strata yield ages corresponding to predominantly cratonic sources (Archean - Mesoproterozoic). In contrast, Middle - Upper Paleozoic strata have a dominant Appalachian orogen (Neoproterozoic - Paleozoic) signal. Cretaceous strata yield similar ages to underlying Upper Paleozoic strata. We conclude that changes in the provenance of Illinois Basin strata result from eustatic events and tectonic forcings. 
    This evidence demonstrates that changes in the detrital record of the Illinois Basin coincide with well-documented, major tectonic and eustatic events that altered and shaped North American plate margins. Chapter 4 presents 24 apatite (U-Th)/He (AHe) ages (3 - 423 Ma) taken from subsurface Cambrian and Pennsylvanian sandstones in the Illinois Basin. Time-temperature simulations used to reproduce these ages predict a basin thermal history with a maximum temperature of 170°C in post-Pennsylvanian time followed by Mesozoic cooling at 0.3°C/Myr. These thermal simulations suggest 3 km of additional post-Pennsylvanian burial (assuming a 30°C/km geotherm) followed by subsequent Mesozoic - Cenozoic removal. This burial-exhumation history is concurrent with Late Mesozoic tectonic-eustatic fluctuations, including the opening of the Atlantic and Gulf of Mexico, rejuvenation of the Appalachian region, Gulf of Mexico sediment influx, and the Cretaceous sea-level highstand. The geochronologic and thermochronologic evidence presented in the following chapters suggests the Illinois Basin potentially contains a more robust record of North American tectonics than previously thought. These observations provide a new perspective on the utility of intracratonic basins in understanding long-term changes to continental bodies.

  19. Rupture directivity of microseismic events recorded during hydraulic fracture stimulations.

    NASA Astrophysics Data System (ADS)

    Urbancic, T.; Smith-Boughner, L.; Baig, A.; Viegas, G.

    2016-12-01

    We model the dynamics of a complex rupture sequence with four sub-events. These events were recorded during hydraulic fracture stimulations in a gas-bearing shale formation. With force-balance accelerometers and 4.5 Hz and 15 Hz instruments recording the failure history, we study the directivity of the entire rupture sequence and of each sub-event. Two models are considered: unilateral and bilateral failures of penny-shaped cracks. From the seismic moment tensors of these sub-events, we consider different potential failure planes and rupture directions. Using numerical wave-propagation codes, we generate synthetic rupture sequences with both unilateral and bilateral ruptures. These are compared to the four sub-events to determine the directionality of the observed failures and the sensitivity of our recording bandwidth and geometry for distinguishing between different rupture processes. The frequency of unilateral and bilateral rupture processes throughout the fracture stimulation is estimated by comparing the directivity characteristics of the modeled sub-events to other high-quality microseismic events recorded during the same stimulation program. Understanding the failure processes of these microseismic events can provide great insight into the changes in the rock mass responsible for these complex rupture processes.

  20. Synchro-ballistic recording of detonation phenomena

    NASA Astrophysics Data System (ADS)

    Critchfield, Robert R.; Asay, Blaine W.; Bdzil, John B.; Davis, William C.; Ferm, Eric N.; Idar, Deanne J.

    1997-12-01

    Synchro-ballistic use of rotating-mirror streak cameras allows for detailed recording of high-speed events of known velocity and direction. After an introduction to the synchro-ballistic technique, this paper details two diverse applications of the technique as applied in the field of high-explosives research. In the first series of experiments detonation-front shape is recorded as the arriving detonation shock wave tilts an obliquely mounted mirror, causing reflected light to be deflected from the imaging lens. These tests were conducted for the purpose of calibrating and confirming the asymptotic detonation shock dynamics (DSD) theory of Bdzil and Stewart. The phase velocities of the events range from ten to thirty millimeters per microsecond. Optical magnification is set for optimal use of the film's spatial dimension and the phase velocity is adjusted to provide synchronization at the camera's maximum writing speed. Initial calibration of the technique is undertaken using a cylindrical HE geometry over a range of charge diameters and of sufficient length-to-diameter ratio to ensure a stable detonation wave. The final experiment utilizes an arc-shaped explosive charge, resulting in an asymmetric detonation-front record. The second series of experiments consists of photographing a shaped-charge jet having a velocity range of two to nine millimeters per microsecond. To accommodate the range of velocities it is necessary to fire several tests, each synchronized to a different section of the jet. The experimental apparatus consists of a vacuum chamber to preclude atmospheric ablation of the jet tip with shocked-argon back lighting to produce a shadowgraph image.

  1. Synchro-ballistic recording of detonation phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchfield, R.R.; Asay, B.W.; Bdzil, J.B.

    1997-09-01

    Synchro-ballistic use of rotating-mirror streak cameras allows for detailed recording of high-speed events of known velocity and direction. After an introduction to the synchro-ballistic technique, this paper details two diverse applications of the technique as applied in the field of high-explosives research. In the first series of experiments detonation-front shape is recorded as the arriving detonation shock wave tilts an obliquely mounted mirror, causing reflected light to be deflected from the imaging lens. These tests were conducted for the purpose of calibrating and confirming the asymptotic Detonation Shock Dynamics (DSD) theory of Bdzil and Stewart. The phase velocities of the events range from ten to thirty millimeters per microsecond. Optical magnification is set for optimal use of the film's spatial dimension and the phase velocity is adjusted to provide synchronization at the camera's maximum writing speed. Initial calibration of the technique is undertaken using a cylindrical HE geometry over a range of charge diameters and of sufficient length-to-diameter ratio to ensure a stable detonation wave. The final experiment utilizes an arc-shaped explosive charge, resulting in an asymmetric detonation-front record. The second series of experiments consists of photographing a shaped-charge jet having a velocity range of two to nine millimeters per microsecond. To accommodate the range of velocities it is necessary to fire several tests, each synchronized to a different section of the jet. The experimental apparatus consists of a vacuum chamber to preclude atmospheric ablation of the jet tip with shocked-argon back lighting to produce a shadow-graph image.

  2. Late Eocene impact events recorded in deep-sea sediments

    NASA Technical Reports Server (NTRS)

    Glass, B. P.

    1988-01-01

    Raup and Sepkoski proposed that mass extinctions have occurred every 26 Myr during the last 250 Myr. In order to explain this 26 Myr periodicity, it was proposed that the mass extinctions were caused by periodic increases in cometary impacts. One method to test this hypothesis is to determine if there were periodic increases in impact events (based on crater ages) that correlate with mass extinctions. A way to test the hypothesis that mass extinctions were caused by periodic increases in impact cratering is to look for evidence of impact events in deep-sea deposits. This method allows direct observation of the temporal relationship between impact events and extinctions as recorded in the sedimentary record. There is evidence in the deep-sea record for two (possibly three) impact events in the late Eocene. The younger event, represented by the North American microtektite layer, is not associated with an Ir anomaly. The older event, defined by the cpx spherule layer, is associated with an Ir anomaly. However, neither of the two impact events recorded in late Eocene deposits appears to be associated with an unusual number of extinctions. Thus there is little evidence in the deep-sea record for an impact-related mass extinction in the late Eocene.

  3. Reclaiming the past: Using hierarchical Bayesian analysis to fill missing values in the tide gauge mean sea level record, with application to extreme value analysis

    NASA Astrophysics Data System (ADS)

    Piecuch, C. G.; Huybers, P. J.; Tingley, M.

    2015-12-01

    Tide gauge records of mean sea level are some of the most valuable instrumental time series of oceanic variability and change. Yet these time series sometimes have short record lengths and intermittently missing values. Such issues can limit the utility of the data, for example, precluding rigorous analyses of return periods of extreme mean sea level events and whether they are unprecedented. With a view to filling gaps in the tide gauge mean sea level time series, we describe a hierarchical Bayesian modeling approach. The model, which is predicated on the notion of conditional probabilities, comprises three levels: a process level, which casts mean sea level as a field with spatiotemporal covariance; a data level, which represents tide gauge observations as noisy, biased versions of the true process; and a prior level, which gives prior functional forms to model parameters. Using Bayes' rule, this technique gives estimates of the posterior probability of the process and the parameters given the observations. To demonstrate the approach, we apply it to 2,967 station-years of annual mean sea level observations over 1856-2013 from 70 tide gauges along the United States East Coast from Florida to Maine (i.e., 26.8% record completeness). The model overcomes the data paucity by sharing information across space and time. The result is an ensemble of realizations, each member of which is a possible history of sea level changes at these locations over this period, which is consistent with and equally likely given the tide gauge data and underlying model assumptions. Using the ensemble of histories furnished by the Bayesian model, we identify extreme events of mean sea level change in the tide gauge time series. 
Specifically, we use the model to address the particular hypothesis (with rigorous uncertainty quantification) that a recently reported interannual sea level rise during 2008-2010 was unprecedented in the instrumental record along the northeast coast of North America, and that it had a return period of 850 years. Preliminary analysis suggests that this event was likely unprecedented on the coast of Maine in the last century.
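
    The three-level conditional structure described above (process, data, and prior levels combined via Bayes' rule) reduces, for a single station-year with Gaussian forms, to a textbook conjugate update. The sketch below shows only that reduced case; the numbers are invented, and the real model additionally shares spatiotemporal covariance across many gauges and years.

```python
# Minimal conjugate-Gaussian sketch of the three-level hierarchy:
#   prior level:   fixed hyperparameters (the variances below)
#   process level: sea_level ~ N(prior_mean, prior_var)
#   data level:    obs ~ N(sea_level, obs_var)
# All numeric values are invented for illustration.

def gaussian_posterior(prior_mean, prior_var, obs, obs_var):
    """Posterior mean/variance of the true process given one noisy
    tide gauge observation (Bayes' rule for conjugate Gaussians)."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Process prior (mm; e.g. pooled from neighboring gauges/years) plus one
# noisy observation of equal variance: the posterior mean splits the
# difference and the posterior variance is halved.
mean, var = gaussian_posterior(prior_mean=100.0, prior_var=25.0,
                               obs=112.0, obs_var=25.0)
print(mean, var)  # posterior mean ≈ 106.0 mm, variance 12.5
```

    Filling a missing station-year works the same way, except the "observation" term drops out and information flows entirely through the spatiotemporal process prior.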

  4. A diary after dinner: How the time of event recording influences later accessibility of diary events.

    PubMed

    Szőllősi, Ágnes; Keresztes, Attila; Conway, Martin A; Racsmány, Mihály

    2015-01-01

    Recording the events of a day in a diary may help improve their later accessibility. An interesting question is whether improvements in long-term accessibility will be greater if the diary is completed at the end of the day, or after a period of sleep, the following morning. We investigated this question using an internet-based diary method. On each of five days, participants (n = 109) recorded autobiographical memories for that day or for the previous day. Recording took place either in the morning or in the evening. Following a 30-day retention interval, the diary events were free recalled. We found that participants who recorded their memories in the evening before sleep had the best memory performance. These results suggest that the time of reactivation and recording of recent autobiographical events has a significant effect on the later accessibility of those diary events. We discuss our results in the light of related findings that show a beneficial effect of reduced interference during sleep on memory consolidation and reconsolidation.

  5. Assignment of adverse event indexing terms in randomized clinical trials involving spinal manipulative therapy: an audit of records in MEDLINE and EMBASE databases.

    PubMed

    Gorrell, Lindsay M; Engel, Roger M; Lystad, Reidar P; Brown, Benjamin T

    2017-03-14

    Reporting of adverse events in randomized clinical trials (RCTs) is encouraged by the authors of the Consolidated Standards of Reporting Trials (CONSORT) statement. With robust methodological design and adequate reporting, RCTs have the potential to provide useful evidence on the incidence of adverse events associated with spinal manipulative therapy (SMT). During a previous investigation, it became apparent that comprehensive search strategies combining text words with indexing terms were not sufficiently sensitive for retrieving records that were known to contain reports on adverse events. The aim of this analysis was to compare the proportion of articles containing data on adverse events associated with SMT that were indexed in MEDLINE and/or EMBASE with the proportion of those that included adverse event-related words in their title or abstract. A sample of 140 RCT articles previously identified as containing data on adverse events associated with SMT was used. Articles were checked to determine whether: (1) they had been indexed with relevant terms describing adverse events in the MEDLINE and EMBASE databases; and (2) they mentioned adverse events (or any related terms) in the title or abstract. Of the 140 papers, 91% were MEDLINE records, 85% were EMBASE records, 81% were found in both MEDLINE and EMBASE records, and 4% were in neither database. Only 19% mentioned adverse event-related text words in the title or abstract. There was no significant difference between MEDLINE and EMBASE records in the proportion of available papers (p = 0.078). Of the 113 papers found in both MEDLINE and EMBASE records, only 3% had adverse event-related indexing terms assigned in both databases, while 81% were not assigned an adverse event-related indexing term in either database. While RCTs involving SMT were effectively indexed in the MEDLINE and EMBASE databases, the allocation of adverse event indexing terms failed in both databases.
We recommend the development of standardized definitions and reporting tools for adverse events associated with SMT. Adequate reporting of adverse events associated with SMT will facilitate accurate indexing of these types of manuscripts in the databases.

  6. Characterizing Mega-Earthquake Related Tsunami on Subduction Zones without Large Historical Events

    NASA Astrophysics Data System (ADS)

    Williams, C. R.; Lee, R.; Astill, S.; Farahani, R.; Wilson, P. S.; Mohammed, F.

    2014-12-01

    Due to recent large tsunami events (e.g., Chile 2010 and Japan 2011), the insurance industry is very aware of the importance of managing its exposure to tsunami risk. There are currently few tools available to help establish policies for managing and pricing tsunami risk globally. As a starting point and to help address this issue, Risk Management Solutions Inc. (RMS) is developing a global suite of tsunami inundation footprints. This dataset will include both representations of historical events and a series of M9 scenarios on subduction zones that have not historically generated mega-earthquakes. The latter set is included to address concerns about the completeness of the historical record for mega-earthquakes; this concern stems from the fact that the Tohoku, Japan earthquake was considerably larger than had been observed in the historical record. Characterizing the source and rupture pattern for subduction zones without historical events is a poorly constrained process. In many cases, the subduction zones can be segmented based on changes in the characteristics of the subducting slab or major ridge systems. For this project, the unit sources from the NOAA propagation database are utilized to leverage the basin-wide modeling included in this dataset. The length of the rupture is characterized based on subduction zone segmentation, and the slip per unit source can be determined based on the event magnitude (i.e., M9) and moment balancing. As these events have not occurred historically, there is little to constrain the slip distribution. Sensitivity tests on the potential rupture pattern have been undertaken comparing uniform slip to higher-shallow-slip and tapered-slip models. Subduction zones examined include the Makran Trench, the Lesser Antilles, and the Hikurangi Trench. The ultimate goal is to create a series of tsunami footprints to help insurers understand their exposures at risk to tsunami inundation around the world.
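
    The moment-balancing step mentioned above (fixing average slip from an assumed M9 magnitude and a segmented rupture area) can be illustrated with the standard Hanks-Kanamori moment-magnitude relation. The rigidity and rupture dimensions below are illustrative assumptions, not values from the study.

```python
# Back-of-the-envelope moment balancing for an assumed M9 scenario:
#   Mw = (2/3) * (log10(M0) - 9.1)  =>  M0 = 10 ** (1.5 * Mw + 9.1)  [N*m]
# Average uniform slip then follows from M0 = rigidity * area * slip.
# The rigidity and rupture dimensions are illustrative assumptions.

def moment_from_magnitude(mw: float) -> float:
    """Seismic moment M0 (N*m) from moment magnitude (Hanks & Kanamori)."""
    return 10 ** (1.5 * mw + 9.1)

def average_slip(mw: float, length_km: float, width_km: float,
                 rigidity_pa: float = 4e10) -> float:
    """Uniform slip (m) implied by spreading the moment over the rupture area."""
    area_m2 = (length_km * 1e3) * (width_km * 1e3)
    return moment_from_magnitude(mw) / (rigidity_pa * area_m2)

# Example: an M9 rupture 1000 km long and 100 km wide.
slip = average_slip(9.0, 1000.0, 100.0)
print(round(slip, 1))  # roughly 10 m of average slip
```

    Tapered or shallow-concentrated slip models redistribute this same total moment non-uniformly over the unit sources, which is exactly what the sensitivity tests described above compare.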

  7. Adverse events in British hospitals: preliminary retrospective record review

    PubMed Central

    Vincent, Charles; Neale, Graham; Woloshynowych, Maria

    2001-01-01

    Objectives To examine the feasibility of detecting adverse events through record review in British hospitals and to make preliminary estimates of the incidence and costs of adverse events. Design Retrospective review of 1014 medical and nursing records. Setting Two acute hospitals in the Greater London area. Main outcome measure Number of adverse events. Results 110 (10.8%) patients experienced an adverse event, with an overall rate of adverse events of 11.7% when multiple adverse events were included. About half of these events were judged preventable with ordinary standards of care. A third of adverse events led to moderate or greater disability or death. Conclusions These results suggest that adverse events are a serious source of harm to patients and a large drain on NHS resources. Some are major events; others are frequent, minor events that go unnoticed in routine clinical care but together have massive economic consequences. PMID:11230064

  8. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.
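    The fault-tree logic that the knowledge engineer encodes can be illustrated with a minimal sketch. The gates, basic events, and probabilities below are hypothetical examples, not taken from the LOMA system.

```python
# Minimal fault-tree sketch: gates combine independent basic-event probabilities.
def or_gate(*probs):
    """P(at least one input event occurs), assuming independence."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs):
    """P(all input events occur), assuming independence."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical landfill failure: leachate overflow occurs if (heavy rain AND
# pump failure) OR liner rupture. All probabilities are illustrative only.
p_overflow = or_gate(and_gate(0.30, 0.05), 0.01)
print(f"P(leachate overflow) = {p_overflow:.4f}")
```

    The same tree structure, traversed qualitatively rather than numerically, is what lets an expert system map observed symptoms back to candidate triggering events.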

  9. 76 FR 21791 - Parts and Accessories Necessary for Safe Operation; Exemption Renewal for DriveCam, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ... video event recorders by May 18, 2011. The Agency will evaluate any data submitted and, if adverse... the placement of video event recorders at the top of the windshields on commercial motor vehicles (CMVs). CMVs may continue to use the video event recorders to increase safety through (1) identification...

  10. Spatio-temporal evolution of the 2011 Prague, Oklahoma aftershock sequence revealed using subspace detection and relocation

    USGS Publications Warehouse

    McMahon, Nicole D; Aster, Richard C.; Yeck, William; McNamara, Daniel E.; Benz, Harley M.

    2017-01-01

    The 6 November 2011 Mw 5.7 earthquake near Prague, Oklahoma is the second largest earthquake ever recorded in the state. A Mw 4.8 foreshock and the Mw 5.7 mainshock triggered a prolific aftershock sequence. Utilizing a subspace detection method, we increase by fivefold the number of precisely located events between 4 November and 5 December 2011. We find that while most aftershock energy is released in the crystalline basement, a significant number of the events occur in the overlying Arbuckle Group, indicating that active Meeker-Prague faulting extends into the sedimentary zone of wastewater disposal. Although the number of aftershocks in the Arbuckle Group is large, comprising ~40% of the aftershock catalog, the moment contribution of Arbuckle Group earthquakes is much less than 1% of the total aftershock moment budget. Aftershock locations are sparse in patches that experienced large slip during the mainshock.
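    A subspace detector of the general kind used here can be sketched as follows (a toy illustration, not the authors' implementation): aligned template waveforms define an orthonormal basis via SVD, and the detection statistic is the fraction of sliding-window energy captured by that basis.

```python
import numpy as np

def subspace_basis(templates, dim):
    """Orthonormal basis (n_samples x dim) from aligned template waveforms via SVD."""
    X = np.stack(templates, axis=1)          # columns are template waveforms
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :dim]

def detection_statistic(data, U):
    """Sliding fraction of window energy captured by the subspace (0..1)."""
    n = U.shape[0]
    stats = np.zeros(len(data) - n + 1)
    for i in range(len(stats)):
        w = data[i:i + n]
        e = w @ w
        proj = U.T @ w
        stats[i] = (proj @ proj) / e if e > 0 else 0.0
    return stats

# Toy example: two noisy copies of a wavelet form the design set; a third
# noisy copy hidden in background noise is recovered where the statistic peaks.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
wavelet = np.sin(2 * np.pi * 5 * t) * np.exp(-5 * t)
templates = [wavelet + 0.1 * rng.standard_normal(100) for _ in range(2)]
U = subspace_basis(templates, dim=1)
data = 0.1 * rng.standard_normal(500)
data[200:300] += wavelet
stats = detection_statistic(data, U)
print(int(np.argmax(stats)))  # peaks near sample 200, where the event begins
```

    Because the statistic responds to waveform similarity rather than raw amplitude, detectors like this can pull small aftershocks out of noise, which is how catalogs grow severalfold after reprocessing.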

  11. Increased cortical extracellular adenosine correlates with seizure termination.

    PubMed

    Van Gompel, Jamie J; Bower, Mark R; Worrell, Gregory A; Stead, Matt; Chang, Su-Youne; Goerss, Stephan J; Kim, Inyong; Bennet, Kevin E; Meyer, Fredric B; Marsh, W Richard; Blaha, Charles D; Lee, Kendall H

    2014-02-01

    Seizures are currently defined by their electrographic features. However, neuronal networks are intrinsically dependent on neurotransmitters, of which little is known regarding their periictal dynamics. Evidence supports adenosine as having a prominent role in seizure termination, as its administration can terminate and reduce seizures in animal models. Furthermore, microdialysis studies in humans suggest that adenosine is elevated periictally, but the relationship to the seizure is obscured by its temporal measurement limitations. Because electrochemical techniques can provide vastly superior temporal resolution, we test the hypothesis that extracellular adenosine concentrations rise during seizure termination in an animal model and humans using electrochemistry. White farm swine (n = 45) were used in an acute cortical model of epilepsy, and 10 human epilepsy patients were studied during intraoperative electrocorticography (ECoG). Wireless Instantaneous Neurotransmitter Concentration Sensor (WINCS)-based fast scan cyclic voltammetry (FSCV) and fixed potential amperometry were obtained utilizing an adenosine-specific triangular waveform or biosensors, respectively. Simultaneous ECoG and electrochemistry demonstrated an average adenosine increase of 260% compared to baseline, at 7.5 ± 16.9 s with amperometry (n = 75 events) and 2.6 ± 11.2 s with FSCV (n = 15 events) prior to electrographic seizure termination. In agreement with these animal data, adenosine elevation prior to seizure termination was also seen in a human patient using FSCV. Simultaneous ECoG and electrochemical recording supports the hypothesis that adenosine rises prior to seizure termination, suggesting that adenosine itself may be responsible for seizure termination. Future work using intraoperative WINCS-based FSCV recording may help to elucidate the precise relationship between adenosine and seizure termination. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.

  12. Site term from single-station sigma analysis of S-waves in western Turkey

    NASA Astrophysics Data System (ADS)

    Akyol, Nihal

    2018-05-01

    The main aim of this study is to obtain site terms from single-station sigma analysis and to compare them with the site functions resulting from different techniques. The dataset consists of 1764 records from 322 micro- and moderate-size local earthquakes recorded by 29 stations in western Turkey. Median models were derived from S-wave Fourier amplitude spectra for selected 22 frequencies, by utilizing the MLR procedure which performs the maximum likelihood (ML) estimation of mixed models where the fixed effects are treated as random (R) effects with infinite variance. At this stage, b (geometrical spreading coefficient) and Q (quality factor) values were decomposed, simultaneously. The residuals of the median models were examined by utilizing the single-station sigma analysis to obtain the site terms of 29 stations. Sigma for the median models is about 0.422 log10 units and decreases to about 0.308, when the site terms from the single-station sigma analysis were considered (27% reduction). The event-corrected within-event standard deviations for each frequency are rather stable, in the range 0.19-0.23 log10 units with an average value of 0.20 (± 0.01). The site terms from single-station sigma analysis were compared with the site function estimates from the horizontal-to-vertical-spectral-ratio (HVSR) and generalized inversion (INV) techniques by Akyol et al. (2013) and Kurtulmuş and Akyol (2015), respectively. Consistency was observed between the single-station sigma site terms and the INV site transfer functions. The results imply that the single-station sigma analysis could separate the site terms with respect to the median models.
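    The residual decomposition behind site terms and single-station sigma can be illustrated with synthetic numbers. Both the data and the simple means-based estimator below are illustrative; the study itself uses a maximum-likelihood mixed-model fit.

```python
import numpy as np

# Illustrative decomposition: total residual = between-event term + site term
# + remaining (event- and site-corrected) single-station residual.
records = [  # (event_id, station_id, log10 residual) -- synthetic values
    ("ev1", "stA", 0.30), ("ev1", "stB", -0.10), ("ev2", "stA", 0.25),
    ("ev2", "stB", -0.20), ("ev3", "stA", 0.35), ("ev3", "stB", -0.05),
]
events = {e for e, _, _ in records}
stations = {s for _, s, _ in records}

# Between-event term: mean residual over all stations recording that event.
dBe = {e: np.mean([r for ev, _, r in records if ev == e]) for e in events}
# Site term: mean of event-corrected (within-event) residuals at each station.
dS2S = {s: np.mean([r - dBe[e] for e, st, r in records if st == s])
        for s in stations}
# Event- and site-corrected residuals -> single-station sigma component.
eps = [r - dBe[e] - dS2S[s] for e, s, r in records]
print({s: round(float(v), 3) for s, v in dS2S.items()},
      round(float(np.std(eps)), 3))
```

    Removing the site terms from the residuals is what produces the reported drop in sigma (0.422 to about 0.308 log10 units): systematic station-specific amplification is moved out of the error term and into the model.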

  13. Does increasing the size of bi-weekly samples of records influence results when using the Global Trigger Tool? An observational study of retrospective record reviews of two different sample sizes.

    PubMed

    Mevik, Kjersti; Griffin, Frances A; Hansen, Tonje E; Deilkås, Ellen T; Vonen, Barthold

    2016-04-25

    To investigate the impact of increasing the sample of records reviewed bi-weekly with the Global Trigger Tool method to identify adverse events in hospitalised patients. Retrospective observational study. A Norwegian 524-bed general hospital trust. 1920 medical records selected from 1 January to 31 December 2010. Rate, type and severity of adverse events identified in two different sample sizes of records, selected as 10 and 70 records bi-weekly. In the large sample, 1.45 (95% CI 1.07 to 1.97) times more adverse events per 1000 patient days (39.3 adverse events/1000 patient days) were identified than in the small sample (27.2 adverse events/1000 patient days). Hospital-acquired infections were the most common category of adverse events in both the samples, and the distributions of the other categories of adverse events did not differ significantly between the samples. The distribution of severity level of adverse events did not differ between the samples. The findings suggest that while the distribution of categories and severity are not dependent on the sample size, the rate of adverse events is. Further studies are needed to conclude if the optimal sample size may need to be adjusted based on the hospital size in order to detect a more accurate rate of adverse events. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  14. Environmental policies: Impact on utility operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puettgen, H.B.; Crooke, E.A.; Anderson, J.

    1996-05-01

    The first of many 1996 IEEE Power Engineering Society Winter Meeting keynote events was the Plenary Session on Environmental Policies: Impact on Utility Operations, which was held on January 22, 1996, in Baltimore, Maryland. Environmental policies have wide-ranging effects on the electric power industry and on the electrical engineering profession. Following an overview of world-wide environmental policies and their impact on the industry by Hans Puettgen, PES Public Affairs chair and session moderator, the guest speakers presented perspectives of the US Department of Energy (DOE), electric power utilities, and one particular utility, respectively: Janet Anderson, special assistant to the US Secretary of Energy for environmental policy; Robert Beck, vice president for environmental affairs at the Edison Electric Institute (EEI); and Edward M. Davis, supervisor of the BGE Environmental Performance Assessments Unit. The audience was encouraged to participate in the Plenary Session by submitting questions, which sparked some open panel discussions following the presentations. This article summarizes the presentations, identifies the topics of discussion during the question and answer (Q and A) session, and provides information on how to obtain a copy of the videotape recording of the Plenary Session for use in PES Chapter activities.

  16. Single-shot optical recording with sub-picosecond resolution spans record nanosecond lengths

    DOE PAGES

    Muir, Ryan; Heebner, John

    2018-01-18

    With the advent of electronics, oscilloscopes and photodiodes are now routinely capable of measuring events well below nanosecond resolution. However, these electronic instruments do not currently measure events below 10 ps resolution. From Walden’s observation that there is an engineering tradeoff between electronic bit depth and temporal resolution in analog-to-digital converters, this technique is projected to have extremely poor fidelity if it is extended to record single events with picosecond resolution. While this constraint may be circumvented with extensive signal averaging or other multiple measurements approaches, rare events and nonrepetitive events cannot be observed with this technique. Techniques capable of measuring information in a single shot are often required. There is a general lack of available technologies that are easily scalable to long records with sub-picosecond resolution, and are simultaneously versatile in wavelength of operation. Since it is difficult to scale electronic methods to shorter resolutions, we instead aim to scale optical methods to longer records. Demonstrated optical recording methods that have achieved 1 ps resolution and long recording lengths rely on either time scaling to slow down the temporal information or, like Wien, perform time-to-space mapping so that fast events may be captured with a conventional camera.

  17. Triggering Events and New Daily Persistent Headache: Age and Gender Differences and Insights on Pathogenesis-A Clinic-Based Study.

    PubMed

    Rozen, Todd D

    2016-01-01

    To define the age and gender differences for new daily persistent headache (NDPH) triggering events and how this may relate to the pathogenesis of NDPH. To describe several new triggering events for NDPH. All patients were diagnosed with primary NDPH at a headache specialty clinic during the time period of 01/2009 through 01/2013. This was a retrospective analysis of patient medical records utilizing an electronic medical record system. Ninety-seven patients were diagnosed with primary NDPH (65 women and 32 men). The mean age of onset was younger in women (32.4 years) than in men (35.8 years). Fifty-one of ninety-seven NDPH patients (53%) did not recognize a triggering event, while an infection or flu-like illness triggered NDPH in 22%, a stressful life event in 9%, a procedure (surgical) in 9%, and some "other" recognized trigger in 7%. All of the NDPH patients who developed new onset headache after an invasive surgical procedure were intubated. There was no significant difference in frequency for any of the triggering events between genders. The youngest age of onset was for a stressful life event trigger, while the oldest was in the post-surgical subgroup. Women developed NDPH at a younger age of onset for all recognized triggers, but there was no significant difference in ages of onset between the genders. There was no significant difference in the number of NDPH patients who had a history of migraine or no history and if they developed NDPH after any triggered event vs no triggering event. However, the majority of patients who developed NDPH after a stressful life event did have a precedent migraine history (67%). Newly noted triggers include: hormonal manipulation with progesterone, medication exposure, chemical/pesticide exposure, massage treatment, and immediately following a syncopal event. More than 50% of NDPH sufferers do not recognize a triggering event to their headaches.
A key finding from the present study is that all patients who developed NDPH after an invasive surgical procedure required intubation; we speculate a cervicogenic origin to their headaches. The fact that both genders had an almost equal rate of occurrence for most NDPH triggers, and almost the same age of onset, suggests a common underlying pathogenesis for similar triggering events. A precedent history of migraine did not increase the frequency of triggered vs non-triggered NDPH, except possibly for a stressful life event. © 2015 American Headache Society.

  18. An effective noise-suppression technique for surface microseismic data

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Willis, Mark; Haines, Seth S.; Batzle, Mike; Behura, Jyoti; Davidson, Michael

    2013-01-01

    The presence of strong surface-wave noise in surface microseismic data may decrease the utility of these data. We implement a technique, based on the distinct characteristics that microseismic signal and noise show in the τ‐p domain, to suppress surface-wave noise in microseismic data. Because most microseismic source mechanisms are deviatoric, preprocessing is necessary to correct for the nonuniform radiation pattern prior to transforming the data to the τ‐p domain. We employ a scanning approach, similar to semblance analysis, to test all possible double-couple orientations to determine an estimated orientation that best accounts for the polarity pattern of any microseismic events. We then correct the polarity of the data traces according to this pattern, prior to conducting signal-noise separation in the τ‐p domain. We apply our noise-suppression technique to two surface passive-seismic data sets from different acquisition surveys. The first data set includes a synthetic microseismic event added to field passive noise recorded by an areal receiver array distributed over a Barnett Formation reservoir undergoing hydraulic fracturing. The second data set is field microseismic data recorded by receivers arranged in a star-shaped array, over a Bakken Shale reservoir during a hydraulic-fracturing process. Our technique significantly improves the signal-to-noise ratios of the microseismic events and preserves the waveforms at the individual traces. We illustrate that the enhancement in signal-to-noise ratio also results in improved imaging of the microseismic hypocenter.
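    The τ‐p (slant-stack) transform at the core of the signal-noise separation can be sketched as a nearest-sample linear moveout stack. This is a toy version; the offsets, slowness grid, and spike gather below are illustrative, not the survey geometry.

```python
import numpy as np

def tau_p_transform(traces, offsets, dt, slownesses):
    """Naive linear slant stack: stack traces along t = tau + p * offset.
    traces: (n_traces, n_samples); offsets in m; slownesses in s/m."""
    n_tr, n_t = traces.shape
    out = np.zeros((len(slownesses), n_t))
    for ip, p in enumerate(slownesses):
        for tr, x in zip(traces, offsets):
            shift = int(round(p * x / dt))        # nearest-sample moveout
            if 0 <= shift < n_t:
                out[ip, : n_t - shift] += tr[shift:]
    return out

# Toy gather: a spike moving out at 0.002 s/m focuses at that slowness.
dt, n_t = 0.001, 200
offsets = np.arange(5) * 10.0                     # 0..40 m
traces = np.zeros((5, n_t))
for i, x in enumerate(offsets):
    traces[i, 50 + int(0.002 * x / dt)] = 1.0     # tau = 50 ms, p = 0.002 s/m
stack = tau_p_transform(traces, offsets, dt, [0.0, 0.001, 0.002, 0.003])
print(int(np.argmax(stack.max(axis=1))))          # 2: the 0.002 s/m row focuses
```

    Slow, steeply dipping surface waves and fast, near-vertical microseismic arrivals map to different slowness regions in this domain, which is what allows muting one without distorting the other.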

  19. 78 FR 17750 - Parts and Accessories Necessary for Safe Operation; Exemption Renewal for DriveCam, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-22

    ... of video event recorders at the top of the windshields on commercial motor vehicles (CMVs). Motor carriers may continue to use the video event recorders mounted in the windshield area to increase safety... DriveCam applied for an exemption from 49 CFR 393.60(e)(1) to allow the use of video event recorders on...

  20. Structured vs. Unstructured: Factors Affecting Adverse Drug Reaction Documentation in an EMR Repository

    PubMed Central

    Skentzos, Stephen; Shubina, Maria; Plutzky, Jorge; Turchin, Alexander

    2011-01-01

    Adverse reactions to medications to which the patient was known to be intolerant are common. Electronic decision support can prevent them but only if history of adverse reactions to medications is recorded in structured format. We have conducted a retrospective study of 31,531 patients with adverse reactions to statins documented in the notes, as identified with natural language processing. The software identified statin adverse reactions with sensitivity of 86.5% and precision of 91.9%. Only 9020 of these patients had an adverse reaction to a statin recorded in structured format. In multivariable analysis the strongest predictor of structured documentation was utilization of EMR functionality that integrated the medication list with the structured medication adverse reaction repository (odds ratio 48.6, p < 0.0001). Integration of information flow between EMR modules can help improve documentation and potentially prevent adverse drug events. PMID:22195188

  1. Healthcare Blockchain System Using Smart Contracts for Secure Automated Remote Patient Monitoring.

    PubMed

    Griggs, Kristen N; Ossipova, Olya; Kohlios, Christopher P; Baccarini, Alessandro N; Howson, Emily A; Hayajneh, Thaier

    2018-06-06

    As Internet of Things (IoT) devices and other remote patient monitoring systems increase in popularity, security concerns about the transfer and logging of data transactions arise. In order to handle the protected health information (PHI) generated by these devices, we propose utilizing blockchain-based smart contracts to facilitate secure analysis and management of medical sensors. Using a private blockchain based on the Ethereum protocol, we created a system where the sensors communicate with a smart device that calls smart contracts and writes records of all events on the blockchain. This smart contract system would support real-time patient monitoring and medical interventions by sending notifications to patients and medical professionals, while also maintaining a secure record of who has initiated these activities. This would resolve many security vulnerabilities associated with remote patient monitoring and automate the delivery of notifications to all involved parties in a HIPAA compliant manner.
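    The tamper-evidence property that writing event records to a blockchain provides can be illustrated with a toy hash-chained log in Python. This is a conceptual sketch only, not Ethereum code or the authors' smart-contract system.

```python
import hashlib, json, time

class HashChainLog:
    """Toy append-only log: each record commits to the previous record's hash,
    mimicking the tamper-evidence a blockchain ledger gives monitoring events."""
    def __init__(self):
        self.chain = [{"prev": "0" * 64, "data": "genesis", "ts": 0}]

    def _digest(self, block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append(self, data):
        block = {"prev": self._digest(self.chain[-1]), "data": data, "ts": time.time()}
        self.chain.append(block)

    def verify(self):
        """True iff every record still matches the hash its successor committed to."""
        return all(self.chain[i]["prev"] == self._digest(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

log = HashChainLog()
log.append({"sensor": "hr-01", "bpm": 150, "alert": "tachycardia"})
log.append({"sensor": "hr-01", "bpm": 88, "alert": None})
print(log.verify())                 # True: intact chain
log.chain[1]["data"]["bpm"] = 60    # retroactive edit breaks the chain
print(log.verify())                 # False: tampering is detectable
```

    A real smart-contract deployment adds consensus and access control on top of this basic property, so no single party can rewrite who triggered a notification or when.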

  2. Final report for the endowment of simulator agents with human-like episodic memory LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speed, Ann Elizabeth; Lippitt, Carl Edward; Thomas, Edward Victor

    This report documents work undertaken to endow the cognitive framework currently under development at Sandia National Laboratories with a human-like memory for specific life episodes. Capabilities have been demonstrated within the context of three separate problem areas. The first year of the project developed a capability whereby simulated robots were able to utilize a record of shared experience to perform surveillance of a building to detect a source of smoke. The second year focused on simulations of social interactions providing a queriable record of interactions such that a time series of events could be constructed and reconstructed. The third year addressed tools to promote desktop productivity, creating a capability to query episodic logs in real time allowing the model of a user to build on itself based on observations of the user's behavior.

  3. Utilizing electronic health records to predict acute kidney injury risk and outcomes: workgroup statements from the 15(th) ADQI Consensus Conference.

    PubMed

    Sutherland, Scott M; Chawla, Lakhmir S; Kane-Gill, Sandra L; Hsu, Raymond K; Kramer, Andrew A; Goldstein, Stuart L; Kellum, John A; Ronco, Claudio; Bagshaw, Sean M

    2016-01-01

    The data contained within the electronic health record (EHR) is "big" from the standpoint of volume, velocity, and variety. These circumstances and the pervasive trend towards EHR adoption have sparked interest in applying big data predictive analytic techniques to EHR data. Acute kidney injury (AKI) is a condition well suited to prediction and risk forecasting; not only does the consensus definition for AKI allow temporal anchoring of events, but no treatments exist once AKI develops, underscoring the importance of early identification and prevention. The Acute Dialysis Quality Initiative (ADQI) convened a group of key opinion leaders and stakeholders to consider how best to approach AKI research and care in the "Big Data" era. This manuscript addresses the core elements of AKI risk prediction and outlines potential pathways and processes. We describe AKI prediction targets, feature selection, model development, and data display.
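    The "temporal anchoring of events" afforded by the consensus AKI definition can be sketched with the creatinine arm of the KDIGO criteria. The thresholds are the standard KDIGO ones; the time series and the simple scan below are illustrative, not an ADQI algorithm.

```python
# Values are (hours since admission, serum creatinine in mg/dL).
def detect_aki(series):
    """Return the first hour meeting a KDIGO creatinine criterion:
    rise >= 0.3 mg/dL within 48 h, or >= 1.5x a prior value within 7 days."""
    for i, (t, cr) in enumerate(series):
        for tp, crp in series[:i]:
            if t - tp <= 48 and cr - crp >= 0.3:
                return t
            if t - tp <= 168 and crp > 0 and cr / crp >= 1.5:
                return t
    return None

series = [(0, 0.9), (24, 1.0), (60, 1.5), (72, 2.1)]
print(detect_aki(series))  # 60: 1.5 mg/dL is >= 1.5x the 0.9 baseline within 7 days
```

    Anchoring each AKI event to a timestamp like this is what lets EHR-based models look backwards for predictive features and forwards for outcomes.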

  4. Errors, near misses and adverse events in the emergency department: what can patients tell us?

    PubMed

    Friedman, Steven M; Provan, David; Moore, Shannon; Hanneman, Kate

    2008-09-01

    We sought to determine whether patients or their families could identify adverse events in the emergency department (ED), to characterize patient reports of errors and to compare patient reports to events recorded by health care providers. This was a prospective cohort study in a quaternary care inner city teaching hospital with approximately 40,000 annual visits. ED patients were recruited for participation in a standardized interview within 24 hours of ED discharge and a follow-up interview 3-7 days after discharge. Responses regarding events were tabulated and compared with physician and nurse notations in the medical record and hospital event reporting system. Of 292 eligible patients, 201 (69%) were interviewed within 24 hours of ED discharge, and 143 (71% of interviewees) underwent a follow-up interview 3-7 days after discharge. Interviewees did not differ from the base ED population in terms of age, sex or language. Analysis of patient interviews identified 10 adverse events (5% incident rate; 95% confidence interval [CI] 2.41%-8.96%), 8 near misses (4% incident rate; 95% CI 1.73%-7.69%) and no medical errors. Of the 10 adverse events, 6 (60%) were characterized as preventable (2 raters; kappa=0.78, standard error [SE] 0.20; 95% CI 0.39-1.00; p=0.01). Adverse events were primarily related to delayed or inadequate analgesia. Only 4 out of 8 (50%) near misses were intercepted by hospital personnel. The secondary interview elicited 2 out of 10 adverse events and 3 out of 8 near misses that had not been identified in the primary interview. No designation (0 out of 10) of an adverse event was recorded in the ED medical record or in the confidential hospital event reporting system. ED patients can identify adverse events affecting their care. Moreover, many of these events are not recorded in the medical record. Engaging patients and their family members in identification of errors may enhance patient safety.

  5. An automated approach towards detecting complex behaviours in deep brain oscillations.

    PubMed

    Mace, Michael; Yousif, Nada; Naushahi, Mohammad; Abdullah-Al-Mamun, Khondaker; Wang, Shouyan; Nandi, Dipankar; Vaidyanathan, Ravi

    2014-03-15

    Extracting event-related potentials (ERPs) from neurological rhythms is of fundamental importance in neuroscience research. Standard ERP techniques typically require the associated ERP waveform to have low variance, be shape and latency invariant and require many repeated trials. Additionally, the non-ERP part of the signal needs to be sampled from an uncorrelated Gaussian process. This limits methods of analysis to quantifying simple behaviours and movements only when multi-trial data-sets are available. We introduce a method for automatically detecting events associated with complex or large-scale behaviours, where the ERP need not conform to the aforementioned requirements. The algorithm is based on the calculation of a detection contour and adaptive threshold. These are combined using logical operations to produce a binary signal indicating the presence (or absence) of an event with the associated detection parameters tuned using a multi-objective genetic algorithm. To validate the proposed methodology, deep brain signals were recorded from implanted electrodes in patients with Parkinson's disease as they participated in a large movement-based behavioural paradigm. The experiment involved bilateral recordings of local field potentials from the sub-thalamic nucleus (STN) and pedunculopontine nucleus (PPN) during an orientation task. After tuning, the algorithm is able to extract events achieving training set sensitivities and specificities of [87.5 ± 6.5, 76.7 ± 12.8, 90.0 ± 4.1] and [92.6 ± 6.3, 86.0 ± 9.0, 29.8 ± 12.3] (mean ± 1 std) for the three subjects, averaged across the four neural sites. Furthermore, the methodology has the potential for utility in real-time applications as only a single-trial ERP is required. Copyright © 2013 Elsevier B.V. All rights reserved.
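    The detection-contour-plus-adaptive-threshold idea can be sketched as follows. This is a simplified stand-in (a rectified-and-smoothed contour with a median + k·MAD threshold on synthetic data), not the authors' tuned multi-objective pipeline.

```python
import numpy as np

def detect_events(x, win=25, k=5.0):
    """Detection contour = smoothed rectified signal; adaptive threshold =
    median + k * MAD of the contour; returns a binary event indicator."""
    kernel = np.ones(win) / win
    contour = np.convolve(np.abs(x), kernel, mode="same")
    med = np.median(contour)
    mad = np.median(np.abs(contour - med))
    return (contour > med + k * mad).astype(int)

# Toy LFP: background noise with a single oscillatory burst ("event").
rng = np.random.default_rng(1)
x = 0.2 * rng.standard_normal(1000)
x[400:500] += 2.0 * np.sin(2 * np.pi * 20 * np.arange(100) / 1000)
events = detect_events(x)
print(events[420:480].mean(), events[:300].mean())  # ~1.0 inside, ~0 outside
```

    Because the threshold adapts to the signal's own statistics, a single-trial detection is possible; in the paper the analogous parameters (window, threshold weights) are tuned by a multi-objective genetic algorithm against labelled trials.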

  6. Immediate Adverse Events in Interventional Pain Procedures: A Multi-Institutional Study.

    PubMed

    Carr, Carrie M; Plastaras, Christopher T; Pingree, Matthew J; Smuck, Matthew; Maus, Timothy P; Geske, Jennifer R; El-Yahchouchi, Christine A; McCormick, Zachary L; Kennedy, David J

    2016-12-01

    Interventional procedures directed toward sources of pain in the axial and appendicular musculoskeletal system are performed with increasing frequency. Despite the presence of evidence-based guidelines for such procedures, there are wide variations in practice. Case reports of serious complications such as spinal cord infarction or infection from spine injections lack appropriate context and create a misleading view of the risks of appropriately performed interventional pain procedures. To evaluate adverse event rate for interventional spine procedures performed at three academic interventional spine practices. Quality assurance databases at three academic interventional pain management practices that utilize evidence-based guidelines [1] were interrogated for immediate complications from interventional pain procedures. Review of the electronic medical record verified or refuted the occurrence of a complication. Same-day emergency department transfers or visits were also identified by a records search. Immediate complication data were available for 26,061 consecutive procedures. A radiology practice performed 19,170 epidural steroid (primarily transforaminal), facet, sacroiliac, and trigger point injections (2006-2013). A physiatry practice performed 6,190 spine interventions (2004-2009). A second physiatry practice performed 701 spine procedures (2009-2010). There were no major complications (permanent neurologic deficit or clinically significant bleeding [e.g., epidural hematoma]) with any procedure. Overall complication rate was 1.9% (493/26,061). Vasovagal reactions were the most frequent event (1.1%). Nineteen patients (<0.1%) were transferred to emergency departments for: allergic reactions, chest pain, symptomatic hypertension, and a vasovagal reaction. This study demonstrates that interventional pain procedures are safely performed with extremely low immediate adverse event rates when evidence-based guidelines are observed. 
© 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Smartphone apps for snoring.

    PubMed

    Camacho, M; Robertson, M; Abdullatif, J; Certal, V; Kram, Y A; Ruoff, C M; Brietzke, S E; Capasso, R

    2015-10-01

    To identify and systematically evaluate user-friendly smartphone snoring apps. The Apple iTunes app store was searched for snoring apps that allow recording and playback. Snoring apps were downloaded, evaluated and rated independently by four authors. Two patients underwent polysomnography, and the data were compared with simultaneous snoring app recordings, and one patient used the snoring app at home. Of 126 snoring apps, 13 met the inclusion and exclusion criteria. The most critical app feature was the ability to graphically display the snoring events. The Quit Snoring app received the highest overall rating. When this app's recordings were compared with in-laboratory polysomnography data, app snoring sensitivities ranged from 64 to 96 per cent, and snoring positive predictive values ranged from 93 to 96 per cent. A chronic snorer used the app nightly for one month and tracked medical interventions. Snoring decreased from 200 to 10 snores per hour, and bed partner snoring complaint scores decreased from 9 to 2 (on a 0-10 scale). Select smartphone apps are user-friendly for recording and playing back snoring sounds. Preliminary comparison of more than 1500 individual snores demonstrates the potential clinical utility of such apps; however, further validation testing is recommended.
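    The app-vs-polysomnography comparison metrics are straightforward to compute from tallies of app detections against PSG-confirmed snores. The counts below are hypothetical illustrations, not the study's data.

```python
# Sensitivity and positive predictive value from detection tallies, with
# polysomnography-confirmed snores as the reference standard.
def sensitivity(tp, fn):
    return tp / (tp + fn)          # fraction of true snores the app caught

def ppv(tp, fp):
    return tp / (tp + fp)          # fraction of app detections that were real

tp, fp, fn = 960, 40, 40           # hypothetical single-night tallies
print(f"sensitivity {sensitivity(tp, fn):.0%}, PPV {ppv(tp, fp):.0%}")
```

    With tallies like these, a night yielding 1000 app detections of which 960 match PSG snore events gives both metrics at 96%, at the upper end of the ranges the study reports.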

  8. Diagnostic yield and optimal duration of continuous-loop event monitoring for the diagnosis of palpitations. A cost-effectiveness analysis

    NASA Technical Reports Server (NTRS)

    Zimetbaum, P. J.; Kim, K. Y.; Josephson, M. E.; Goldberger, A. L.; Cohen, D. J.

    1998-01-01

    BACKGROUND: Continuous-loop event recorders are widely used for the evaluation of palpitations, but the optimal duration of monitoring is unknown. OBJECTIVE: To determine the yield, timing, and incremental cost-effectiveness of each week of event monitoring for palpitations. DESIGN: Prospective cohort study. PATIENTS: 105 consecutive outpatients referred for the placement of a continuous-loop event recorder for the evaluation of palpitations. MEASUREMENTS: Diagnostic yield, incremental cost, and cost-effectiveness for each week of monitoring. RESULTS: The diagnostic yield of continuous-loop event recorders was 1.04 diagnoses per patient in week 1, 0.15 diagnoses per patient in week 2, and 0.01 diagnoses per patient in week 3 and beyond. Over time, the cost-effectiveness ratio increased from $98 per new diagnosis in week 1 to $576 per new diagnosis in week 2 and $5832 per new diagnosis in week 3. CONCLUSIONS: In patients referred for evaluation of palpitations, the diagnostic yield of continuous-loop event recording decreases rapidly after 2 weeks of monitoring. A 2-week monitoring period is reasonably cost-effective for most patients and should be the standard period for continuous-loop event recording for the evaluation of palpitations.
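The cost-effectiveness ratios above are incremental: each week's cost divided by the new diagnoses that week yields. A minimal sketch using the abstract's reported diagnostic yields; the weekly per-patient costs are hypothetical placeholders chosen so that week 1 reproduces the reported $98 figure:

```python
# Incremental cost-effectiveness of each additional week of event
# monitoring: cost of that week divided by new diagnoses made in it.
# Weekly costs here are hypothetical, not the study's actual figures.

def icer(weekly_cost: float, new_diagnoses_per_patient: float) -> float:
    """Dollars per additional diagnosis gained by adding this week."""
    return weekly_cost / new_diagnoses_per_patient

# (hypothetical cost, reported diagnostic yield) per week of monitoring:
weeks = [(102.0, 1.04), (86.0, 0.15), (58.0, 0.01)]
for n, (cost, yield_) in enumerate(weeks, start=1):
    print(f"week {n}: ${icer(cost, yield_):,.0f} per new diagnosis")
```

The sharp rise in the ratio after week 2 is what drives the study's conclusion that a 2-week monitoring period is the reasonable standard.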

  9. Hospital staff should use more than one method to detect adverse events and potential adverse events: incident reporting, pharmacist surveillance and local real‐time record review may all have a place

    PubMed Central

    Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles

    2007-01-01

    Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203

  10. Inferring Clinical Workflow Efficiency via Electronic Medical Record Utilization

    PubMed Central

    Chen, You; Xie, Wei; Gunter, Carl A; Liebovitz, David; Mehrotra, Sanjay; Zhang, He; Malin, Bradley

    2015-01-01

    Complexity in clinical workflows can lead to inefficiency in making diagnoses, ineffectiveness of treatment plans and uninformed management of healthcare organizations (HCOs). Traditional strategies to manage workflow complexity are based on measuring the gaps between workflows defined by HCO administrators and the actual processes followed by staff in the clinic. However, existing methods tend to neglect the influences of EMR systems on the utilization of workflows, which could be leveraged to optimize workflows facilitated through the EMR. In this paper, we introduce a framework to infer clinical workflows through the utilization of an EMR and show how such workflows roughly partition into four types according to their efficiency. Our framework infers workflows at several levels of granularity through data mining technologies. We study four months of EMR event logs from a large medical center, including 16,569 inpatient stays, and illustrate that approximately 95% of workflows are efficient and that 80% of patients are on such workflows. At the same time, we show that the remaining 5% of workflows may be inefficient due to a variety of factors, such as complex patients. PMID:26958173

  11. The Future of the Perfusion Record: Automated Data Collection vs. Manual Recording

    PubMed Central

    Ottens, Jane; Baker, Robert A.; Newland, Richard F.; Mazzone, Annette

    2005-01-01

    Abstract: The perfusion record, whether manually recorded or computer generated, is a legal representation of the procedure. The handwritten perfusion record has been the most common method of recording events that occur during cardiopulmonary bypass. This record stands in marked contrast to the integrated data management systems available, which provide continuous collection of data automatically or by means of a few keystrokes. Additionally, an increasing number of monitoring devices are available to assist in the management of patients on bypass. These devices are becoming more complex and provide more data for the perfusionist to monitor and record. Most of the data from these devices can be downloaded automatically into online data management systems, allowing more time for the perfusionist to concentrate on the patient while simultaneously producing a more accurate record. In this prospective report, we compared 17 cases that were recorded using both manual and electronic data collection techniques. The perfusionist in charge of the case recorded the perfusion using the manual technique, while a second perfusionist entered relevant events on the electronic record generated by the Stockert S3 Data Management System/Data Bahn (Munich, Germany). Analysis of the two types of perfusion records showed significant variations in the recorded information. Areas that showed the most inconsistency included measurement of the perfusion pressures, flow, blood temperatures, cardioplegia delivery details, and the recording of events, with the electronic record superior in data integrity. In addition, the limitations of the electronic system were also evident in the lack of electronic gas flow data in our hardware. Our results confirm the importance of accurate methods of recording perfusion events. The use of an automated system provides the opportunity to minimize transcription error and bias. 
This study highlights the limitation of spot recording of perfusion events in the overall record keeping for perfusion management. PMID:16524151

  12. 49 CFR 563.1 - Scope.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... OF TRANSPORTATION EVENT DATA RECORDERS § 563.1 Scope. This part specifies uniform, national requirements for vehicles equipped with event data recorders (EDRs) concerning the collection, storage, and retrievability of onboard motor vehicle crash event data. It also specifies requirements for vehicle...

  13. 49 CFR 563.1 - Scope.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF TRANSPORTATION EVENT DATA RECORDERS § 563.1 Scope. This part specifies uniform, national requirements for vehicles equipped with event data recorders (EDRs) concerning the collection, storage, and retrievability of onboard motor vehicle crash event data. It also specifies requirements for vehicle...

  14. 49 CFR 563.1 - Scope.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... OF TRANSPORTATION EVENT DATA RECORDERS § 563.1 Scope. This part specifies uniform, national requirements for vehicles equipped with event data recorders (EDRs) concerning the collection, storage, and retrievability of onboard motor vehicle crash event data. It also specifies requirements for vehicle...

  15. 49 CFR 563.1 - Scope.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OF TRANSPORTATION EVENT DATA RECORDERS § 563.1 Scope. This part specifies uniform, national requirements for vehicles equipped with event data recorders (EDRs) concerning the collection, storage, and retrievability of onboard motor vehicle crash event data. It also specifies requirements for vehicle...

  16. 49 CFR 563.1 - Scope.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... OF TRANSPORTATION EVENT DATA RECORDERS § 563.1 Scope. This part specifies uniform, national requirements for vehicles equipped with event data recorders (EDRs) concerning the collection, storage, and retrievability of onboard motor vehicle crash event data. It also specifies requirements for vehicle...

  17. Turbidite event history--Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone

    USGS Publications Warehouse

    Goldfinger, Chris; Nelson, C. Hans; Morey, Ann E.; Johnson, Joel E.; Patton, Jason R.; Karabanov, Eugene B.; Gutierrez-Pastor, Julia; Eriksson, Andrew T.; Gracia, Eulalia; Dunhill, Gita; Enkin, Randolph J.; Dallimore, Audrey; Vallier, Tracy; Kayen, Robert; Kayen, Robert

    2012-01-01

    Turbidite systems along the continental margin of Cascadia Basin from Vancouver Island, Canada, to Cape Mendocino, California, United States, have been investigated with swath bathymetry; newly collected and archive piston, gravity, kasten, and box cores; and accelerator mass spectrometry radiocarbon dates. The purpose of this study is to test the applicability of the Holocene turbidite record as a paleoseismic record for the Cascadia subduction zone. The Cascadia Basin is an ideal place to develop a turbidite paleoseismologic method and to record paleoearthquakes because (1) a single subduction-zone fault underlies the Cascadia submarine-canyon systems; (2) multiple tributary canyons and a variety of turbidite systems and sedimentary sources exist to use in tests of synchronous turbidite triggering; (3) the Cascadia trench is completely sediment filled, allowing channel systems to trend seaward across the abyssal plain, rather than merging in the trench; (4) the continental shelf is wide, favoring disconnection of Holocene river systems from their largely Pleistocene canyons; and (5) excellent stratigraphic datums, including the Mazama ash and distinguishable sedimentological and faunal changes near the Pleistocene-Holocene boundary, are present for correlating events and anchoring the temporal framework. Multiple tributaries to Cascadia Channel with 50- to 150-km spacing, and a wide variety of other turbidite systems with different sedimentary sources contain 13 post-Mazama-ash and 19 Holocene turbidites. Likely correlative sequences are found in Cascadia Channel, Juan de Fuca Channel off Washington, and Hydrate Ridge slope basin and Astoria Fan off northern and central Oregon. A probable correlative sequence of turbidites is also found in cores on Rogue Apron off southern Oregon. The Hydrate Ridge and Rogue Apron cores also include 12-22 interspersed thinner turbidite beds respectively. 
We use 14C dates, relative-dating tests at channel confluences, and stratigraphic correlation of turbidites to determine whether turbidites deposited in separate channel systems are correlative - triggered by a common event. In most cases, these tests can separate earthquake-triggered turbidity currents from other possible sources. The 10,000-year turbidite record along the Cascadia margin passes several tests for synchronous triggering and correlates well with the shorter onshore paleoseismic record. The synchroneity of a 10,000-year turbidite-event record for 500 km along the northern half of the Cascadia subduction zone is best explained by paleoseismic triggering by great earthquakes. Similarly, we find a likely synchronous record in southern Cascadia, including correlated additional events along the southern margin. We examine the applicability of other regional triggers, such as storm waves, storm surges, hyperpycnal flows, and teletsunami, specifically for the Cascadia margin. The average age of the oldest turbidite emplacement event in the 10-0-ka series is 9,800±~210 cal yr B.P. and the youngest is 270±~120 cal yr B.P., indistinguishable from the A.D. 1700 (250 cal yr B.P.) Cascadia earthquake. The northern events define a great earthquake recurrence of ~500-530 years. The recurrence times and averages are supported by the thickness of hemipelagic sediment deposited between turbidite beds. The southern Oregon and northern California margins represent at least three segments that include all of the northern ruptures, as well as ~22 thinner turbidites of restricted latitude range that are correlated between multiple sites. At least two northern California sites, Trinidad and Eel Canyon/pools, record additional turbidites, which may be a mix of earthquake and sedimentologically or storm-triggered events, particularly during the early Holocene when a close connection existed between these canyons and associated river systems. 
The combined stratigraphic correlations, hemipelagic analysis, and 14C framework suggest that the Cascadia margin has three rupture modes: (1) 19-20 full-length or nearly full length ruptures; (2) three or four ruptures comprising the southern 50-70 percent of the margin; and (3) 18-20 smaller southern-margin ruptures during the past 10 k.y., with the possibility of additional southern-margin events that are presently uncorrelated. The shorter rupture extents and thinner turbidites of the southern margin correspond well with spatial extents interpreted from the limited onshore paleoseismic record, supporting margin segmentation of southern Cascadia. The sequence of 41 events defines an average recurrence period for the southern Cascadia margin of ~240 years during the past 10 k.y. Time-independent probabilities for segmented ruptures range from 7-12 percent in 50 years for full or nearly full margin ruptures to ~21 percent in 50 years for a southern-margin rupture. Time-dependent probabilities are similar for northern margin events at ~7-12 percent and 37-42 percent in 50 years for the southern margin. Failure analysis suggests that by the year 2060, Cascadia will have exceeded ~27 percent of Holocene recurrence intervals for the northern margin and 85 percent of recurrence intervals for the southern margin. The long earthquake record established in Cascadia allows tests of recurrence models rarely possible elsewhere. Turbidite mass per event along the Cascadia margin reveals a consistent record for many of the Cascadia turbidites. We infer that larger turbidites likely represent larger earthquakes. Mass per event and magnitude estimates also correlate modestly with following time intervals for each event, suggesting that Cascadia full or nearly full margin ruptures weakly support a time-predictable model of recurrence. 
The long paleoseismic record also suggests a pattern of clustered earthquakes that includes four or five cycles of two to five earthquakes during the past 10 k.y., separated by unusually long intervals. We suggest that the pattern of long time intervals and longer ruptures for the northern and central margins may be a function of high sediment supply on the incoming plate, smoothing asperities, and potential barriers. The smaller southern Cascadia segments correspond to thinner incoming sediment sections and potentially greater interaction between lower-plate and upper-plate heterogeneities. The Cascadia Basin turbidite record establishes new paleoseismic techniques utilizing marine turbidite-event stratigraphy during sea-level highstands. These techniques can be applied in other specific settings worldwide, where an extensive fault traverses a continental margin that has several active turbidite systems.
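The "time-independent probabilities" quoted above are consistent with a standard Poisson reading: the chance of at least one rupture in a window is 1 - exp(-t/T) for mean recurrence T. A minimal sketch under that assumption (the abstract does not state the authors' exact model), using the ~500-yr northern and ~240-yr southern recurrence intervals reported:

```python
import math

# Time-independent (Poisson) probability of at least one rupture in a
# 50-year window, given a mean recurrence interval. Illustrative only;
# the study's exact probability model is not specified in the abstract.

def poisson_prob(mean_recurrence_yr: float, window_yr: float = 50.0) -> float:
    """P(>=1 event in window) = 1 - exp(-window / mean recurrence)."""
    return 1.0 - math.exp(-window_yr / mean_recurrence_yr)

print(f"full-margin rupture   (T ~ 500 yr): {poisson_prob(500):.1%}")
print(f"southern-margin rupture (T ~ 240 yr): {poisson_prob(240):.1%}")
```

Both values fall near the reported ranges (7-12 percent and ~21 percent in 50 years), supporting this reading of the recurrence statistics.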

  18. Using Discrete-Event Simulation to Promote Quality Improvement and Efficiency in a Radiation Oncology Treatment Center.

    PubMed

    Famiglietti, Robin M; Norboge, Emily C; Boving, Valentine; Langabeer, James R; Buchholz, Thomas A; Mikhail, Osama

    To meet demand for radiation oncology services and ensure patient-centered, safe care, management in an academic radiation oncology department initiated quality improvement efforts using discrete-event simulation (DES). Although the long-term goal was testing and deploying solutions, the primary aim at the outset was characterizing and validating a computer simulation model of existing operations to identify targets for improvement. The adoption and validation of a DES model of processes and procedures affecting patient flow and satisfaction, employee experience, and efficiency were undertaken in 2012-2013. Multiple sources were tapped for data, including direct observation, equipment logs, timekeeping, and electronic health records. During their treatment visits, patients averaged 50.4 minutes in the treatment center, of which 38% was spent in the treatment room. Patients with appointments between 10 AM and 2 PM experienced the longest delays before entering the treatment room, and those in the clinic in the day's first and last hours, the shortest (<5 minutes). Despite being staffed for 14.5 hours daily, the clinic registered only 20% of patients after 2:30 PM. Utilization of equipment averaged 58%, and utilization of staff, 56%. The DES modeling quantified operations, identifying evidence-based targets for next-phase remediation and providing data to justify initiatives.
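The mechanics of a DES model like the one described can be sketched very compactly: patients arrive, queue for a resource, occupy it for a service time, and the simulator tallies waits and utilization. A minimal single-treatment-room sketch with hypothetical exponential arrival and service rates, illustrating the technique rather than the study's validated model:

```python
import random

# Minimal discrete-event simulation of one treatment room: patients
# arrive, wait if the room is busy, are treated, and leave. All rates
# are hypothetical; this shows DES mechanics, not the clinic's model.

def simulate(n_patients=200, mean_interarrival=12.0, mean_service=10.0, seed=1):
    random.seed(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_interarrival)  # minutes apart
        arrivals.append(t)

    room_free_at = 0.0
    busy_time = total_wait = 0.0
    for arr in arrivals:
        start = max(arr, room_free_at)            # wait if room occupied
        service = random.expovariate(1.0 / mean_service)
        room_free_at = start + service
        busy_time += service
        total_wait += start - arr

    # utilization = fraction of elapsed time the room was in use
    return busy_time / room_free_at, total_wait / n_patients

util, avg_wait = simulate()
print(f"room utilization: {util:.0%}, mean wait: {avg_wait:.1f} min")
```

Validating such a model against observed figures (e.g., the 50.4-minute average visit or 58% equipment utilization) is the calibration step the study describes before testing improvements.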

  19. ENSO-driven nutrient variability recorded by central equatorial Pacific corals

    NASA Astrophysics Data System (ADS)

    LaVigne, M.; Nurhati, I. S.; Cobb, K. M.; McGregor, H. V.; Sinclair, D. J.; Sherrell, R. M.

    2012-12-01

    Recent evidence for shifts in global ocean primary productivity suggests that surface ocean nutrient availability is a key link between global climate and ocean carbon cycling. Time-series records from satellites, in situ buoy sensors, and bottle sampling have documented the impact of the El Niño Southern Oscillation (ENSO) on equatorial Pacific hydrography and broad changes in biogeochemistry since the late 1990s; however, data are sparse before then. Here we use a new paleoceanographic nutrient proxy, coral P/Ca, to explore the impact of ENSO on nutrient availability in the central equatorial Pacific at higher resolution than is available from in situ nutrient data. Corals from Christmas (157°W 2°N) and Fanning (159°W 4°N) Islands recorded a well-documented decrease in equatorial upwelling as a ~40% decrease in P/Ca during the 1997-98 ENSO cycle, validating the application of this proxy to Pacific Porites corals. We compare the biogeochemical shifts observed through the 1997-98 event with two pre-TOGA-TAO ENSO cycles (1982-83 and 1986-87) reconstructed from a longer Christmas Island core. All three corals revealed ~30-40% P/Ca depletions during ENSO warming as a result of decreased regional wind stress, thermocline depth, and equatorial upwelling velocity. However, at the termination of each El Niño event, surface nutrients did not return to pre-ENSO levels until ~4-12 months after SST did, as a result of increased biological drawdown of surface nutrients. These records demonstrate the utility of high-resolution coral nutrient archives for understanding the impact of tropical Pacific climate on the nutrient and carbon cycling of this key region.

  20. Spatiotemporal variability of hydrometeorological extremes and their impacts in the Jihlava region in the 1650-1880 period

    NASA Astrophysics Data System (ADS)

    Dolak, Lukas; Brazdil, Rudolf; Chroma, Katerina; Valasek, Hubert; Reznickova, Ladislava

    2017-04-01

    Different documentary evidence (taxation records, chronicles, insurance reports, etc.) and secondary sources (peer-reviewed papers, historical literature, newspapers) are used to reconstruct hydrometeorological extremes (HMEs) in the former Jihlava region over the 1651-1880 period. The study describes the system of tax alleviation in Moravia, assesses the impacts of HMEs with regard to the physical-geographical characteristics of the area studied, presents previously unutilized documentary evidence (early fire and hail damage insurance claims) and applies new methodological approaches to the analysis of HME impacts. During the period studied, more than 500 HMEs were analysed for the 19 estates (past basic economic units) in the region. A thunderstorm in 1651 in Rančířov (the Jihlava estate), which caused damage to fields and meadows, is the first recorded extreme event. Downpours causing flash floods, and hailstorms, are the most frequently recorded natural disasters. Together with these, floods, droughts, windstorms, blizzards, late frosts and lightning strikes starting fires caused enormous damage as well. The impacts of HMEs are classified into three categories: impacts on agricultural production, impacts on material property, and socio-economic impacts. Natural disasters caused losses of human lives, property, supplies and farming equipment. HMEs damaged fields and meadows, depleted livestock and triggered secondary consequences such as lack of seed and finance, high prices, indebtedness, poverty and deterioration in field fertility. The results are discussed with respect to the uncertainties associated with documentary evidence and its spatiotemporal distribution. The paper shows that archival records in particular, preserved in the Moravian Land Archives in Brno and other district archives, represent a unique source of data contributing to a better understanding of extreme events and their impacts in the past.

  1. Hydrometeorological extremes reconstructed from documentary evidence for the Jihlava region in the 17th-19th centuries

    NASA Astrophysics Data System (ADS)

    Dolak, Lukas; Brazdil, Rudolf; Chroma, Katerina; Valasek, Hubert; Belinova, Monika; Reznickova, Ladislava

    2016-04-01

    Different documentary evidence (taxation records, chronicles, insurance reports, etc.) is used to reconstruct hydrometeorological extremes (HMEs) in the Jihlava region (central part of the present-day Czech Republic) in the 17th-19th centuries. The aims of the study are to describe the system of tax alleviation in Moravia, to present the utilization of early fire and hail damage insurance claims, and to apply new methodological approaches to the analysis of HME impacts. During the period studied, more than 400 HMEs were analysed for the 16 estates (past basic economic units). A late frost on 16 May 1662 on the Nove Mesto na Morave estate, which destroyed entire cereal crops and caused damage in the forests, is the first recorded extreme event. Downpours causing flash floods, and hailstorms, are the most frequently recorded natural disasters. Moreover, floods, droughts, windstorms, blizzards, late frosts and lightning strikes starting fires caused enormous damage as well. The impacts of HMEs are classified into three categories: impacts on agricultural production, impacts on material property, and socio-economic impacts. Natural disasters caused losses of human lives, property, supplies and farming equipment. HMEs damaged fields and meadows, depleted livestock and triggered secondary consequences such as lack of seed and finance, high prices, indebtedness, poverty and deterioration in field fertility. The results are discussed with respect to the uncertainties associated with documentary evidence and its spatiotemporal distribution. Archival records, preserved in the Moravian Land Archives in Brno and other district archives, form a unique source of data contributing to a better understanding of extreme events and their impacts.

  2. Recording Adverse Events Following Joint Arthroplasty: Financial Implications and Validation of an Adverse Event Assessment Form.

    PubMed

    Lee, Matthew J; Mohamed, Khalid M S; Kelly, John C; Galbraith, John G; Street, John; Lenehan, Brian J

    2017-09-01

    In Ireland, funding of joint arthroplasty procedures has moved to a pay-by-results national tariff system. Typically, adverse clinical events are recorded via retrospective chart-abstraction methods by administrative staff. Missed or undocumented events not only affect the quality of patient care but also may unrealistically skew budgetary decisions that impact fiscal viability of the service. Accurate recording confers clinical benefits and financial transparency. The aim of this study was to compare a prospectively implemented adverse events form with the current national retrospective chart-abstraction method in terms of pay-by-results financial implications. An adverse events form adapted from a similar validated model was used to prospectively record complications in 51 patients undergoing total hip or knee arthroplasties. Results were compared with the same cohort using an existing data abstraction method. Both data sets were coded in accordance with current standards for case funding. Overall, 114 events were recorded during the study through prospective charting of adverse events, compared with 15 events documented by customary method (a significant discrepancy). Wound drainage (15.8%) was the most common complication, followed by anemia (7.9%), lower respiratory tract infections (7.9%), and cardiac events (7%). A total of €61,956 ($67,778) in missed funding was calculated as a result. This pilot study demonstrates the ability to improve capture of adverse events through use of a well-designed assessment form. Proper perioperative data handling is a critical aspect of financial subsidies, enabling optimal allocation of funds. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. [Constructing a database that can input record of use and product-specific information].

    PubMed

    Kawai, Satoru; Satoh, Kenichi; Yamamoto, Hideo

    2012-01-01

    In Japan, patients were generally infected with the hepatitis C virus through the administration of specific fibrinogen injections. However, it has been difficult to identify patients who were infected as a result of these injections because of the lack of medical records. Maintaining detailed information is still not common practice at many medical facilities, because manual record keeping is extremely time consuming and subject to human error. For these reasons, the regulator required medical device manufacturers and pharmaceutical companies to attach a bar code called "GS1-128", effective March 28, 2008. Based on this new process, we conceived of constructing a new database whose records can be entered by bar code scanning to ensure data integrity. Upon examining the efficacy of this new data collection process in terms of both time efficiency and data accuracy, "GS1-128" proved to significantly reduce time and record-keeping mistakes. Not only did patients become easily identifiable by a lot number and a serial number when immediate care was required, but "GS1-128" also enhanced the ability to pinpoint manufacturing errors in the event any trouble or side effects are reported. This data can be shared with and utilized by the entire medical industry and will help perfect the products and enhance record keeping. I believe this new process is extremely important.
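A GS1-128 scan yields an element string of application identifiers (AIs) followed by their data; the lot and serial numbers the abstract relies on are carried under the standard AIs 10 and 21. A minimal parser sketch, assuming a typical element string with AIs 01 (GTIN, fixed length), 17 (expiry, fixed length), 10 (lot, variable) and 21 (serial, variable); the sample barcode content is hypothetical:

```python
GS = "\x1d"  # FNC1 group separator as it appears in scanned data

# Fixed-length GS1 application identifiers (AI -> data length);
# variable-length AIs (lot 10, serial 21) end at a GS or end of string.
FIXED = {"01": 14, "17": 6}
VARIABLE = {"10", "21"}

def parse_gs1_128(data: str) -> dict:
    """Parse a scanned GS1-128 element string into AI -> value pairs."""
    out, i = {}, 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED:
            out[ai] = data[i:i + FIXED[ai]]
            i += FIXED[ai]
        elif ai in VARIABLE:
            end = data.find(GS, i)
            end = len(data) if end == -1 else end
            out[ai] = data[i:end]
            i = end + 1
        else:
            raise ValueError(f"unsupported AI {ai!r}")
    return out

# Hypothetical scan: GTIN, expiry date, lot number, serial number.
scan = "0104912345678904" + "17250630" + "10LOT42A" + GS + "21SN0001"
print(parse_gs1_128(scan))
```

Scanning straight into such a parser is what removes the transcription step the abstract identifies as the main source of record-keeping error.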

  4. The reliability of manual reporting of clinical events in an anesthesia information management system (AIMS).

    PubMed

    Simpao, Allan F; Pruitt, Eric Y; Cook-Sather, Scott D; Gurnaney, Harshad G; Rehman, Mohamed A

    2012-12-01

    Manual incident reports significantly under-report adverse clinical events when compared with automated recordings of intraoperative data. Our goal was to determine the reliability of AIMS and CQI reports of adverse clinical events that had been witnessed and recorded by research assistants. The AIMS and CQI records of 995 patients aged 2-12 years were analyzed to determine if anesthesia providers had properly documented the emesis events that were observed and recorded by research assistants who were present in the operating room at the time of induction. Research assistants recorded eight cases of emesis during induction that were confirmed with the attending anesthesiologist at the time of induction. AIMS yielded a sensitivity of 38 % (95 % confidence interval [CI] 8.5-75.5 %), while the sensitivity of CQI reporting was 13 % (95 % CI 0.3-52.7 %). The low sensitivities of the AIMS and CQI reports suggest that user-reported AIMS and CQI data do not reliably include significant clinical events.
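The quoted 38% sensitivity with its wide 8.5-75.5% interval is exactly what an exact (Clopper-Pearson) binomial CI gives for 3 of 8 events captured. A standard-library sketch of that statistic (not the authors' code), inverting the binomial tail by bisection:

```python
from math import comb

# Exact (Clopper-Pearson) 95% confidence interval for a detection
# sensitivity, using only the standard library: the binomial tail
# is inverted by bisection.

def binom_tail_ge(x: int, n: int, p: float) -> float:
    """P(X >= x) for X ~ Binomial(n, p); increasing in p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x, n + 1))

def clopper_pearson(x: int, n: int, alpha: float = 0.05):
    """Exact two-sided (1 - alpha) CI for a proportion x/n."""
    def solve(f):  # bisection for an increasing f with a root in (0, 1)
        lo, hi = 0.0, 1.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if f(mid) < 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    lower = 0.0 if x == 0 else solve(lambda p: binom_tail_ge(x, n, p) - alpha / 2)
    upper = 1.0 if x == n else solve(
        lambda p: binom_tail_ge(x + 1, n, p) - (1 - alpha / 2))
    return lower, upper

# 3 of 8 witnessed emesis events captured in AIMS:
lo, hi = clopper_pearson(3, 8)
print(f"sensitivity 3/8 = {3/8:.0%}, 95% CI {lo:.1%}-{hi:.1%}")
```

With only eight witnessed events, the interval is necessarily wide, which is why the abstract reports the CI alongside the point estimate.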

  5. The impact of MIS-3 climate events at the transition from Neanderthals to modern humans in Europe

    NASA Astrophysics Data System (ADS)

    Staubwasser, M.; Dragusin, V.; Assonov, S.; Ersek, V.; Hoffmann, D.; Veres, D.; Onac, B. P.

    2017-12-01

    We report last glacial stable C and O isotope records from two U-Th dated speleothems from Romania. The southerly record (Ascunsa Cave, South Carpathians), from the Danube region, matches the pacing and relative amplitude of change of the Greenland ice temperature record at 30-50 ka BP, as well as the abundance of coastal winter sea ice in the Black Sea. The northerly record (Tausoare Cave, East Carpathians) in parts shares the pacing of events with the Greenland or the southern Romanian record, but best matches northern Black Sea summer-season temperature change. Heinrich events do not stand out in either record, but the temperature amplitudes of Greenland stadials and Black Sea records are generally reproduced. Based on the similarity with the Black Sea, we interpret the two speleothem records together in terms of seasonal temperature change in central Eastern Europe. A climatic influence on the transition from Neanderthals to modern humans has long been suspected. However, the diachronous and spatially complex archaeological succession across the Middle-Upper Paleolithic (MUPL) in Europe (38-48 ka) is difficult to reconcile with the millennial-scale pacing of northern hemisphere paleoclimate. Two extreme cold events recorded in the speleothems at 44.0-43.3 and 40.7-39.8 ka bracket the dates of the first known appearance of modern humans - the Aurignacian complex - and the disappearance of Neanderthals from most of Europe. These cold events are coeval with Greenland Stadials GS-12 and GS-10. The speleothem records generally match the paleosol/loess succession from central Europe across the MUPL. The combined record suggests that permafrost advance may have made central Europe uninhabitable, at least during winter. The combined paleoclimate and archaeological records suggest that depopulation-repopulation cycles may have occurred during and after each cold event. Repopulation of central Europe geographically favored the modern human Aurignacians from SE Europe.

  6. Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion

    NASA Astrophysics Data System (ADS)

    Witten, B.; Shragge, J. C.

    2016-12-01

    The recent increased focus on small-scale seismicity (Mw < 4) has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations, primarily wastewater disposal, hydraulic fracturing for oil and gas recovery, and geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for an increased focus on small-magnitude seismicity: a greater number of seismometers are being deployed in large-N arrays. The greater number of sensors lowers the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a sparse number of stations. However, with large-N arrays we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.

  7. Full-Waveform Envelope Templates for Low Magnitude Discrimination and Yield Estimation at Local and Regional Distances with Application to the North Korean Nuclear Tests

    NASA Astrophysics Data System (ADS)

    Yoo, S. H.

    2017-12-01

    Monitoring seismologists have successfully used seismic coda for event discrimination and yield estimation for over a decade. In practice, seismologists typically analyze long-duration S-coda signals with high signal-to-noise ratios (SNR) at regional and teleseismic distances, since the single back-scattering model reasonably predicts the decay of the late coda. However, seismic monitoring requirements are shifting towards smaller, locally recorded events that exhibit low SNR and short signal lengths. To be successful at characterizing events recorded at local distances, we must utilize the direct-phase arrivals, as well as the earlier part of the coda, which is dominated by multiple forward scattering. To address this, we have developed a new hybrid method, known as full-waveform envelope template matching, to improve predicted envelope fits over the entire waveform and account for direct-wave and early-coda complexity. We accomplish this by including a multiple forward-scattering approximation in the envelope modeling of the early coda. The new hybrid envelope templates are designed to fit local and regional full waveforms and produce low-variance amplitude estimates, which will improve yield estimation and discrimination between earthquakes and explosions. To demonstrate the new technique, we applied our full-waveform envelope template-matching method to the six known North Korean (DPRK) underground nuclear tests and four aftershock events following the September 2017 test. We successfully discriminated the event types and estimated the yield for all six nuclear tests. We also applied the same technique to the 2015 Tianjin explosions in China, and to another suspected low-yield explosion at the DPRK test site on May 12, 2010. Our results show that the new full-waveform envelope template-matching method significantly improves upon longstanding single-scattering coda prediction techniques. More importantly, the new method allows monitoring seismologists to extend coda-based techniques to lower magnitude thresholds and low-yield local explosions.
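    The single back-scattering model mentioned above is commonly written as an envelope of the form A(t) = A0 · t^(-γ) · exp(-π f t / Qc), which decays smoothly at late lapse times. A minimal sketch, with illustrative parameter values (not taken from this study):

```python
import math

def coda_envelope(t, a0=1.0, gamma=1.0, freq=2.0, qc=200.0):
    """Single back-scattering coda envelope:
    A(t) = A0 * t**-gamma * exp(-pi * f * t / Qc),
    with t the lapse time (s) after the event origin."""
    return a0 * t**-gamma * math.exp(-math.pi * freq * t / qc)

# The predicted envelope decays monotonically with lapse time,
# which fits the late coda well but not the early, forward-scattered part.
early, late = coda_envelope(20.0), coda_envelope(60.0)
```

The hybrid templates described in the abstract augment exactly this late-coda form with a multiple forward-scattering approximation for the direct arrivals and early coda.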

  8. Analysis of real-time vibration data

    USGS Publications Warehouse

    Safak, E.

    2005-01-01

    In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
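    Tracking mean and mean-square values over a running window can be sketched with an exponential forgetting factor, so each new sample updates the statistics in constant time. This is a hypothetical minimal version for illustration, not the paper's implementation:

```python
class RunningStats:
    """Track the mean and mean-square of a real-time sample stream
    using an exponential forgetting factor (0 < alpha <= 1)."""

    def __init__(self, alpha=0.01):
        self.alpha = alpha
        self.mean = 0.0
        self.mean_sq = 0.0

    def update(self, x):
        a = self.alpha
        self.mean = (1 - a) * self.mean + a * x
        self.mean_sq = (1 - a) * self.mean_sq + a * x * x
        return self.mean, self.mean_sq

    @property
    def variance(self):
        # Running variance follows from the two tracked moments.
        return self.mean_sq - self.mean ** 2

# Feed a short stream of ambient-vibration-like samples.
stats = RunningStats(alpha=0.1)
for sample in [0.01, -0.02, 0.015, 0.0, -0.01]:
    stats.update(sample)
```

A sustained shift in the tracked mean-square (i.e., signal power) relative to its long-term baseline is one simple trigger for flagging a possible change in structural characteristics.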

  9. Fine grained event processing on HPCs with the ATLAS Yoda system

    NASA Astrophysics Data System (ADS)

    Calafiura, Paolo; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; Van Gemmeren, Peter; Wenaus, Torre

    2015-12-01

    High performance computing facilities present unique challenges and opportunities for HEP event processing. The massive scale of many HPC systems means that fractionally small utilization can yield large returns in processing throughput. Parallel applications which can dynamically and efficiently fill any scheduling opportunities the resource presents benefit both the facility (maximal utilization) and the (compute-limited) science. The ATLAS Yoda system provides this capability to HEP-like event processing applications by implementing event-level processing in an MPI-based master-client model that integrates seamlessly with the more broadly scoped ATLAS Event Service. Fine-grained, event-level work assignments are intelligently dispatched to parallel workers to sustain full utilization on all cores, with outputs streamed off to destination object stores in near real time with similarly fine granularity, so that processing proceeds at high efficiency until termination. The system offers the efficiency and scheduling flexibility of preemption without requiring the application actually support or employ check-pointing. We will present the new Yoda system, its motivations, architecture, implementation, and applications in ATLAS data processing at several US HPC centers.
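    The master-client dispatch pattern described above can be sketched schematically with a shared work queue: the master enqueues fine-grained event assignments and clients pull work until the queue drains, so no core idles while events remain. This is a plain-Python stand-in for illustration only; the actual Yoda system is MPI-based:

```python
import queue
import threading

def master_worker(events, n_workers=4):
    """Schematic master-client event dispatch: the master fills a queue with
    fine-grained, event-level work; clients pull events until the queue
    drains, keeping all workers busy until termination."""
    work = queue.Queue()
    results = queue.Queue()
    for ev in events:
        work.put(ev)

    def client():
        while True:
            try:
                ev = work.get_nowait()
            except queue.Empty:
                return                    # no more work: client exits
            results.put(ev * ev)          # stand-in for per-event processing
            work.task_done()

    threads = [threading.Thread(target=client) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results.get() for _ in range(results.qsize()))
```

Because each assignment is a single event rather than a long job, a worker that is preempted loses at most one event's worth of work, which is what gives the pattern preemption-like flexibility without check-pointing.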

  10. First evaluation of the utility of GPM precipitation in global flood monitoring

    NASA Astrophysics Data System (ADS)

    Wu, H.; Yan, Y.; Gao, Z.

    2017-12-01

    The Global Flood Monitoring System (GFMS) has been developed and used to provide real-time flood detection and streamflow estimates over the last few years with significant success shown by validation against global flood event data sets and observed streamflow variations (Wu et al., 2014). It has become a tool for various national and international organizations to appraise flood conditions in various areas, including where rainfall and hydrology information is limited. The GFMS has been using the TRMM Multi-satellite Precipitation Analysis (TMPA) as its main rainfall input. Now, with the advent of the Global Precipitation Measurement (GPM) mission there is an opportunity to significantly improve global flood monitoring and forecasting. GPM's Integrated Multi-satellitE Retrievals for GPM (IMERG) multi-satellite product is designed to take advantage of various technical advances in the field and combine that with an efficient processing system producing "early" (4 hrs) and "late" (12 hrs) products for operational use. Specifically, this study is focused on (1) understanding the difference between the new IMERG products and other existing satellite precipitation products, e.g., TMPA, CMORPH, and ground observations; (2) addressing the challenge in the usage of IMERG for flood monitoring through hydrologic models, given that only a short period of precipitation data has been accumulated since the launch of GPM in 2014; and (3) comparing the statistics of flood simulation based on the DRIVE model with IMERG, TMPA, CMORPH, etc. as precipitation inputs respectively. Derivation of a global threshold map is a necessary step to define flood events out of modelling results, which requires a relatively long historical record. A set of sensitivity tests is conducted by adjusting IMERG's light, moderate, and heavy rain to existing precipitation products with long-term records separately, to optimize the strategy of PDF matching. Other aspects are also examined, including higher-latitude events, where GPM precipitation algorithms should also provide improvements. This study provides a first evaluation of the utility of the new IMERG products in flood monitoring through hydrologic modeling at a global scale.
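    The PDF-matching step can be sketched as empirical quantile mapping against a long-record reference product: find where a value falls in the short record's distribution, then read the same quantile off the long record. This is a simplified nearest-rank version for illustration; the study's actual matching strategy may differ:

```python
from bisect import bisect_right

def empirical_quantile(sorted_vals, p):
    """Value at probability p of an empirical distribution (nearest rank)."""
    idx = min(int(p * len(sorted_vals)), len(sorted_vals) - 1)
    return sorted_vals[idx]

def quantile_match(value, short_record, long_record):
    """Map a value from a short satellite record (e.g., IMERG-era) onto the
    distribution of a long-record reference (e.g., a TMPA-like product)."""
    short_sorted = sorted(short_record)
    long_sorted = sorted(long_record)
    # Empirical CDF position of the value within the short record.
    p = bisect_right(short_sorted, value) / len(short_sorted)
    return empirical_quantile(long_sorted, min(p, 1.0))
```

In practice the mapping would be done separately for light, moderate, and heavy rain classes, as the abstract describes, so each intensity range is adjusted against the long-term record independently.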

  11. The INTIMATE event stratigraphy of the last glacial period

    NASA Astrophysics Data System (ADS)

    Olander Rasmussen, Sune; Svensson, Anders

    2015-04-01

    The North Atlantic INTIMATE (INtegration of Ice-core, MArine and TErrestrial records) group has previously recommended an Event Stratigraphy approach for the synchronisation of records of the Last Termination using the Greenland ice core records as the regional stratotypes. A key element of these protocols has been the formal definition of numbered Greenland Stadials (GS) and Greenland Interstadials (GI) within the last glacial period as the Greenland expressions of the characteristic Dansgaard-Oeschger events that represent cold and warm phases of the North Atlantic region, respectively. Using a recent synchronisation of the NGRIP, GRIP, and GISP2 ice cores that allows the parallel analysis of all three records on a common time scale, we here present an extension of the GS/GI stratigraphic template to the entire glacial period. In addition to the well-known sequence of Dansgaard-Oeschger events that were first defined and numbered in the ice core records more than two decades ago, a number of short-lived climatic oscillations have been identified in the three synchronized records. Some of these events have been observed in other studies, but we here propose a consistent scheme for discriminating and naming all the significant climatic events of the last glacial period that are represented in the Greenland ice cores. In addition to presenting the updated event stratigraphy, we make a series of recommendations on how to refer to these periods in a way that promotes unambiguous comparison and correlation between different proxy records, providing a more secure basis for investigating the dynamics and fundamental causes of these climatic perturbations. 
The work presented is a part of a newly published paper in an INTIMATE special issue of Quaternary Science Reviews: Rasmussen et al., 'A stratigraphic framework for abrupt climatic changes during the Last Glacial period based on three synchronized Greenland ice-core records: refining and extending the INTIMATE event stratigraphy', Quaternary Science Reviews, vol. 106, p. 14-24, 2014.

  12. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    PubMed

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical products information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. For these 100 labels, ETHER achieved a precision of 81 percent and a recall of 92 percent. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.
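    The reported figures follow the standard precision/recall definitions over extracted terms. The counts below are hypothetical, chosen only to reproduce percentages of the same order as those reported:

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Hypothetical counts (not the study's actual tallies): 81 correct term
# extractions, 19 spurious ones, and 7 terms missed by the system.
p, r = precision_recall(true_positives=81, false_positives=19, false_negatives=7)
```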

  13. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  14. A Codasyl-Type Schema for Natural Language Medical Records

    PubMed Central

    Sager, N.; Tick, L.; Story, G.; Hirschman, L.

    1980-01-01

    This paper describes a CODASYL (network) database schema for information derived from narrative clinical reports. The goal of this work is to create an automated process that accepts natural language documents as input and maps this information into a database of a type managed by existing database management systems. The schema described here represents the medical events and facts identified through the natural language processing. This processing decomposes each narrative into a set of elementary assertions, represented as MEDFACT records in the database. Each assertion in turn consists of a subject and a predicate classed according to a limited number of medical event types, e.g., signs/symptoms, laboratory tests, etc. The subject and predicate are represented by EVENT records which are owned by the MEDFACT record associated with the assertion. The CODASYL-type network structure was found to be suitable for expressing most of the relations needed to represent the natural language information. However, special mechanisms were developed for storing the time relations between EVENT records and for recording connections (such as causality) between certain MEDFACT records. This schema has been implemented using the UNIVAC DMS-1100 DBMS.
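    The MEDFACT/EVENT ownership structure described above can be sketched as plain data classes, where each assertion record owns its subject and predicate event records. This is a schematic modern analogue of the network schema, not the actual DMS-1100 data definition:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Event:
    """Subject or predicate of an assertion, classed by medical event type."""
    role: str          # "subject" or "predicate"
    event_type: str    # e.g. "sign/symptom", "laboratory test"
    text: str

@dataclass
class MedFact:
    """One elementary assertion decomposed from the narrative report;
    in the CODASYL schema the MEDFACT record owns its EVENT records."""
    events: List[Event] = field(default_factory=list)

# One assertion from a hypothetical narrative: a lab-test statement.
fact = MedFact(events=[
    Event("subject", "laboratory test", "hemoglobin"),
    Event("predicate", "laboratory test", "was low"),
])
```

The time relations between EVENT records and the causality links between MEDFACT records mentioned in the abstract would require additional link structures beyond this simple ownership sketch.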

  15. Are Stressful Life Events (SLEs) Associated with the Utilization of Substance Use Treatment-Related Services?

    PubMed

    Cruz-Feliciano, Miguel A; Ferraro, Aimee; Witt Prehn, Angela

    2017-03-01

    The study described herein explored the association of stressful life events with the utilization of substance use treatment-related services among substance users living in Puerto Rico. A secondary data analysis was conducted using data collected by a research project entitled Puerto Rico Drug Abuse Research Development Program II (PRDARDP II). The study population consisted of 378 individuals from 18 to 35 years of age who were residents of the San Juan metropolitan area and who presented evidence of substance use in the 30 days prior to the interview. The analysis considered demographic data, information on patterns of substance use, substance use treatment history, stressful events, and depression and anxiety symptomatology. As the number of stressful life events increased, substance users were more likely to report having utilized substance use treatment-related services (OR = 1.11, 95% CI [1.06, 1.17], p < 0.001). Relapsing, the inability to afford drugs, and poor working conditions were statistically significant stressful life events associated with the utilization of substance use treatment-related services. Despite the structural limitations associated with access to and with the quality of the services in the substance use treatment-related system of Puerto Rico, findings suggest that stressful life events play a significant role in the utilization of those services. Researchers and clinicians should consider screening for stressful life events in outreach and engagement strategies. At the same time, the assessment of stressful life events should be integrated into the treatment planning stage to support the recovery process of people with substance use disorders.

  16. Adverse events attributed to traditional Korean medical practices: 1999–2010

    PubMed Central

    Shin, Hyeun-Kyoo; Jeong, Soo-Jin; Ernst, Edzard

    2013-01-01

    Abstract Objective To investigate adverse events attributed to traditional medical treatments in the Republic of Korea. Methods Adverse events recorded in the Republic of Korea between 1999 and 2010 – by the Food and Drug Administration, the Consumer Agency or the Association of Traditional Korean Medicine – were reviewed. Records of adverse events attributed to the use of traditional medical practices, including reports of medicinal accidents and consumers’ complaints, were investigated. Findings Overall, 9624 records of adverse events attributed to traditional medical practices – including 522 linked to herbal treatments – were identified. Liver problems were the most frequently reported adverse events. Only eight of the adverse events were recorded by the pharmacovigilance system run by the Food and Drug Administration. Of the 9624 events, 1389 – mostly infections, cases of pneumothorax and burns – were linked to physical therapy (n = 285) or acupuncture/moxibustion (n = 1104). Conclusion In the Republic of Korea, traditional medical practices often appear to have adverse effects, yet almost all of the adverse events attributed to such practices between 1999 and 2010 were missed by the national pharmacovigilance system. The Consumer Agency and the Association of Traditional Korean Medicine should be included in the national pharmacovigilance system. PMID:23940404

  17. Pain Now or Later: An Outgrowth Account of Pain-Minimization

    PubMed Central

    Chen, Shuai; Zhao, Dan; Rao, Li-Lin; Liang, Zhu-Yuan; Li, Shu

    2015-01-01

    The preference for immediate negative events contradicts the minimizing loss principle given that the value of a delayed negative event is discounted by the amount of time it is delayed. However, this preference is understandable if we assume that the value of a future outcome is not restricted to the discounted utility of the outcome per se but is complemented by an anticipated negative utility assigned to an unoffered dimension, which we termed the “outgrowth.” We conducted three studies to establish the existence of the outgrowth and empirically investigated the mechanism underlying the preference for immediate negative outcomes. Study 1 used a content analysis method to examine whether the outgrowth was generated in accompaniment with the delayed negative events. The results revealed that the investigated outgrowth was composed of two elements. The first component is the anticipated negative emotions elicited by the delayed negative event, and the other is the anticipated rumination during the waiting process, in which one cannot stop thinking about the negative event. Study 2 used a follow-up investigation to examine whether people actually experienced the negative emotions they anticipated in a real situation of waiting for a delayed negative event. The results showed that the participants actually experienced a number of negative emotions when waiting for a negative event. Study 3 examined whether the existence of the outgrowth could make the minimizing loss principle work. The results showed that the difference in pain anticipation between the immediate event and the delayed event could significantly predict the timing preference of the negative event. Our findings suggest that people’s preference for experiencing negative events sooner serves to minimize the overall negative utility, which is divided into two parts: the discounted utility of the outcome itself and an anticipated negative utility assigned to the outgrowth. PMID:25747461
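    The outgrowth account's decomposition can be illustrated with toy numbers: delay shrinks the discounted disutility of the outcome itself, but accrues an anticipated negative utility (dread and rumination) over the waiting period, so the delayed event's total can exceed the immediate one's. All parameters below are invented for illustration, not fitted to the study:

```python
def total_negative_utility(outcome_disutility, delay_days,
                           daily_discount=0.99, outgrowth_per_day=0.1):
    """Toy version of the outgrowth decomposition: total negative utility
    = discounted disutility of the outcome itself
    + anticipated negative utility accrued while waiting (the outgrowth)."""
    discounted = outcome_disutility * daily_discount ** delay_days
    outgrowth = outgrowth_per_day * delay_days
    return discounted + outgrowth

# With the outgrowth term, waiting 30 days is worse overall than acting now,
# even though the outcome's own disutility is discounted by the delay.
now = total_negative_utility(10.0, delay_days=0)
later = total_negative_utility(10.0, delay_days=30)
```

Setting the outgrowth term to zero recovers plain discounting, under which delay always looks better; that is exactly the contradiction the paper starts from.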

  18. Frequency of extreme weather events and increased risk of motor vehicle collision in Maryland.

    PubMed

    Liu, Ann; Soneja, Sutyajeet I; Jiang, Chengsheng; Huang, Chanjuan; Kerns, Timothy; Beck, Kenneth; Mitchell, Clifford; Sapkota, Amir

    2017-02-15

    Previous studies have shown increased precipitation to be associated with higher frequency of traffic collisions. However, data regarding how extreme weather events, projected to grow in frequency, intensity, and duration in response to a changing climate, might affect the risk of motor vehicle collisions are particularly limited. We investigated the association between frequency of extreme heat and precipitation events and risk of motor vehicle collision in Maryland between 2000 and 2012. Motor vehicle collision data were obtained from the Maryland Automated Accident Reporting System. Each observation in the data set corresponded to a unique collision event. These data were linked to extreme heat and precipitation events that were calculated using location and calendar day specific thresholds. A time-stratified case-crossover analysis was utilized to assess the association between exposure to extreme heat and precipitation events and risk of motor vehicle collision. Additional stratified analyses examined risk by road condition, season, and collisions involving only one vehicle. Overall, there were over 1.28 million motor vehicle collisions recorded in Maryland between 2000 and 2012, of which 461,009 involved injuries or death. There was a 23% increase in risk of collision for every 1-day increase in extreme precipitation event (Odds Ratio (OR): 1.23, 95% Confidence Interval (CI): 1.22, 1.27). This risk was considerably higher for collisions on roads with a defect or obstruction (OR: 1.46, 95% CI: 1.40, 1.52) and those involving a single vehicle (OR: 1.41, 95% CI: 1.39, 1.43). Change in risk associated with extreme heat events was marginal at best. Extreme precipitation events are associated with an increased risk of motor vehicle collisions in Maryland. Copyright © 2016 Elsevier B.V. All rights reserved.
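    The odds ratios above come from a time-stratified case-crossover design fitted with conditional logistic regression, but the basic odds-ratio and Wald confidence-interval arithmetic for a 2x2 table looks like the following. The counts here are hypothetical, purely to show the calculation:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: collisions on extreme-precipitation days vs.
# matched control days (not the study's data).
or_, lo, hi = odds_ratio_ci(a=40, b=20, c=25, d=25)
```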

  19. Classification and evaluation of the documentary-recorded storm events in the Annals of the Choson Dynasty (1392-1910), Korea

    NASA Astrophysics Data System (ADS)

    Yoo, Chulsang; Park, Minkyu; Kim, Hyeon Jun; Choi, Juhee; Sin, Jiye; Jun, Changhyun

    2015-01-01

    In this study, the analysis of documentary records on the storm events in the Annals of the Choson Dynasty, covering the entire period of 519 years from 1392 to 1910, was carried out. By applying various key words related to storm events, a total of 556 documentary records could be identified. The main objective of this study was to develop rules of classification for the documentary records on the storm events in the Annals of the Choson Dynasty. The results were also compared with the rainfall data of the traditional Korean rain gauge, named Chukwooki, which are available from 1777 to 1910 (about 130 years). The analysis is organized as follows. First, the frequency of the documents, their length, comments about the size of the inundated area, the number of casualties, the number of property losses, and the size of the countermeasures, etc. were considered to determine the magnitude of the events. To this end, rules of classification of the storm events are developed. Cases in which the word 'disaster' was used along with detailed information about the casualties and property damages, were classified as high-level storm events. The high-level storm events were additionally sub-categorized into catastrophic, extreme, and severe events. Second, by applying the developed rules of classification, a total of 326 events were identified as high-level storm events during the 519 years of the Choson Dynasty. Among these high-level storm events, only 19 events were then classified as the catastrophic ones, 106 events as the extreme ones, and 201 events as the severe ones. The mean return period of these storm events was found to be about 30 years for the catastrophic events, 5 years for the extreme events, and 2-3 years for the severe events. 
    Third, the classification results were verified against the records of the traditional Korean rain gauge; the catastrophic events were found to be clearly distinguished from the other events, with a mean total rainfall of 439.8 mm and a mean storm duration of 49.3 h. The return period of these catastrophic events was also estimated to be in the range of 100-500 years.
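    The quoted mean return periods follow directly from dividing the 519-year span of the Annals by the number of events in each class:

```python
def mean_return_period(record_years, event_count):
    """Mean return period = record length / number of events in the class."""
    return record_years / event_count

# Event counts from the 519-year record of the Annals of the Choson Dynasty.
catastrophic = mean_return_period(519, 19)    # ~27 yr, i.e. about 30 years
extreme = mean_return_period(519, 106)        # ~4.9 yr, about 5 years
severe = mean_return_period(519, 201)         # ~2.6 yr, about 2-3 years
```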

  20. Data quality of seismic records from the Tohoku, Japan earthquake as recorded across the Albuquerque Seismological Laboratory networks

    USGS Publications Warehouse

    Ringler, A.T.; Gee, L.S.; Marshall, B.; Hutt, C.R.; Storm, T.

    2012-01-01

    Great earthquakes recorded across modern digital seismographic networks, such as the recent Tohoku, Japan, earthquake on 11 March 2011 (Mw = 9.0), provide unique datasets that ultimately lead to a better understanding of the Earth's structure (e.g., Pesicek et al. 2008) and earthquake sources (e.g., Ammon et al. 2011). For network operators, such events provide the opportunity to look at the performance across their entire network using a single event, as the ground motion records from the event will be well above every station's noise floor.

  1. Delineating Concealed Faults within Cogdell Oil Field via Earthquake Detection

    NASA Astrophysics Data System (ADS)

    Aiken, C.; Walter, J. I.; Brudzinski, M.; Skoumal, R.; Savvaidis, A.; Frohlich, C.; Borgfeldt, T.; Dotray, P.

    2016-12-01

    Cogdell oil field, located within the Permian Basin of western Texas, has experienced several earthquakes ranging from magnitude 1.7 to 4.6, most of which were recorded since 2006. Using the Earthscope USArray, Gan and Frohlich [2013] relocated some of these events and found a positive correlation in the timing of increased earthquake activity and increased CO2 injection volume. However, focal depths of these earthquakes are unknown due to 70 km station spacing of the USArray. Accurate focal depths as well as new detections can delineate subsurface faults and establish whether earthquakes are occurring in the shallow sediments or in the deeper basement. To delineate subsurface fault(s) in this region, we first detect earthquakes not currently listed in the USGS catalog by applying continuous waveform-template matching algorithms to multiple seismic data sets. We utilize seismic data spanning the time frame of 2006 to 2016 - which includes data from the U.S. Geological Survey Global Seismographic Network, the USArray, and the Sweetwater, TX broadband and nodal array located 20-40 km away. The catalog of earthquakes enhanced by template matching reveals events that were well recorded by the large-N Sweetwater array, so we are experimenting with strategies for optimizing template matching using different configurations of many stations. Since earthquake activity in the Cogdell oil field is on-going (a magnitude 2.6 occurred on May 29, 2016), a temporary deployment of TexNet seismometers has been planned for the immediate vicinity of Cogdell oil field in August 2016. Results on focal depths and detection of small magnitude events are pending this small local network deployment.
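    Waveform template matching of the kind applied here rests on normalized cross-correlation: a known event waveform is slid along the continuous trace, and windows correlating above a threshold are flagged as candidate detections. A minimal single-channel sketch; the threshold and toy data are illustrative, and production detectors stack correlations across many stations:

```python
import math

def normalized_cc(template, window):
    """Zero-lag normalized cross-correlation between a template and an
    equal-length data window (result lies in [-1, 1])."""
    n = len(template)
    mt = sum(template) / n
    mw = sum(window) / n
    num = sum((t - mt) * (w - mw) for t, w in zip(template, window))
    den = math.sqrt(sum((t - mt) ** 2 for t in template) *
                    sum((w - mw) ** 2 for w in window))
    return num / den if den else 0.0

def scan(template, trace, threshold=0.8):
    """Slide the template along a continuous trace; return sample offsets
    where the correlation exceeds the detection threshold."""
    n = len(template)
    return [i for i in range(len(trace) - n + 1)
            if normalized_cc(template, trace[i:i + n]) >= threshold]
```

Because the statistic is normalized, a small repeat of a cataloged event correlates strongly even when its absolute amplitude is near the noise floor, which is why template matching enhances catalogs of small-magnitude events.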

  2. Calculations of the FLAX events with comparisons to particle velocity data recorded at low stress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rambo, J.

    1993-09-01

    The FLAX event, fired in 1972, produced two particle velocity data sets from two devices in the same hole, U2dj. The data are of interest because they contain verification of focusing of a shock wave above the water table. The FLAX data show the peak velocity attenuation from the device buried in saturated tuff are different from those emanating from the upper device buried in porous alluvium. The attenuations of the peaks are different in regions traversed by both waves traveling at the same sound speed and measured by the same particle velocity gages. The attenuation rate from the lower device is due to 2-D effects attributed to wave focusing above the water table and is a feature that should be captured by 2-D calculations. LLNL's KDYNA calculations used for containment analyses have utilized a material model initially developed by Butkovich, which estimates strength and compressibility based on gas porosity, total porosity, and water content determined from geophysical measurements. Unfortunately, the material model estimates do not correctly model the more important details of strength and compressibility used for matching the velocity data. The velocity gage data contain information that can be related to the strength properties of the medium, provided that there are more than two gages recording in the stress region of plastic deformation of the material. A modification to Butkovich's model incorporated approximate strengths derived from the data. The mechanisms of focusing will be discussed and will incorporate additional information from the TYBO event.

  3. Velocity and Attenuation Structure of the Tibetan Lithosphere using Seismic Attributes of P-waves from Regional Earthquakes Recorded by the Hi-CLIMB Array

    NASA Astrophysics Data System (ADS)

    Nowack, R. L.; Bakir, A. C.; Griffin, J.; Chen, W.; Tseng, T.

    2010-12-01

    Using data from regional earthquakes recorded by the Hi-CLIMB array in Tibet, we utilize seismic attributes from crustal and Pn arrivals to constrain the velocity and attenuation structure in the crust and the upper mantle in central and western Tibet. The seismic attributes considered include arrival times, Hilbert envelope amplitudes, and instantaneous as well as spectral frequencies. We have constructed more than 30 high-quality regional seismic profiles, and of these, 10 events have been selected with excellent crustal and Pn arrivals for further analysis. Travel-times recorded by the Hi-CLIMB array are used to estimate the large-scale velocity structure in the region, with four near regional events to the array used to constrain the crustal structure. The travel times from the far regional events indicate that the Moho beneath the southern Lhasa terrane is up to 75 km thick, with Pn velocities greater than 8 km/s. In contrast, the data sampling the Qiangtang terrane north of the Bangong-Nujiang (BNS) suture shows thinner crust with Pn velocities less than 8 km/s. Seismic amplitude and frequency attributes have been extracted from the crustal and Pn wave trains, and these data are compared with numerical results for models with upper-mantle velocity gradients and attenuation, which can strongly affect Pn amplitudes and pulse frequencies. The numerical modeling is performed using the complete spectral element method (SEM), where the results from the SEM method are in good agreement with analytical and reflectivity results for different models with upper-mantle velocity gradients. The results for the attenuation modeling in Tibet imply lower upper mantle Q values in the Qiangtang terrane to the north of the BNS compared to the less attenuative upper mantle beneath the Lhasa terrane to the south of the BNS.

  4. Event attribution: Human influence on the record-breaking cold event in January of 2016 in Eastern China

    NASA Astrophysics Data System (ADS)

    Qian, C.; Wang, J.; Dong, S.; Yin, H.; Burke, C.; Ciavarella, A.; Dong, B.; Freychet, N.; Lott, F. C.; Tett, S. F.

    2017-12-01

    It is controversial whether Asian mid-latitude cold surges are becoming more likely as a consequence of Arctic warming. Here, we present an event attribution study in mid-latitude Eastern China. A strong cold surge occurred during 21st-25th January 2016 affecting most areas of China, especially Eastern China. Daily minimum temperature (Tmin) records were broken at many stations. The area averaged anomaly of Tmin over the region (20-44°N, 100-124°E) for this pentad was the lowest temperature recorded since modern meteorological observations started in 1960. This cold event occurred in a background of the warmest winter Tmin since 1960. Given the extensive damage caused by this extreme cold event in Eastern China and the previously mentioned controversy, it is compelling to investigate how much anthropogenic forcing agents have affected the probability of cold events with an intensity equal to or larger than the January 2016 extreme event. We use the Met Office Hadley Centre system for Attribution of extreme weather and Climate Events and station observations to investigate the effect of anthropogenic forcings on the likelihood of such a cold event. Anthropogenic influences are estimated to have reduced the likelihood of an extreme cold event in mid-winter with the intensity equal to or stronger than the record of 2016 in Eastern China by about 2/3.
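    Event attribution of this kind compares the probability of exceeding the observed intensity in model ensembles with and without anthropogenic forcings. The headline number reduces to a probability ratio; the ensemble counts below are hypothetical, chosen only to illustrate a ratio of about 1/3 ("reduced by about 2/3"):

```python
def probability_ratio(n_exceed_actual, n_total_actual,
                      n_exceed_natural, n_total_natural):
    """Probability ratio PR = P1 / P0 between ensembles with (P1) and
    without (P0) anthropogenic forcings; PR < 1 means the forcings made
    the event less likely."""
    p1 = n_exceed_actual / n_total_actual
    p0 = n_exceed_natural / n_total_natural
    return p1 / p0

# Hypothetical counts: 10 of 1000 all-forcings members vs. 30 of 1000
# natural-only members reach the January 2016 intensity.
pr = probability_ratio(10, 1000, 30, 1000)
```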

  5. Using EPA Tools and Data Services to Inform Changes to Design Storm Definitions for Wastewater Utilities based on Climate Model Projections

    NASA Astrophysics Data System (ADS)

    Tryby, M.; Fries, J. S.; Baranowski, C.

    2014-12-01

    Extreme precipitation events can cause significant impacts to drinking water and wastewater utilities, including facility damage, water quality impacts, service interruptions and potential risks to human health and the environment due to localized flooding and combined sewer overflows (CSOs). These impacts will become more pronounced with the projected increases in frequency and intensity of extreme precipitation events due to climate change. To model the impacts of extreme precipitation events, wastewater utilities often develop Intensity, Duration, and Frequency (IDF) rainfall curves and "design storms" for use in the U.S. Environmental Protection Agency's (EPA) Storm Water Management Model (SWMM). Wastewater utilities use SWMM for planning, analysis, and facility design related to stormwater runoff, combined and sanitary sewers, and other drainage systems in urban and non-urban areas. SWMM tracks (1) the quantity and quality of runoff generated within each sub-catchment; and (2) the flow rate, flow depth, and quality of water in each pipe and channel during a simulation period made up of multiple time steps. In its current form, EPA SWMM does not consider climate change projection data. Climate change may affect the relationship between intensity, duration, and frequency described by past rainfall events. Therefore, EPA is integrating climate projection data available in the Climate Resilience Evaluation and Awareness Tool (CREAT) into SWMM. CREAT is a climate risk assessment tool for utilities that provides downscaled climate change projection data for changes in the amount of rainfall in a 24-hour period for various extreme precipitation events (e.g., from 5-year to 100-year storm events). Incorporating climate change projections into SWMM will provide wastewater utilities with more comprehensive data they can use in planning for future storm events, thereby reducing the impacts of flooding and stormwater issues on the utility and the customers it serves.
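
    SWMM itself takes the design storm as input; one common way to turn an IDF relationship into a design-storm hyetograph is the alternating block method. The sketch below assumes a simple power-law IDF curve with invented coefficients (it is not CREAT output or SWMM code): cumulative depths are computed for increasing durations, differenced into blocks, and the blocks rearranged so the peak falls mid-storm.

```python
# Sketch of the alternating block method; IDF coefficients are invented.

def idf_intensity(duration_min, a=2500.0, b=10.0, c=0.8):
    """Rainfall intensity (mm/h) for a given duration (min); assumed curve i = a/(t+b)^c."""
    return a / (duration_min + b) ** c

def alternating_block_hyetograph(total_min=60, dt_min=5):
    """Per-interval rainfall depths (mm) with the largest block near mid-storm."""
    steps = total_min // dt_min
    # Cumulative depth for each duration: intensity(t) * t, with t converted to hours.
    cum = [idf_intensity(k * dt_min) * (k * dt_min) / 60.0
           for k in range(1, steps + 1)]
    # Incremental (block) depths between successive durations.
    blocks = [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, steps)]
    # Alternate the sorted blocks so depths rise to a mid-storm peak, then fall.
    ordered = sorted(blocks)
    return ordered[::2] + ordered[1::2][::-1]

hyeto = alternating_block_hyetograph()
# sum(hyeto) equals the 60-min storm depth; the largest block sits mid-storm.
```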

  6. Diagnosing psychogenic nonepileptic seizures: Video-EEG monitoring, suggestive seizure induction and diagnostic certainty.

    PubMed

    Popkirov, Stoyan; Jungilligens, Johannes; Grönheit, Wenke; Wellmer, Jörg

    2017-08-01

    Psychogenic nonepileptic seizures (PNES) can remain undiagnosed for many years, leading to unnecessary medication and delayed treatment. A recent report by the International League Against Epilepsy Nonepileptic Seizures Task Force recommends a staged approach to the diagnosis of PNES (LaFrance et al., 2013). We aimed to investigate its practical utility, and to apply the proposed classification to evaluate the role of long-term video-EEG monitoring (VEEG) and suggestive seizure induction (SSI) in PNES workup. Using electronic medical records, 122 inpatients (mean age 36.0 ± 12.9 years; 68% women) who received the diagnosis of PNES at our epilepsy center during a 4.3-year period were included. There was an 82.8% agreement between diagnostic certainty documented at discharge and that assigned retroactively using the Task Force recommendations. In a minority of cases, having used the Task Force criteria could have encouraged the clinicians to give more certain diagnoses, exemplifying the Task Force report's utility. Both VEEG and SSI were effective at supporting high-level diagnostic certainty. Interestingly, about one in four patients (26.2%) had a non-diagnostic ("negative") VEEG but a positive SSI. This subgroup did not have significantly shorter VEEG recording times than VEEG-positive patients. However, VEEG-negative/SSI-positive patients had a significantly lower habitual seizure frequency than their counterparts. This finding emphasizes the utility of SSI in ascertaining the diagnosis of PNES in patients who do not have a spontaneous habitual event during VEEG due to, for example, low seizure frequency. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Patient-reported adverse events after hernia surgery and socio-economic status: A register-based cohort study.

    PubMed

    Wefer, Agnes; Gunnarsson, Ulf; Fränneby, Ulf; Sandblom, Gabriel

    2016-11-01

    The aim of the present study was to assess how socio-economic background influences perception of an adverse postoperative event after hernia surgery, and to see if this affects the pattern of seeking healthcare advice during the early postoperative period. All patients aged 15 years or older with a primary unilateral inguinal or femoral hernia repair recorded in the Swedish Hernia Register (SHR) between November 1 and December 31, 2002 were sent a questionnaire inquiring about adverse events. Data on civil status, income, level of education and ethnic background were obtained from Statistics Sweden. Of the 1643 patients contacted, 1440 (87.6%) responded: 1333 (92.6%) were men and 107 (7.4%) women; mean age was 59 years. There were 203 (12.4%) non-responders. Adverse events were reported in the questionnaire by 390 (27.1%) patients. Patients born in Sweden and patients with high income levels reported a significantly higher incidence of perceived adverse events (p < 0.05). Patients born in Sweden and females reported more events requiring healthcare contact. There was no association between registered or self-reported outcome and civil status or level of education. We detected inequalities related to income level, gender and ethnic background. Even if healthcare utilization is influenced by socio-economic background, careful information about what to expect in the postoperative period and how adverse events should be managed could lead to reduced disparity and improved quality of care in the community at large. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.

  8. Using the Statecharts paradigm for simulation of patient flow in surgical care.

    PubMed

    Sobolev, Boris; Harel, David; Vasilakis, Christos; Levy, Adrian

    2008-03-01

    Computer simulation of patient flow has been used extensively to assess the impacts of changes in the management of surgical care. However, little research is available on the utility of existing modeling techniques. The purpose of this paper is to examine the capacity of Statecharts, a system of graphical specification, for constructing a discrete-event simulation model of the perioperative process. The Statecharts specification paradigm was originally developed for representing reactive systems by extending the formalism of finite-state machines through notions of hierarchy, parallelism, and event broadcasting. Hierarchy permits subordination between states so that one state may contain other states. Parallelism permits more than one state to be active at any given time. Broadcasting of events allows one state to detect changes in another state. In the context of the perioperative process, hierarchy provides the means to describe steps within activities and to cluster related activities, parallelism provides the means to specify concurrent activities, and event broadcasting provides the means to trigger a series of actions in one activity according to transitions that occur in another activity. Combined with hierarchy and parallelism, event broadcasting offers a convenient way to describe the interaction of concurrent activities. We applied the Statecharts formalism to describe the progress of individual patients through surgical care as a series of asynchronous updates in patient records generated in reaction to events produced by parallel finite-state machines representing concurrent clinical and managerial activities. We conclude that the Statecharts formalism successfully captures the behavioral aspects of surgical care delivery by specifying the permissible chronology of events, conditions, and actions.
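
    As an illustration (not the authors' implementation; all class, state, and event names are invented), parallelism and event broadcasting can be sketched as two concurrently active finite-state machines, where a transition in one machine emits an event that the other reacts to:

```python
# Invented sketch of two Statecharts ideas: parallel regions and broadcasting.

class Region:
    """One finite-state machine; a transition may broadcast an event to siblings."""
    def __init__(self, name, transitions, initial):
        self.name, self.transitions, self.state = name, transitions, initial

    def handle(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state, broadcast = self.transitions[key]
            return broadcast          # event to broadcast, or None
        return None

class Statechart:
    """Parallel composition: all regions are active at once."""
    def __init__(self, regions):
        self.regions = regions

    def dispatch(self, event):
        queue = [event]
        while queue:                  # broadcasting: reactions can chain
            ev = queue.pop(0)
            for r in self.regions:
                out = r.handle(ev)
                if out:
                    queue.append(out)

# Two concurrent activities in a perioperative process (invented states):
clinical = Region("clinical",
                  {("waiting", "surgery_done"): ("recovery", "update_record")},
                  "waiting")
managerial = Region("managerial",
                    {("open", "update_record"): ("closed", None)},
                    "open")
chart = Statechart([clinical, managerial])
chart.dispatch("surgery_done")
# clinical moves to "recovery"; the broadcast "update_record" event moves
# managerial to "closed" without being sent to it directly.
```

    Hierarchy (nested states) is omitted for brevity; in a fuller sketch each region's states could themselves be statecharts.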

  9. 49 CFR 229.27 - Annual tests.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DMU locomotive or an MU locomotive, equipped with a microprocessor-based event recorder that includes...) A microprocessor-based event recorder with a self-monitoring feature equipped to verify that all...

  10. Long-Term Memory: A Natural Mechanism for the Clustering of Extreme Events and Anomalous Residual Times in Climate Records

    NASA Astrophysics Data System (ADS)

    Bunde, Armin; Eichner, Jan F.; Kantelhardt, Jan W.; Havlin, Shlomo

    2005-01-01

    We study the statistics of the return intervals between extreme events above a certain threshold in long-term persistent records. We find that the long-term memory leads (i) to a stretched exponential distribution of the return intervals, (ii) to a pronounced clustering of extreme events, and (iii) to an anomalous behavior of the mean residual time to the next event that depends on the history and increases with the elapsed time in a counterintuitive way. We present an analytical scaling approach and demonstrate that all these features can be seen in long climate records. The phenomena should also occur in heartbeat records, Internet traffic, and stock market volatility and have to be taken into account for an efficient risk evaluation.
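
    The setup can be illustrated numerically. The sketch below synthesizes a long-term persistent record by Fourier filtering (a standard method for power-law correlated noise, not the paper's climate data), defines extreme events by a 95th-percentile threshold, and computes the return intervals between successive exceedances; their broad distribution, with many short intervals balanced by a few very long ones, reflects the clustering described above.

```python
import numpy as np

# Synthesize a long-term persistent record and measure return intervals.
# Parameters (length, beta, threshold quantile) are illustrative choices.
rng = np.random.default_rng(0)
N, beta = 2 ** 16, 0.8              # beta > 0 gives long-term persistence

# Fourier filtering: shape complex white noise to a 1/f^beta power spectrum.
freqs = np.fft.rfftfreq(N)
spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
spectrum[1:] *= freqs[1:] ** (-beta / 2.0)
spectrum[0] = 0.0                   # zero mean
x = np.fft.irfft(spectrum, n=N)
x = (x - x.mean()) / x.std()

# Extreme events: values above the 95th percentile; return intervals between them.
q = np.quantile(x, 0.95)
events = np.flatnonzero(x > q)
intervals = np.diff(events)

# The mean interval is fixed by the threshold (roughly 1/0.05 = 20 steps),
# but in a persistent record the intervals are broadly distributed.
print(intervals.mean(), np.median(intervals), intervals.max())
```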

  11. [Fall events in geriatric hospital in-patients. Results of prospective recording over a 3 year period].

    PubMed

    von Renteln-Kruse, W; Krause, T

    2004-02-01

    For a period of 3 consecutive years, all fall events in geriatric hospital in-patients were prospectively recorded using a standardized protocol. The incidence was 9.1 fall events/1000 hospital days among all 5946 patients, and 41.0/1000 hospital days among the 1015 patients (17.0%) who actually fell. The fall rate varied between 35.0 and 57.0/1000 hospital days according to the main diagnostic group. Fall events were recorded more often in men than in women. Recurrent falls (≥3 falls), which accounted for 13% of the 1596 falls, were recorded more frequently in male patients. The majority of fall events (73.5%) occurred in patient rooms, and another 20% on the way between the patient's bedroom and the toilet/bath, or in the toilet/bath itself. The absolute numbers of falls during night and day did not differ. However, the time distribution of high fall frequencies showed different patterns according to the main diagnostic groups. Confusion and dehydration were recorded more frequently with fall events in patients 80 years and older, and more often with fall events during the night. Fall injuries requiring treatment were rare, and fall-related fractures were very rare. The average duration of in-hospital stay was longer for patients with falls than for those without.

  12. A PDA-based system for online recording and analysis of concurrent events in complex behavioral processes.

    PubMed

    Held, Jürgen; Manser, Tanja

    2005-02-01

    This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
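
    The notion of action density, the number of recorded activities in progress at a given moment, can be sketched directly from time-stamped intervals. The intervals and time grid below are invented for illustration, not FIT-System data:

```python
# Invented sketch: action density from (start, end) activity intervals.

def action_density(intervals, t):
    """Number of recorded activities overlapping time t (half-open intervals)."""
    return sum(1 for start, end in intervals if start <= t < end)

# Four concurrent activities, in seconds into an observation period:
events = [(0, 30), (10, 50), (20, 25), (40, 60)]
density = [action_density(events, t) for t in range(0, 60, 10)]
# density == [1, 2, 3, 1, 2, 1]: a peak of three overlapping activities at t=20.
```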

  13. Oscillatory patterns in temporal lobe reveal context reinstatement during memory search.

    PubMed

    Manning, Jeremy R; Polyn, Sean M; Baltuch, Gordon H; Litt, Brian; Kahana, Michael J

    2011-08-02

    Psychological theories of memory posit that when people recall a past event, they not only recover the features of the event itself, but also recover information associated with other events that occurred nearby in time. The events surrounding a target event, and the thoughts they evoke, may be considered to represent a context for the target event, helping to distinguish that event from similar events experienced at different times. The ability to reinstate this contextual information during memory search has been considered a hallmark of episodic, or event-based, memory. We sought to determine whether context reinstatement may be observed in electrical signals recorded from the human brain during episodic recall. Analyzing electrocorticographic recordings taken as 69 neurosurgical patients studied and recalled lists of words, we uncovered a neural signature of context reinstatement. Upon recalling a studied item, we found that the recorded patterns of brain activity were not only similar to the patterns observed when the item was studied, but were also similar to the patterns observed during study of neighboring list items, with similarity decreasing reliably with positional distance. The degree to which individual patients displayed this neural signature of context reinstatement was correlated with their tendency to recall neighboring list items successively. These effects were particularly strong in temporal lobe recordings. Our findings show that recalling a past event evokes a neural signature of the temporal context in which the event occurred, thus pointing to a neural basis for episodic memory.
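
    A toy version of the similarity-by-lag analysis (simulated patterns, not the authors' electrocorticographic data; all parameters are invented) shows the expected signature: if context drifts slowly during study, a reinstated pattern is most similar to its own study position and progressively less similar to more distant positions.

```python
import numpy as np

# Simulate a slowly drifting "context": each study pattern is the previous
# one plus noise, so temporally close items have similar patterns.
rng = np.random.default_rng(1)
n_items, n_features, drift = 20, 50, 0.4

study = [rng.normal(size=n_features)]
for _ in range(n_items - 1):
    study.append(study[-1] + drift * rng.normal(size=n_features))
study = np.array(study)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "Recalling" item 10 reinstates its study pattern plus a little noise;
# compare it with every study position and average similarity by |lag|.
recalled = study[10] + 0.1 * rng.normal(size=n_features)
sims = [cosine(recalled, s) for s in study]
by_lag = {lag: np.mean([sims[10 + d] for d in {-lag, lag} if 0 <= 10 + d < n_items])
          for lag in range(6)}
# by_lag[0] is highest; similarity tends to fall off with positional distance.
```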

  14. Modeling Events in the Lower Imperial Valley Basin

    NASA Astrophysics Data System (ADS)

    Tian, X.; Wei, S.; Zhan, Z.; Fielding, E. J.; Helmberger, D. V.

    2010-12-01

    The Imperial Valley below the US-Mexican border has few seismic stations but many significant earthquakes. Many of these events, such as the recent El Mayor-Cucapah event, have complex mechanisms involving a mixture of strike-slip and normal slip, with over 30 aftershocks now above magnitude 4.5. Unfortunately, many earthquake records from the Southern Imperial Valley display a great deal of complexity, i.e., strong Rayleigh wave multipathing and extended codas. In short, regional recordings in the US are too complex to easily separate source properties from complex propagation. Fortunately, the Dec 30 foreshock (Mw=5.9) is well recorded teleseismically and regionally, and is moreover observed with InSAR. We use this simple strike-slip event to calibrate paths. In particular, we are finding record segments involving Pnl (including depth phases) and some surface waves (mostly Love waves) that appear well behaved, i.e., that can be approximated by synthetics from 1D local models, with events modeled using the Cut-and-Paste (CAP) routine. Simple events can then be identified along with path calibration, and modeling of the more complicated paths can start from known mechanisms. We will report on both the aftershocks and historic events.

  15. Decadal-scale progression of Dansgaard-Oeschger warming events - Are warmings at the end of Heinrich-Stadials different from others?

    NASA Astrophysics Data System (ADS)

    Erhardt, T.; Capron, E.; Rasmussen, S.; Schuepbach, S.; Bigler, M.; Fischer, H.

    2017-12-01

    During the last glacial period, proxy records throughout the Northern Hemisphere document a succession of rapid millennial-scale warming events, called Dansgaard-Oeschger (DO) events. Marine proxy records from the Atlantic also reveal that some of the warming events were preceded by large ice-rafting events, referred to as Heinrich events. Different mechanisms that can produce DO-like warming in model experiments have been proposed; however, the progression and plausible trigger of the events, and their possible interplay with the Heinrich events, are still unknown. Because of their fast nature, the progression is challenging to reconstruct from paleoclimate data, owing to the temporal resolution achievable in many archives and to cross-dating uncertainties between records. We use new high-resolution multi-proxy records of sea-salt and terrestrial aerosol concentrations over the period 10-60 ka from two Greenland deep ice cores, in conjunction with local precipitation and temperature proxy records from one of the cores, to investigate the progression of environmental changes at the onset of the individual warming events. The timing differences are then used to explore whether the DO warming events that terminate Heinrich-Stadials progressed differently from those after Non-Heinrich-Stadials. Our analysis indicates no difference in the progression of the warming terminating Heinrich-Stadials and Non-Heinrich-Stadials. Combining the evidence from all warming events in the period, our analysis shows a consistent lead of the changes in both local precipitation and terrestrial dust aerosol concentrations over the change in sea-salt aerosol concentrations and local temperature by approximately one decade. This implies that both the moisture transport to Greenland and the intensity of the Asian winter monsoon changed before the sea-ice cover in the North Atlantic was reduced, rendering a collapse of the sea-ice cover as a trigger for the DO events unlikely.

  16. Areas with High Rates of Police-Reported Violent Crime Have Higher Rates of Childhood Asthma Morbidity.

    PubMed

    Beck, Andrew F; Huang, Bin; Ryan, Patrick H; Sandel, Megan T; Chen, Chen; Kahn, Robert S

    2016-06-01

    To assess whether population-level violent (and all) crime rates were associated with population-level child asthma utilization rates and predictive of patient-level risk of asthma reutilization after a hospitalization. A retrospective cohort study of 4638 pediatric asthma-related emergency department visits and hospitalizations between 2011 and 2013 was completed. For population-level analyses, census tract asthma utilization rates were calculated by dividing the number of utilization events within a tract by the child population. For patient-level analyses, hospitalized patients (n = 981) were followed until time of first asthma-related reutilization. The primary predictor was the census tract rate of violent crime as recorded by the police; the all crime (violent plus nonviolent) rate was also assessed. Census tract-level violent and all crime rates were significantly correlated with asthma utilization rates (both P < .0001). The violent crime rate explained 35% of the population-level asthma utilization variance and remained associated with increased utilization after adjustment for census tract poverty, unemployment, substandard housing, and traffic exposure (P = .002). The all crime rate explained 28% of the variance and was similarly associated with increased utilization after adjustment (P = .02). Hospitalized children trended toward being more likely to reutilize if they lived in higher violent (P = .1) and all crime areas (P = .01). After adjustment, neither relationship was significant. Crime data could help facilitate early identification of potentially toxic stressors relevant to the control of asthma for populations and patients. Copyright © 2016 Elsevier Inc. All rights reserved.
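
    The population-level measure is simple to state in code: a census tract's asthma utilization rate is the number of utilization events in the tract divided by its child population. A back-of-envelope sketch (tract IDs and counts are invented, not the study's data):

```python
# Invented tract-level counts illustrating the rate calculation described above.
tracts = {
    "tract_A": {"events": 52, "children": 1300},
    "tract_B": {"events": 9,  "children": 1800},
}

rates = {tid: t["events"] / t["children"] for tid, t in tracts.items()}
# Rates of this kind are usually reported per 1,000 children:
per_1000 = {tid: 1000 * r for tid, r in rates.items()}
# tract_A works out to 40 events per 1,000 children, tract_B to 5 per 1,000.
```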

  18. 78 FR 44052 - Airworthiness Directives; Sikorsky Aircraft Corporation (Sikorsky) Model Helicopters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-23

    ... a regulatory distinction; and 4. Will not have a significant economic impact, positive or negative... events (LCF1) and partial low cycle fatigue events (LCF2) as those terms are defined in the... the full and partial low fatigue cycle events and record on the component card or equivalent record...

  19. Combining geomorphic and documentary flood evidence to reconstruct extreme events in Mediterranean basins

    NASA Astrophysics Data System (ADS)

    Thorndycraft, V. R.; Benito, G.; Barriendos, M.; Rico, M.; Sánchez-Moya, Y.; Sopeña, A.; Casas, A.

    2009-09-01

    Palaeoflood hydrology is the reconstruction of flood magnitude and frequency using geomorphological flood evidence, and is particularly valuable for extending the record of extreme floods prior to the availability of instrumental data series. This paper provides a review of recent developments in palaeoflood hydrology, presented in three parts: 1) an overview of the key methodological approaches used in palaeoflood hydrology and the use of historical documentary evidence for reconstructing extreme events; 2) a summary of the Llobregat River palaeoflood case study (Catalonia, NE Spain); and 3) analysis of the AD 1617 flood and its impacts across Catalonia (including the rivers Llobregat, Ter and Segre). The key findings of the Llobregat case study were that at least eight floods occurred with discharges significantly larger than events recorded in the instrumental record; for example, at the Pont de Vilomara study reach the palaeodischarges of these events were 3700-4300 m3/s, compared with 2300 m3/s for the 1971 flood, the largest on record. Five of these floods were dated to the last 3000 years, and the three events directly dated by radiocarbon all occurred during cold phases of global climate. Comparison of the palaeoflood record with documentary evidence indicated that one flood, radiocarbon dated to cal. AD 1540-1670, was likely to be the AD 1617 event, the largest flood of the last 700 years. Historical records indicate that this event was caused by rainfall from the 2nd to the 6th of November, and the resultant flooding caused widespread socio-economic impacts, including the destruction of at least 389 houses, 22 bridges and 17 water mills. 
Discharges estimated from palaeoflood records and historical flood marks indicate that the Llobregat (4680 m3/s) and Ter (2700-4500 m3/s) rivers witnessed extreme discharges in comparison to observed floods in the instrumental record (2300 and 2350 m3/s, respectively); whilst further east in the Segre River there was no geomorphic evidence of any flooding of greater magnitude than 2000 m3/s, or the 1982 event.

  20. Adding video recording increases the diagnostic yield of routine electroencephalograms in children with frequent paroxysmal events.

    PubMed

    Watemberg, Nathan; Tziperman, Barak; Dabby, Ron; Hasan, Mariana; Zehavi, Liora; Lerman-Sagie, Tally

    2005-05-01

    To report on the usefulness of adding video recording to routine EEG studies of infants and children with frequent paroxysmal events. We analyzed the efficacy of this diagnostic means over a 4-year period. The decision whether to add video recording was made by the pediatric EEG interpreter at the time of the study. Studies were planned to last between 20 and 30 min and, if needed, were extended by the EEG interpreter. For most studies, video recording was added from the beginning of EEG recording. In a minority of cases, the addition of video was implemented during the first part of the EEG test, as clinical events became obvious; in these cases, a new study (file) was begun. The success rate was analyzed according to the indications for the EEG study: paroxysmal eye movements, tremor, suspected seizures, myoclonus, staring episodes, suspected stereotypias and tics, absence epilepsy follow-up, cyanotic episodes, and suspected psychogenic nonepileptic events. Video recording was added to 137 of 666 routine studies. Mean patient age was 4.8 years. The nature of the event was determined in 61 (45%) of the EEG studies. Twenty-eight percent were hospitalized patients. The average study duration was 26 min. This diagnostic means was particularly useful for paroxysmal eye movements, staring spells, myoclonic jerks, stereotypias, and psychogenic nonepileptic events. About 46% (53) of the 116 patients for whom cognitive data were available were mentally retarded; EEG with added video recording was successfully performed in all of these cases and provided useful information in 29 (55%) of the 53 patients. Adding video recording to routine EEG was helpful in 45% of cases referred for frequent paroxysmal events. This technique proved useful for hospitalized children as well as for outpatients. Moreover, it was successfully applied in cognitively impaired patients. 
Infants and children with paroxysmal eye movements, staring spells, myoclonic jerks, stereotypias, and pseudoseizures especially benefited from this diagnostic means. Because of its low cost and the little discomfort imposed on the patient and his or her family, this technique should be considered as a first diagnostic step in children with frequent paroxysmal events.

  1. Discussion of “Deglacial paleoclimate in the southwestern United States: an abrupt 18.6 cold event and evidence for a North Atlantic forcing of Termination I” by M.S. Lachniet, Y. Asmerom and V. Polyak

    USGS Publications Warehouse

    Winograd, Isaac J.

    2012-01-01

    Utilizing a stable isotopic time series obtained from a speleothem (PC-1), which grew between 20.1 and 15.6 ka, Lachniet, Asmerom and Polyak (2011; hereafter LAP) present evidence for a significant cold event in the southern Great Basin at 18.6 ka, a finding that we accept. Supplementing this short record with a literature review, they go on to claim, as their central thesis, that the paleoclimate of the southwestern US was driven by "the transmission of atmospheric anomalies to the southwest…that coincided with deglacial climate changes in Greenland and the North Atlantic region", not by a "dominant Pacific Ocean SST control" as suggested by SST time series off California and by the Devils Hole δ18O time series from the southern Great Basin. We do not find their central thesis supportable.

  2. Audio-based performance evaluation of squash players

    PubMed Central

    Hajdú-Szücs, Katalin; Fenyvesi, Nóra; Vattay, Gábor

    2018-01-01

    In competitive sports, performance is often very hard to quantify. Whether a player scores or overtakes may depend on mere milliseconds or millimeters. In racquet sports like tennis, table tennis and squash, many events occur within a short time, and their recording and analysis can help reveal differences in performance. In this paper we show that it is possible to build a framework that utilizes characteristic sound patterns to precisely classify the types and localize the positions of these events. From this basic information, the shot types and the ball speed along the trajectories can be estimated. Comparing these estimates with the optimal speed and target, the precision of a shot can be quantified. The detailed shot statistics and precision information significantly enrich and improve the data available today. Feeding them back to players and coaches makes it possible to describe playing performance objectively and to improve strategic skills. The framework has been implemented, and its hardware and software components have been installed and tested in a squash court. PMID:29579067
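
    The localization idea rests on arrival-time differences between microphones. A one-dimensional toy version (two microphones on a line; the geometry, numbers, and speed of sound are the only inputs, all invented, and far simpler than the paper's court-scale setup) recovers a source position exactly:

```python
# Toy 1-D time-difference-of-arrival localization; all numbers are invented.
V_SOUND = 343.0  # approximate speed of sound in air, m/s

def locate_1d(delta_t, mic_distance):
    """Position (m from mic 1) of a sound between two microphones,
    given the arrival-time difference delta_t = t_mic1 - t_mic2 (s)."""
    # For a source at x: delta_t = (x - (mic_distance - x)) / V_SOUND
    return (V_SOUND * delta_t + mic_distance) / 2.0

# A racquet hit 4 m from mic 1 on a 10 m baseline arrives at mic 1 first:
# delta_t = (4 - 6) / V_SOUND, and locate_1d recovers the 4 m position.
x = locate_1d((4 - 6) / V_SOUND, 10.0)
```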

  3. The Monitoring Erosion of Agricultural Land and spatial database of erosion events

    NASA Astrophysics Data System (ADS)

    Kapicka, Jiri; Zizala, Daniel

    2013-04-01

    The Monitoring Erosion of Agricultural Land programme originated in the Czech Republic in 2011 as a joint project of the State Land Office (SLO) and the Research Institute for Soil and Water Conservation (RISWC). The aim of the project is to collect and keep records of information about erosion events on agricultural land and to evaluate them. The main idea is the creation of a spatial database that will serve as a source of data and information for evaluating and modeling erosion processes, for proposing preventive measures, and for measures to reduce the negative impacts of erosion events. The subject of monitoring is the manifestations of water erosion, wind erosion and slope deformation that damage agricultural land. A website, available at http://me.vumop.cz, is used as a tool for keeping and browsing information about monitored events. Record keeping is carried out by SLO employees. RISWC is the specialist institute in the Monitoring Erosion of Agricultural Land programme: it maintains the spatial database, runs the website, manages the record keeping of events, analyzes the causes of events, and statistically evaluates the recorded events and proposed measures. Records are inserted into the database through the user interface of the website, which includes a map server as a component. The website is based on the PostgreSQL database with the PostGIS extension and UMN MapServer. Each record is spatially localized in the database by a drawing and contains descriptive information about the character of the event (date, description of the situation, etc.), together with information about land cover and the crops grown. Part of the database is photo documentation taken during field reconnaissance, which is performed within two days of notification of an event. Another part of the database is precipitation information from accessible precipitation gauges. The website also allows simple spatial analyses such as area calculation, slope calculation, and percentage representation of GAEC. 
    The database structure was designed on the basis of an analysis of the input needs of mathematical models. Mathematical models are used for detailed analysis of chosen erosion events, including soil analysis. By the end of 2012 the database contained 135 events. Its content continues to accrue, giving rise to an extensive source of data usable for testing mathematical models.

  4. Methods of and apparatus for recording images occurring just prior to a rapid, random event

    DOEpatents

    Kelley, Edward F.

    1994-01-01

    An apparatus and a method are disclosed for recording images of events in a medium, wherein the recorded images are of conditions existing just prior to and during the occurrence of the event that triggers their recording. The apparatus and method use an optical delay path that employs a spherical focusing mirror facing a circular array of flat return mirrors around a central flat mirror. The image is reflected in a symmetric pattern that balances the astigmatism created by the spherical mirror. Delays on the order of hundreds of nanoseconds are possible.
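
The relation between optical delay and folded path length follows directly from the speed of light; a quick back-of-the-envelope sketch (illustrative arithmetic, not from the patent) showing why delays of hundreds of nanoseconds imply tens of metres of folded optical path:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def delay_ns(path_length_m):
    """Optical delay in nanoseconds accumulated over a folded path."""
    return path_length_m / C * 1e9

def path_for_delay_m(target_delay_ns):
    """Folded path length in metres needed for a given delay."""
    return target_delay_ns * 1e-9 * C

# a 300 ns delay requires roughly 90 m of folded optical path
print(round(path_for_delay_m(300)))  # 90
```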

  5. How Doth the Little Crocodilian: Analyzing the Influence of Environmental Viscosity on Feeding Performance of Juvenile Alligator mississippiensis

    PubMed Central

    Kerfoot, James R.; Easter, Emily; Elsey, Ruth M.

    2016-01-01

    Wetland habitats are used as nursery sites for hatchling and juvenile alligators (Alligator mississippiensis), where they utilize prey from aquatic and terrestrial settings. However, little is known about how the viscosity of the medium influences feeding performance. We hypothesized that timing and linear-excursion feeding kinematic variables would differ between individuals feeding on prey above the water and the same individuals feeding underwater. Individuals were fed immobile fish prey and feeding events were recorded using a high-speed video camera. Feeding performance was summarized by analyzing three feeding kinematic variables (maximum gape, maximum gape velocity, duration of feeding bout) and strike success. Results of a series of paired t-tests indicated no significant difference in kinematic variables between feeding events above water and underwater. Similarity in feeding performance could indicate that prey capture is not altered by environmental viscosity or that feeding behavior can mitigate its influence. Behavioral differences were observed during feeding events: alligators approached underwater prey with their mouths partially open, versus fully closed when feeding above water. This behavior could indicate a strategy used to overcome water viscosity. PMID:27706023
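
The paired t-test used here compares the same individuals across the two conditions, so each animal serves as its own control. A minimal sketch of the statistic on hypothetical maximum-gape values (the numbers are illustrative, not the study's data):

```python
import math

def paired_t(a, b):
    """Paired t-statistic: mean of within-subject differences divided by
    the standard error of those differences."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# hypothetical maximum-gape measurements (cm) for five animals,
# above water vs underwater
above = [4.1, 3.8, 4.5, 4.0, 3.9]
under = [4.0, 3.9, 4.4, 4.1, 3.8]
t = paired_t(above, under)
# |t| below the two-tailed critical value (2.776 for df = 4, alpha = 0.05)
# means we fail to reject the null hypothesis of no difference
print(abs(t) < 2.776)  # True
```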

  6. Physiological reactivity to nonidiographic virtual reality stimuli in veterans with and without PTSD.

    PubMed

    Webb, Andrea K; Vincent, Ashley L; Jin, Alvin B; Pollack, Mark H

    2015-02-01

    Post-traumatic stress disorder (PTSD) is currently diagnosed via clinical interview, in which subjective self-reports of traumatic events and associated experiences are discussed with a mental health professional. The reliability and validity of diagnoses can be improved with the use of objective physiological measures. In this study, physiological activity was recorded from 58 male veterans (PTSD Diagnosis: n = 16; Trauma Exposed/No PTSD Diagnosis: n = 23; No Trauma/No PTSD Diagnosis: n = 19) with and without PTSD and combat trauma exposure in response to emotionally evocative non-idiographic virtual reality stimuli. Statistically significant differences among the Control, Trauma, and PTSD groups were present during the viewing of two virtual reality videos. Skin conductance and interbeat interval features were extracted for each of ten video events (five events of increasing severity per video). These features were submitted to three stepwise discriminant function analyses to assess classification accuracy for the Control versus Trauma, Control versus PTSD, and Trauma versus PTSD pairings of participant groups. Leave-one-out cross-validation classification accuracy was between 71 and 94%. These results are promising and suggest the utility of objective physiological measures in assisting with PTSD diagnosis.
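
The leave-one-out scheme can be illustrated with a toy stand-in classifier; this sketch uses a nearest-class-mean rule rather than the stepwise discriminant functions used in the study, and the 1-D "skin conductance" features are hypothetical:

```python
def nearest_mean_predict(train, labels, x):
    """Assign x to the class whose training mean is closest (1-D toy rule)."""
    groups = {}
    for xi, yi in zip(train, labels):
        groups.setdefault(yi, []).append(xi)
    means = {y: sum(v) / len(v) for y, v in groups.items()}
    return min(means, key=lambda y: abs(x - means[y]))

def loo_accuracy(data, labels):
    """Leave-one-out cross-validation: each sample is classified by a model
    trained on all remaining samples."""
    correct = 0
    for i in range(len(data)):
        train = data[:i] + data[i + 1:]
        tl = labels[:i] + labels[i + 1:]
        correct += nearest_mean_predict(train, tl, data[i]) == labels[i]
    return correct / len(data)

# hypothetical 1-D skin-conductance features for two well-separated groups
feats = [0.2, 0.3, 0.25, 0.9, 1.0, 0.95]
groups = ["control"] * 3 + ["ptsd"] * 3
print(loo_accuracy(feats, groups))  # 1.0
```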

  7. Refinements to the method of epicentral location based on surface waves from ambient seismic noise: introducing Love waves

    USGS Publications Warehouse

    Levshin, Anatoli L.; Barmin, Mikhail P.; Moschetti, Morgan P.; Mendoza, Carlos; Ritzwoller, Michael H.

    2012-01-01

    The purpose of this study is to develop and test a modification to a previous method of regional seismic event location based on Empirical Green’s Functions (EGFs) produced from ambient seismic noise. Elastic EGFs between pairs of seismic stations are determined by cross-correlating long ambient noise time-series recorded at the two stations. The EGFs principally contain Rayleigh- and Love-wave energy on the vertical and transverse components, respectively, and we utilize these signals between about 5 and 12 s period. The previous method, based exclusively on Rayleigh waves, may yield biased epicentral locations for certain event types with hypocentral depths between 2 and 5 km. Here we present theoretical arguments that show how Love waves can be introduced to reduce or potentially eliminate the bias. We also present applications of Rayleigh- and Love-wave EGFs to locate 10 reference events in the western United States. The separate Rayleigh and Love epicentral locations and the joint locations using a combination of the two waves agree to within 1 km distance, on average, but confidence ellipses are smallest when both types of waves are used.
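
The core of the EGF construction is cross-correlation of long noise records between station pairs, with the correlation peak recovering the inter-station travel time. A toy sketch with illustrative impulses (not real seismic data) showing the principle:

```python
def cross_correlate(a, b, max_lag):
    """C(tau) = sum_t a[t] * b[t + tau] for |tau| <= max_lag (samples)."""
    n = len(a)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for t in range(n):
            if 0 <= t + lag < n:
                s += a[t] * b[t + lag]
        out[lag] = s
    return out

# hypothetical traces: station B records the same pulse 3 samples
# after station A (a stand-in for long ambient-noise time series)
sta_a = [0.0] * 20
sta_b = [0.0] * 20
sta_a[5] = 1.0
sta_b[8] = 1.0

cc = cross_correlate(sta_a, sta_b, max_lag=5)
best = max(cc, key=cc.get)
print(best)  # 3 -> inter-station travel time of 3 samples
```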

  8. Microstructural integrity of a pathway connecting the prefrontal cortex and amygdala moderates the association between cognitive reappraisal and negative emotions.

    PubMed

    d'Arbeloff, Tracy C; Kim, M Justin; Knodt, Annchen R; Radtke, Spenser R; Brigidi, Bartholomew D; Hariri, Ahmad R

    2018-05-21

    Cognitive reappraisal is a commonly used form of emotion regulation that utilizes frontal-executive control to reframe an approaching emotional event to moderate its potential psychological impact. Use of cognitive reappraisal has been associated with diminished experience of anxiety and depressive symptoms, as well as greater overall well-being. Using data from a study of 647 healthy young adults, we provide initial evidence that an association between typical use of cognitive reappraisal in daily life and the experience of anxiety and depressive symptoms is moderated by the microstructural integrity of the uncinate fasciculus, which provides a major anatomical link between the amygdala and prefrontal cortex. Our findings are consistent with the nature of top-down regulation of bottom-up negative emotions and suggest the uncinate fasciculus may be a useful target in the search for biomarkers predicting not only disorder risk but also response to psychotherapy utilizing cognitive reappraisal. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. Sunrayce 97 Finish Sets Records

    Science.gov Websites

    in Indianapolis, Sunrayce 97 roared to a record finish in Colorado Springs. Winning the event, followed closely by the University of Waterloo and the University of Minnesota. The event started in

  10. A High-Resolution Stalagmite Holocene Paleoclimate Record from Northern Venezuela with Insights into the Timing and Duration of the 8.2 ka Event

    NASA Astrophysics Data System (ADS)

    Retrum, J. B.; Gonzalez, L. A.; Edwards, R.; Cheng, H.; Tincher, S. M.; Urbani, F.

    2013-12-01

    The dearth of studies and data in the tropics hinders our understanding of atmospheric and oceanic interactions between the low latitudes and the rest of the globe. To better understand these interactions, specifically between the Caribbean and the North Atlantic, three stalagmites were collected from Cueva Zarraga in the Falcón Mountains of northwestern Venezuela and analyzed to determine the local paleoclimatic history. Stalagmite ages were determined by U/Th disequilibrium dating and show a nearly complete Holocene record. The stalagmites have an average temporal resolution of 10.8 years/mm, ranging from 2.1 to 62.7 years. Both the carbon and oxygen isotope records preserve quasi-millennial oscillations and show a major depletion shift from the last glacial period into the Holocene, suggesting warmer and wetter conditions during the Holocene. The preservation of quasi-millennial oscillations and of high-frequency multi-decadal changes by the δ13C record indicates that the soil-vegetation-stalagmite system acts as an amplifier of the signal produced by climatic events and changes. In the early Holocene, the δ18O record shows a depletion trend from ~11,000 to 8,000 cal yr BP before reaching the Holocene Thermal Maximum. A prominent δ18O enrichment event corresponding to the 8.2 ka event is recorded in all the stalagmites. The 8.2 ka event is represented by a double peak with a duration of ~180 years. Other short-term δ18O enrichment events likely correspond to Bond events 1, 2, 5, and 6. The late Holocene record, like other Caribbean records, indicates that the climate system diverges from insolation forcing and may represent an atmospheric rearrangement that resulted in increased ENSO instability or in reduced seasonal movement of the Inter-Tropical Convergence Zone (ITCZ). Today, Cueva Zarraga is at the northern extent of the ITCZ and has two rainy seasons. 
The δ18O enrichment events during the Holocene suggest drier conditions and southern displacement of the ITCZ, as also suggested by Brazilian speleothem records, whose trends anti-correlate with those of Cueva Zarraga. The Cariaco Basin and Cueva Zarraga records show similar trends. The close proximity of Cueva Zarraga to the Cariaco Basin may allow for a high-resolution comparison of tropical terrestrial and oceanic climatic responses.

  11. Discrimination of Man-Made Events and Tectonic Earthquakes in Utah Using Data Recorded at Local Distances

    NASA Astrophysics Data System (ADS)

    Tibi, R.; Young, C. J.; Koper, K. D.; Pankow, K. L.

    2017-12-01

    Seismic event discrimination methods exploit the differing characteristics, in terms of amplitude and/or frequency content, of the generated seismic phases among the event types to be classified. Most of the commonly used seismic discrimination methods are designed for regional data recorded at distances of about 200 to 2000 km. Relatively little attention has focused on discriminants for local distances (< 200 km), the range at which the smallest events are recorded. Short-period fundamental mode Rayleigh waves (Rg) are commonly observed on seismograms of man-made seismic events and shallow, naturally occurring tectonic earthquakes recorded at local distances. We leverage the well-known observation that Rg amplitude decreases dramatically with increasing event depth to propose a new depth discriminant based on Rg-to-Sg spectral amplitude ratios. The approach is successfully used to discriminate shallow events from deeper tectonic earthquakes in the Utah region recorded at local distances (< 150 km) by the University of Utah Seismographic Stations (UUSS) regional seismic network. Using Mood's median test, we obtained probabilities of nearly zero that the median Rg-to-Sg spectral amplitude ratios are the same for shallow events (including both shallow tectonic earthquakes and man-made events) and deeper earthquakes, indicating a statistically significant difference in the estimated Rg-to-Sg ratios between the two populations. We also observed consistent disparities between the different types of shallow events (e.g., explosions vs. mining-induced events), implying that it may be possible to separate the sub-populations that make up this group. This suggests that using local-distance Rg-to-Sg spectral amplitude ratios, one can not only discriminate shallow from deeper events but may also be able to discriminate different populations of shallow events. 
We also experimented with Pg-to-Sg amplitude ratios in multi-frequency linear discriminant functions to classify man-made events and tectonic earthquakes in Utah. Initial results are very promising, showing probabilities of misclassification of only 2.4-14.3%.
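
The proposed discriminant reduces to a ratio of spectral amplitudes measured in Rg and Sg phase windows. A minimal sketch using a single DFT bin rather than the study's multi-frequency bands, on hypothetical synthetic windows:

```python
import math

def spectral_amplitude(window, freq_bin):
    """Magnitude of one DFT bin of a windowed phase arrival."""
    n = len(window)
    re = sum(x * math.cos(2 * math.pi * freq_bin * t / n)
             for t, x in enumerate(window))
    im = -sum(x * math.sin(2 * math.pi * freq_bin * t / n)
              for t, x in enumerate(window))
    return math.hypot(re, im)

def rg_sg_ratio(rg_window, sg_window, freq_bin):
    """Rg-to-Sg spectral amplitude ratio at a single frequency bin."""
    return spectral_amplitude(rg_window, freq_bin) / spectral_amplitude(sg_window, freq_bin)

# hypothetical windows: a shallow event with strong Rg relative to Sg
n = 64
rg = [2.0 * math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
sg = [0.5 * math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
print(rg_sg_ratio(rg, sg, freq_bin=4))  # ~4.0; a high ratio suggests a shallow source
```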

  12. An analytical approach for estimating fossil record and diversification events in sharks, skates and rays.

    PubMed

    Guinot, Guillaume; Adnet, Sylvain; Cappetta, Henri

    2012-01-01

    Modern selachians and their supposed sister group (hybodont sharks) have a long and successful evolutionary history. Yet, although selachian remains are considered relatively common in the fossil record in comparison with other marine vertebrates, little is known about the quality of their fossil record. Similarly, only a few works based on specific time intervals have attempted to identify major events that marked the evolutionary history of this group. Phylogenetic hypotheses concerning modern selachians' interrelationships are numerous but differ significantly, and no consensus has been reached. The aim of the present study is to take advantage of the range of recent phylogenetic hypotheses in order to assess the fit of the selachian fossil record to phylogenies, according to two different branching methods. Compilation of these data allowed the inference of an estimated range of diversity through time, and the evolutionary events that marked this group over the past 300 Ma are identified. The results indicate that, with the exception of high taxonomic ranks (orders), the selachian fossil record is far from perfect, particularly for generic and post-Triassic data. The timing and amplitude of the various identified events that marked selachian evolutionary history are discussed. Some identified diversity events were mentioned in previous works using alternative methods (Early Jurassic, mid-Cretaceous, K/T boundary, and late Paleogene diversity drops), thus reinforcing the efficiency of the methodology presented here in inferring evolutionary events. Other events (Permian/Triassic, Early and Late Cretaceous diversifications; Triassic/Jurassic extinction) are newly identified. Relationships between these events and paleoenvironmental characteristics and the evolutionary history of other groups are proposed.

  13. Event Rates, Hospital Utilization, and Costs Associated with Major Complications of Diabetes: A Multicountry Comparative Analysis

    PubMed Central

    Clarke, Philip M.; Glasziou, Paul; Patel, Anushka; Chalmers, John; Woodward, Mark; Harrap, Stephen B.; Salomon, Joshua A.

    2010-01-01

    Background Diabetes imposes a substantial burden globally in terms of premature mortality, morbidity, and health care costs. Estimates of economic outcomes associated with diabetes are essential inputs to policy analyses aimed at prevention and treatment of diabetes. Our objective was to estimate and compare event rates, hospital utilization, and costs associated with major diabetes-related complications in high-, middle-, and low-income countries. Methods and Findings Incidence and history of diabetes-related complications, hospital admissions, and length of stay were recorded in 11,140 patients with type 2 diabetes participating in the Action in Diabetes and Vascular Disease (ADVANCE) study (mean age at entry 66 y). The probability of hospital utilization and number of days in hospital for major events associated with coronary disease, cerebrovascular disease, congestive heart failure, peripheral vascular disease, and nephropathy were estimated for three regions (Asia, Eastern Europe, and Established Market Economies) using multiple regression analysis. The resulting estimates of days spent in hospital were multiplied by regional estimates of the costs per hospital bed-day from the World Health Organization to compute annual acute and long-term costs associated with the different types of complications. To assist comparability, costs are reported in international dollars (Int$), which represent a hypothetical currency that allows for the same quantities of goods or services to be purchased regardless of country, standardized on purchasing power in the United States. A cost calculator accompanying this paper enables the estimation of costs for individual countries and translation of these costs into local currency units. The probability of attending a hospital following an event was highest for heart failure (93%–96% across regions) and lowest for nephropathy (15%–26%). 
The average numbers of days in hospital given at least one admission were greatest for stroke (17–32 d across regions) and heart failure (16–31 d) and lowest for nephropathy (12–23 d). Considering regional differences, probabilities of hospitalization were lowest in Asia and highest in Established Market Economies; on the other hand, lengths of stay were highest in Asia and lowest in Established Market Economies. Overall estimated annual hospital costs for patients with none of the specified events or event histories ranged from Int$76 in Asia to Int$296 in Established Market Economies. All complications included in this analysis led to significant increases in hospital costs; coronary events, cerebrovascular events, and heart failure were the most costly, at more than Int$1,800, Int$3,000, and Int$4,000 in Asia, Eastern Europe, and Established Market Economies, respectively. Conclusions Major complications of diabetes significantly increase hospital use and costs across various settings and are likely to impose a high economic burden on health care systems. Please see later in the article for the Editors' Summary PMID:20186272
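
The cost construction described above multiplies the probability of admission, the expected days in hospital given admission, and the regional WHO bed-day cost. A sketch with round illustrative numbers (not the ADVANCE estimates):

```python
def expected_hospital_cost(p_admission, mean_days, cost_per_bed_day):
    """Expected annual hospital cost for one event type:
    P(admission) x expected length of stay x regional bed-day cost (Int$)."""
    return p_admission * mean_days * cost_per_bed_day

# hypothetical heart-failure event: 95% admission probability,
# 20-day mean stay, Int$100 per bed-day
cost = expected_hospital_cost(p_admission=0.95, mean_days=20, cost_per_bed_day=100)
print(cost)  # 1900.0 (Int$)
```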

  14. ScreenRecorder: A Utility for Creating Screenshot Video Using Only Original Equipment Manufacturer (OEM) Software on Microsoft Windows Systems

    DTIC Science & Technology

    2015-01-01

    class within Microsoft Visual Studio. It has been tested on and is compatible with Microsoft Vista, 7, and 8 and Visual Studio Express 2008... the ScreenRecorder utility assumes a basic understanding of compiling and running C++ code within Microsoft Visual Studio. This report does not... of Microsoft Visual Studio, the ScreenRecorder utility was developed as a C++ class that can be compiled as a library (static or dynamic) to be

  15. Modern geomorphology in a post-glacial landscape and implications for river restoration, eastern Yosemite Valley, Yosemite National Park, USA

    NASA Astrophysics Data System (ADS)

    Minear, J. T.; Wright, S. A.; Roche, J. W.

    2011-12-01

    Yosemite National Park, USA, is one of the most popular national parks in the country with over 3.9 million visitors annually. The majority of tourists visit a relatively small area around the Merced River in scenic eastern Yosemite Valley, which has resulted in degradation to the river and streambanks. The National Park Service is updating the long-term management plan for the Merced River which includes river restoration. A key component determining the success of future river restoration efforts is the transport and supply of sediment. For this study, we investigate the modern geomorphology of the eastern Yosemite Valley region. For the watershed and reach analyses, we draw from a variety of topographic and hydrologic records, including 20-years of data from permanent cross sections, aerial and ground-based LiDAR surveys, and a nearly 100-year hydrologic record. In addition, we utilize hydraulic and sediment transport models to investigate channel velocities, bed shear stress and sediment transport at the reach scale. From the watershed-scale analysis, it is likely that large-scale remnant glacial features exert a primary control on the sediment supply to the study area with relatively small volumes of both suspended and bedload sediment being contributed to the study site. Two of the three major watersheds, Tenaya Creek and the upper Merced River, likely contribute only small amounts of bedload downstream due to low-gradient depositional reaches. Though little-known, the third major watershed, Illilouette Creek, is the only watershed capable of contributing larger amounts of bedload material, though the bedload material is likely contributed only during high flow events. High flows in the Yosemite Valley region have two different distributions: large early winter storm events above the 20-year return interval, and moderate snowmelt flows at and below the 20-year return interval. 
Sediment transport analyses indicate that bedload transport is dominated by relatively frequent (<2 year) snowmelt flow events and that the coarsest material in the reach (>110 mm) is mobile during these flows. The permanent cross sections record large topographic changes, including infilling at key bars, associated with the 1997 flood, the largest recorded early winter event (100-year return interval). Following snowmelt events post-1997, cross sections are returning to near pre-1997 levels. The cross section data suggest there is likely a disconnect between sediment supplied to the reach and sediment transport, with the majority of sediment supply occurring during large early winter events while the majority of sediment transport occurs during snowmelt events. An implication of our findings for river restoration in this area of the Merced River is that the ability of the channel to rebuild streambanks is relatively low, given the low suspended sediment supply. In contrast, bedload transport is relatively frequent and occurs in significant quantities, suggesting that river restoration involving bed recovery (e.g. recovery of pools formed by riprap or bridges) should be relatively rapid if obstructions are removed.

  16. Evidence for Millennial-Scale Climate Variability in the Surface Waters Above ODP Site 980, NE Atlantic Ocean During the Last Glacial Interval (MIS 4-2)

    NASA Astrophysics Data System (ADS)

    Michaud, J. R.; Cullen, J. L.; McManus, J. F.; Oppo, D. W.

    2004-05-01

    Successful efforts to recover quality high-sedimentation-rate deep-sea sediment sections from the North Atlantic over the last decade have produced a number of studies demonstrating that climate instability at sub-orbital and even millennial time-scales is a pervasive component of Late Pleistocene North Atlantic climate. This is particularly true during Marine Isotope Stages (MIS) 4-2, i.e., the last glacial interval. One such high-sedimentation-rate section was recovered at ODP Site 980, Northeast Atlantic Ocean, where sedimentation rates during MIS 4-2 exceed 15 cm/kyr. Recently, we have begun to generate more detailed records from MIS 4-2 at Site 980 by reducing our sampling interval from 20 to around 2.5 cm, improving the resolution of our records by an order of magnitude, from 1200-1300 to 100-200 years. A total of 300 samples were used to generate high-resolution records of changes in the input of ice-rafted detritus (IRD), along with limited data documenting changes in the relative abundance of N. pachyderma, left coiling, which can be evaluated within the context of our previously generated lower-resolution planktic and benthic oxygen isotope records used to construct our age model for this interval. Our previously published low-resolution IRD record enabled us to identify Heinrich events 1-6 within the sediment interval deposited during the last glacial. Each event is characterized by IRD concentrations ranging from 500 to over 2500 lithic grains >150 microns per gram of sediment. Superimposing our new high-resolution IRD record reveals that Heinrich events 3, 2, and 1, occurring at approximately 32, 23, and 17 kya, respectively, are each composed of a series of separate, abrupt, rapid increases in IRD concentrations approaching 1,000 grains per gram. An additional comparable event occurring at approximately 20 kya has also been identified. 
In the early part of the last glacial, H6, H5, and H4, occurring at approximately 66, 47, and 38 kya, respectively, are recorded as much more abrupt and rapid increases in IRD concentrations, to 2,000 or more lithic grains per gram, than were observed in our previous record. There are two 5 kyr intervals between H6 and H5 that contain little or no IRD. An additional abrupt IRD event is recorded at approximately 34 kya. Thus, our new IRD record reveals a series of additional episodic increases in IRD concentrations comparable in intensity to the identified Heinrich events. This suggests that ODP Site 980 sediments record a series of more closely spaced episodic increases in IRD concentration that can be directly related to the Dansgaard/Oeschger events recorded in Greenland ice cores. Comparison of our preliminary high-resolution record of changes in the relative abundance of the polar species N. pachyderma, left coiling, with our IRD record suggests that the input of iceberg-bearing waters precedes the increases in the relative abundance of N. pachyderma, left coiling, for the early glacial IRD events, whereas the abrupt increases in N. pachyderma, left coiling, occur during the later glacial IRD events. Thus, in the early glacial the influx of icebergs seems to occur before the invasion of cooler surface waters, as opposed to at the same time later in the glacial.

  17. Development of genomic evaluations for direct measures of health in U.S. Holsteins and their correlations with fitness traits

    USDA-ARS?s Scientific Manuscript database

    The objectives of this research were to estimate variance components for 6 common health events recorded by producers on U.S. dairy farms, as well as investigate correlations with fitness traits currently used for selection. Producer-recorded health event data were available from Dairy Records Manag...

  18. [Implantable ECG recorder revealed the diagnosis in a baby with apparent life-threatening events].

    PubMed

    Hoorntje, T M; Langerak, W; Blokland-Loggers, H E; Sreeram, N

    1999-09-25

    A 14-month-old boy went through episodes of cyanosis and brief loss of consciousness. Extensive investigations failed to lead to a diagnosis, until an implanted ECG recorder revealed ECG abnormalities suggestive of strangulation. Interviews with the father and mother showed that this was indeed the case. The diagnosis of 'Münchhausen by proxy' was made. Psychiatric assistance and home help were called in. The child recovered well. If there is a suspicion of arrhythmia as the cause of apparent life-threatening events, prolonged ECG recordings are necessary. In a clinical environment it is possible to make continuous ECG recordings during a limited period. An insertable recorder allows continuous ECG recordings during a syncopal event and can be used for prolonged monitoring. The patient presented is the youngest infant in the world in whom such a device has been implanted.

  19. Assessing Reliability of Medical Record Reviews for the Detection of Hospital Adverse Events.

    PubMed

    Ock, Minsu; Lee, Sang-il; Jo, Min-Woo; Lee, Jin Yong; Kim, Seon-Ha

    2015-09-01

    The purpose of this study was to assess the inter-rater and intra-rater reliability of medical record review for the detection of hospital adverse events. We conducted a two-stage retrospective medical record review of a random sample of 96 patients from one acute-care general hospital. The first stage was an explicit patient record review by two nurses to detect the presence of 41 screening criteria (SC). The second stage was an implicit structured review by two physicians to identify the occurrence of adverse events among the cases positive on the SC. The inter-rater reliability of the two nurses and that of the two physicians were assessed. The intra-rater reliability was also evaluated using a test-retest method approximately two weeks later. In 84.2% of the patient medical records, the nurses agreed as to the necessity for the second-stage review (kappa, 0.68; 95% confidence interval [CI], 0.54 to 0.83). In 93.0% of the patient medical records screened by nurses, the physicians agreed about the absence or presence of adverse events (kappa, 0.71; 95% CI, 0.44 to 0.97). When assessing intra-rater reliability, the kappa indices of the two nurses were 0.54 (95% CI, 0.31 to 0.77) and 0.67 (95% CI, 0.47 to 0.87), whereas those of the two physicians were 0.87 (95% CI, 0.62 to 1.00) and 0.37 (95% CI, -0.16 to 0.89). In this study, the medical record review for detecting adverse events showed an intermediate to good level of inter-rater and intra-rater reliability. A well-organized training program for reviewers and clearly defined SC are required to obtain more reliable results in hospital adverse event studies.
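
Agreement here is quantified with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch on hypothetical binary screening judgements (not the study's data):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters: kappa = (p_o - p_e) / (1 - p_e),
    where p_o is observed agreement and p_e is chance agreement."""
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    categories = set(rater1) | set(rater2)
    p_e = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# hypothetical screening judgements (1 = record needs second-stage review)
nurse1 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
nurse2 = [1, 1, 0, 0, 1, 0, 0, 0, 0, 1]
print(round(cohens_kappa(nurse1, nurse2), 2))  # 0.58
```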

  20. Digitizing the ranges

    NASA Astrophysics Data System (ADS)

    Haddleton, Graham P.

    2001-04-01

    In military research and development or testing there are various fast and dangerous events that need to be recorded and analyzed. High-speed cameras allow the capture of movement too fast to be recognized by the human eye, and provide data that is essential for the analysis and evaluation of such events. High-speed photography is often the only type of instrumentation that can be used to record the parameters demanded by our customers. I will show examples where this applied cinematography is used not only to provide a visual record of events, but also as an essential measurement tool.

  1. Dynamic time warping and machine learning for signal quality assessment of pulsatile signals.

    PubMed

    Li, Q; Clifford, G D

    2012-09-01

    In this work, we describe a beat-by-beat method for assessing the clinical utility of pulsatile waveforms, primarily recorded from cardiovascular blood volume or pressure changes, concentrating on the photoplethysmogram (PPG). Physiological blood flow is nonstationary, with pulses changing in height, width and morphology due to changes in heart rate, cardiac output, sensor type and hardware or software pre-processing requirements. Moreover, considerable inter-individual and sensor-location variability exists. Simple template matching methods are therefore inappropriate, and a patient-specific adaptive initialization is required. We introduce dynamic time warping to stretch each beat to match a running template and combine it with several other features related to signal quality, including correlation and the percentage of the beat that appeared to be clipped. The features were then presented to a multi-layer perceptron neural network to learn the relationships between the parameters in the presence of good- and bad-quality pulses. An expert-labeled database of 1055 segments of PPG, each 6 s long, recorded from 104 separate critical care admissions during both normal and verified arrhythmic events, was used to train and test our algorithms. An accuracy of 97.5% on the training set and 95.2% on the test set was found. The algorithm could be deployed as a stand-alone signal quality assessment algorithm for vetting the clinical utility of PPG traces or any similar quasi-periodic signal.
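
The dynamic time warping step aligns each beat to a running template despite changes in pulse width. A minimal sketch of the classic DTW recurrence (illustrative sequences, not PPG data or the authors' implementation):

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance: allows samples of one sequence
    to stretch or compress against the other while accumulating minimal cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: insertion, deletion, match
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# a pulse and a time-stretched copy of it align perfectly under DTW
template = [0, 1, 3, 1, 0]
beat = [0, 1, 1, 3, 3, 1, 0]
print(dtw_distance(template, beat))  # 0.0 -> identical up to warping
```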

  2. Universal strategies for the DNA-encoding of libraries of small molecules using the chemical ligation of oligonucleotide tags

    PubMed Central

    Litovchick, Alexander; Clark, Matthew A; Keefe, Anthony D

    2014-01-01

    The affinity-mediated selection of large libraries of DNA-encoded small molecules is increasingly being used to initiate drug discovery programs. We present universal methods for the encoding of such libraries using the chemical ligation of oligonucleotides. These methods may be used to record the chemical history of individual library members during combinatorial synthesis processes. We demonstrate three different chemical ligation methods as examples of information recording processes (writing) for such libraries and two different cDNA-generation methods as examples of information retrieval processes (reading) from such libraries. The example writing methods include uncatalyzed and Cu(I)-catalyzed alkyne-azide cycloadditions and a novel photochemical thymidine-psoralen cycloaddition. The first reading method “relay primer-dependent bypass” utilizes a relay primer that hybridizes across a chemical ligation junction embedded in a fixed-sequence and is extended at its 3′-terminus prior to ligation to adjacent oligonucleotides. The second reading method “repeat-dependent bypass” utilizes chemical ligation junctions that are flanked by repeated sequences. The upstream repeat is copied prior to a rearrangement event during which the 3′-terminus of the cDNA hybridizes to the downstream repeat and polymerization continues. In principle these reading methods may be used with any ligation chemistry and offer universal strategies for the encoding (writing) and interpretation (reading) of DNA-encoded chemical libraries. PMID:25483841

  3. mSpray: a mobile phone technology to improve malaria control efforts and monitor human exposure to malaria control pesticides in Limpopo, South Africa

    PubMed Central

    Eskenazi, Brenda; Quirós-Alcalá, Lesliam; Lipsitt, Jonah M.; Wu, Lemuel D.; Kruger, Philip; Ntimbane, Tzundzukani; Nawn, John Burns; Bornman, M. S. Riana; Seto, Edmund

    2015-01-01

    Recent estimates indicate that malaria has led to over half a million deaths worldwide, mostly among African children. Indoor residual spraying (IRS) of insecticides is one of the primary vector control interventions. However, current reporting systems do not obtain precise locations of IRS events in relation to malaria cases, which poses challenges for effective and efficient malaria control. This information is also critical to avoid unnecessary human exposure to IRS insecticides. We developed and piloted a mobile-based application (mSpray) to collect comprehensive information on IRS spray events. We assessed the utility, acceptability and feasibility of using mSpray to gather improved homestead- and chemical-level IRS coverage data. We installed mSpray on 10 cell phones with data bundles, and pilot tested it with 13 users in Limpopo, South Africa. Users completed basic information (number of rooms/shelters sprayed; chemical used, etc.) on spray events. Upon submission, this information, along with Global Positioning System (GPS) coordinates and a time/date stamp, was uploaded to a Google Drive spreadsheet to be viewed in real time. We administered questionnaires, conducted focus groups, and interviewed key informants to evaluate the utility of the app. The low-cost, cell phone-based “mSpray” app was learned quickly by users, well accepted and preferred to the current paper-based method. We recorded 2,865 entries (99.1% had a GPS accuracy of 20 m or less) and identified areas of improvement, including increased battery life. We also identified a number of logistic and user problems (e.g., cost of cell phones and cellular bundles, battery life, obtaining accurate GPS measures, user errors, etc.) that would need to be overcome before full deployment. 
Use of cell phone technology could increase the efficiency of IRS malaria control efforts by mapping spray events in relation to malaria cases, resulting in more judicious use of chemicals that are potentially harmful to humans and the environment. PMID:24769412

  4. Hypothetical scenario exercises to improve planning and readiness for drinking water quality management during extreme weather events.

    PubMed

    Deere, Daniel; Leusch, Frederic D L; Humpage, Andrew; Cunliffe, David; Khan, Stuart J

    2017-03-15

    Two hypothetical scenario exercises were designed and conducted to reflect the increasingly extreme weather-related challenges faced by water utilities as the global climate changes. The first event was based on an extreme flood scenario. The second scenario involved a combination of weather events, including a wild forest fire ('bushfire') followed by runoff due to significant rainfall. For each scenario, a panel of diverse personnel from water utilities and relevant agencies (e.g. health departments) formed a hypothetical water utility and associated regulatory body to manage water quality following the simulated extreme weather event. A larger audience participated by asking questions and contributing key insights. Participants were confronted with unanticipated developments as the simulated scenarios unfolded, introduced by a facilitator. Participants were presented with information that may have challenged their conventional experiences regarding operational procedures in order to identify limitations in current procedures, assumptions, and readily available information. The process worked toward the identification of a list of specific key lessons for each event. At the conclusion of each simulation a facilitated discussion was used to establish key lessons of value to water utilities in preparing them for similar future extreme events. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Electronic health record use among cancer patients: Insights from the Health Information National Trends Survey.

    PubMed

    Strekalova, Yulia A

    2017-04-01

    Over 90% of US hospitals provide patients with access to an electronic copy of their health records, but utilization of electronic health records by US consumers remains low. Guided by the comprehensive information-seeking model, this study used data from the National Cancer Institute's Health Information National Trends Survey 4 (Cycle 4) and examined the factors that explain the level of electronic health record use by cancer patients. Consistent with the model, individual information-seeking factors and perceptions of security and utility were associated with the frequency of electronic health record access. Specifically, higher income, prior online information seeking, interest in accessing health information online, and normative beliefs were predictive of electronic health record access. Conversely, poorer general health status and lack of health care provider encouragement to use electronic health records were associated with lower utilization rates. The current findings provide theory-based evidence that contributes to the understanding of the explanatory factors of electronic health record use and suggests future directions for research and practice.

  6. Classification Scheme for Centuries of Reconstructed Streamflow Droughts in Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Stagge, J.; Rosenberg, D. E.

    2017-12-01

    New advances in reconstructing streamflow from tree rings have permitted the reconstruction of flows back to the 1400s or earlier at a monthly, rather than annual, time scale. This is a critical step for incorporating centuries of streamflow reconstructions into water resources planning. Expanding the historical record is particularly important where the observed record contains few of these rare, but potentially disastrous extreme events. We present how a paleo-drought clustering approach was incorporated alongside more traditional water management planning in the Weber River basin, northern Utah. This study used newly developed monthly reconstructions of flow since 1430 CE and defined drought events as flow less than the 50th percentile during at least three contiguous months. Characteristics for each drought event included measures of drought duration, severity, cumulative loss, onset, seasonality, recession rate, and recovery rate. Reconstructed drought events were then clustered by hierarchical clustering to determine distinct drought "types" and the historical event that best represents the centroid of each cluster. The resulting 144 reconstructed drought events in the Weber basin clustered into nine distinct types, of which four were severe enough to potentially require drought management. Using the characteristic drought event for each of the severe drought clusters, water managers were able to estimate system reliability and the historical return frequency for each drought type. Plotting drought duration and severity from centuries of historical reconstructed events alongside observed events and climate change projections further placed recent events into a historical context. 
For example, the drought of record for the Weber River remains the most severe event in the record with regard to minimum flow percentile (1930, 7 years), but is far from the longest event in the longer historical record, where events beginning in 1658 and 1705 both lasted longer than 13 years. The proposed drought clustering approach provides a powerful tool for merging historical reconstructions, observations, and climate change projections in water resources planning, while also providing a framework to make use of valuable and increasingly available tree-ring reconstructions of monthly streamflow.
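
The clustering step described above can be illustrated with a minimal single-linkage implementation. The two-feature events below (duration in months, severity) are invented stand-ins for the study's fuller set of drought characteristics, and the linkage choice is an assumption, not necessarily the one used in the Weber basin analysis.

```python
import math

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def single_linkage(points, k):
    """Agglomerate: repeatedly merge the two closest clusters until k remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(euclid(points[p], points[q])
                        for p in clusters[i] for q in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters

# Hypothetical (duration, severity) summaries for six reconstructed droughts:
events = [(3, 0.2), (4, 0.25), (12, 0.8), (13, 0.9), (6, 0.5), (5, 0.45)]
groups = single_linkage(events, k=3)
```

Each resulting group would then be summarized by its centroid, and the historical event nearest that centroid taken as the representative drought "type" for planning simulations.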

  7. How revealing are insertable loop recorders in pediatrics?

    PubMed

    Frangini, Patricia A; Cecchin, Frank; Jordao, Ligia; Martuscello, Maria; Alexander, Mark E; Triedman, John K; Walsh, Edward P; Berul, Charles I

    2008-03-01

    An insertable loop recorder (ILR) in patients with infrequent syncope or palpitations may be useful for deciding among management strategies, including clinical observation, medical therapy, a pacemaker, or an implantable cardioverter defibrillator (ICD). We sought to determine the diagnostic utility of the Reveal ILR (Medtronic, Inc., Minneapolis, MN, USA) in pediatric patients. We retrospectively reviewed clinical data, indications, findings, and therapeutic decisions in 27 consecutive patients who underwent ILR implantation from 1998 to 2007. The median age was 14.8 years (2-25 years). Indications were syncope in 24 patients and recurrent palpitations in three. Overall, eight patients had structural heart disease (six congenital heart disease, one hypertrophic cardiomyopathy, one Kawasaki disease), five had previously documented ventricular arrhythmias with a negative evaluation including electrophysiology study, and three patients had QT prolongation. Tilt testing was performed in 10 patients, of whom five had neurocardiogenic syncope but recurrent episodes despite medical therapy. After a median of three months (1-20 months), 17 patients presented with symptoms and the ILR memory was analyzed in 16 (no episode was stored in one due to full device memory), showing asystole or transient atrioventricular (AV) block (2), sinus bradycardia (6), or normal sinus rhythm (8). Among asymptomatic patients, 3/10 had intermittent AV block or long pauses, automatically detected and stored by the ILR. In 19 of 20 patients the ILR was diagnostic (95%), and five subsequently underwent pacemaker implantation, while seven patients remained asymptomatic without ILR events. Notably, no life-threatening events were detected. The ILR was explanted in 22 patients after a median of 22 months: two due to pocket infection, 12 for battery depletion, and eight after clear documentation of a nonmalignant arrhythmia. The ILR in pediatrics is a useful adjunct to other diagnostic studies. 
Patient selection is critical as the ILR should not be utilized for malignant arrhythmias. A diagnosis is attained in the majority of symptomatic patients, predominantly bradyarrhythmias including pauses and intermittent AV block.

  8. Analysis of recently digitized continuous seismic data recorded during the March-May, 1980, eruption sequence at Mount St. Helens

    NASA Astrophysics Data System (ADS)

    Moran, S. C.; Malone, S. D.

    2013-12-01

    The May 18, 1980, eruption of Mount St. Helens (MSH) was an historic event, both for society and for the field of volcanology. However, our knowledge of the eruption and the precursory period leading up to it is limited by the fact that most of the data, particularly seismic recordings, were not kept due to severe limitations in the amount of digital data that could be handled and stored using 1980 computer technology. Because of these limitations, only about 900 digital event files have been available for seismic studies of the March-May seismic sequence, out of a total of more than 4,000 events that were counted using paper records. Fortunately, data from a subset of stations were also recorded continuously on a series of 24 analog 14-track IRIG magnetic tapes. We have recently digitized these tapes and time-corrected and cataloged the resultant digital data streams, enabling more in-depth studies of the (almost) complete pre-eruption seismic sequence using modern digital processing techniques. Of the fifteen seismic stations operating near MSH for at least a part of the two months between March 20 and May 18, six stations have relatively complete analog recordings. These recordings have gaps of minutes to days because of radio noise, poor tape quality, or missing tapes. In addition, several other stations have partial records. All stations had short-period vertical-component sensors with very limited dynamic range and unknown response details. Nevertheless, because the stations were at a range of distances and were operated at a range of gains, a variety of earthquake sizes were recorded on scale by at least one station, and therefore a much more complete understanding of the evolution of event types, sizes and character should be achievable. 
In our preliminary analysis of this dataset we have found over 10,000 individual events as recorded on stations 35-40 km from MSH, spanning a recalculated coda-duration magnitude range of ~1.5 to 4.1, including many M < 3.0 events that are not part of the PNSN catalog. The closest stations (2-7 km from the summit) recorded several times as many events as the more remote stations during the times they were operational, although many signals are clipped. We see a range of event types including long-period events, tremor, and occasional volcano-tectonic earthquakes. The latter group includes small volcano-tectonic events that occurred at depths of > 7 km during the crypto-dome intrusion phase, which were recognized in 1980 but not fully described. In our analysis of the hours to days prior to the May 18 eruption, we find no obvious changes in seismicity that could have been interpreted as a short-term precursor to the May 18 eruption initiation. This new dataset is currently being formatted for permanent archiving in the IRIS Data Management Center, where it will be available for anyone to use.
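
The recalculated coda-duration magnitudes mentioned above follow, in general form, an empirical relation between signal duration and epicentral distance. One commonly cited version (the Lee et al., 1972 HYPO71 formula) is sketched below; whether this study used these exact coefficients is an assumption.

```python
import math

def coda_duration_magnitude(duration_s, dist_km):
    """Md from coda duration t (s) and epicentral distance D (km):
    Md = -0.87 + 2.00*log10(t) + 0.0035*D (Lee et al., 1972)."""
    return -0.87 + 2.0 * math.log10(duration_s) + 0.0035 * dist_km

# e.g. a 100-s coda recorded at a station 37 km from MSH:
md = coda_duration_magnitude(100.0, 37.0)
```

Because duration enters through a logarithm, a factor-of-ten change in coda length shifts the magnitude by two units, which is why stations at different gains and distances can together cover the ~1.5 to 4.1 range quoted above.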

  9. Deregulation allows new opportunities for utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, T.

    1996-10-01

    The changes electric utilities face today are both scary and exciting. In the past several years utilities have faced uncertainties that have caused major upheaval in their structures and business processes. There has been an increase in the number of mergers and acquisitions as utilities position themselves for competition. Many utility employees have faced layoffs resulting from reengineering and downsizing. Similar events and uncertainties were faced by the airline and telecommunications industries during their transformations from monopolistic to competitive environments. Even though these events have been difficult and unpleasant, there is a bright side. Today's electric utilities have the opportunity to cash in on some innovative new ideas and technologies.

  10. Cadmium-isotopic evidence for increasing primary productivity during the Late Permian anoxic event

    NASA Astrophysics Data System (ADS)

    Georgiev, Svetoslav V.; Horner, Tristan J.; Stein, Holly J.; Hannah, Judith L.; Bingen, Bernard; Rehkämper, Mark

    2015-01-01

    Earth's most extreme extinction event near the end of the Late Permian decimated more than 90% of all extant marine species. Widespread and intensive oceanic anoxia almost certainly contributed to the catastrophe, though the driving mechanisms that sustained such conditions are still debated. Of particular interest is whether water column anoxia was a consequence of a 'stagnant ocean', or if it was controlled by increases in nutrient supply, primary productivity, and subsequent heterotrophic respiration. Testing these competing hypotheses requires deconvolving sedimentary/bottom water redox conditions from changes in surface water productivity in marine sediments. We address this issue by studying marine shales from East Greenland and the mid-Norwegian shelf and combining sedimentary redox proxies with cadmium-isotopic analyses. Sedimentary nitrogen-isotopic data, pyrite framboid analyses, and organic and inorganic shale geochemistry reveal sulfidic conditions with vigorous upwelling, and increasingly anoxic conditions with a strengthening upwelling, in the Greenland and Norwegian sections, respectively. Detailed analysis of sedimentary metal budgets illustrates that Cd is primarily associated with organic carbon and records primary geochemical signatures, thus enabling reconstruction of surface water nutrient utilization. Cadmium-isotopic analyses of the authigenic shale fraction released by inverse aqua regia digestion yield an average δ114/110Cd of +0.15 ± 0.01‰ (2 SE, n = 12; relative to NIST SRM 3108), indicative of incomplete surface water nutrient utilization up-section. The constant degree of nutrient utilization combined with strong upwelling requires increasing primary productivity - and not oceanic stagnation - to balance the larger nutrient fluxes to both study sites during the development of the Late Permian water column anoxia. 
Overall, our data illustrate that if bottom water redox and upwelling can be adequately constrained, Cd-isotopic analyses of organic-rich sediments can be used to provide valuable information on nutrient utilization and therefore past productivity.
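
The δ114/110Cd values quoted above use standard delta notation: the per-mil deviation of a sample's 114Cd/110Cd ratio from that of a reference standard (here NIST SRM 3108). A sketch of the conversion, with invented ratios for illustration:

```python
def delta_permil(ratio_sample, ratio_standard):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

# a sample whose 114Cd/110Cd ratio is 0.015% above the standard's
# corresponds to delta = +0.15 per mil (numbers invented):
d114cd = delta_permil(1.00015, 1.00000)
```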

  11. ldentifying Episodes of Earth Science Phenomena Using a Big-Data Technology

    NASA Technical Reports Server (NTRS)

    Kuo, Kwo-Sen; Oloso, Amidu; Rushing, John; Lin, Amy; Fekete, Gyorgy; Ramachandran, Rahul; Clune, Thomas; Dunny, Daniel

    2014-01-01

    A significant portion of Earth Science investigations is phenomenon- (or event-) based, such as the studies of Rossby waves, volcano eruptions, tsunamis, mesoscale convective systems, and tropical cyclones. However, except for a few high-impact phenomena, e.g. tropical cyclones, comprehensive records are absent for the occurrences or events of these phenomena. Phenomenon-based studies therefore often focus on a few prominent cases while the lesser ones are overlooked. Without an automated means to gather the events, comprehensive investigation of a phenomenon is at least time-consuming if not impossible. We have constructed a prototype Automated Event Service (AES) system that is used to methodically mine custom-defined events in the reanalysis data sets of atmospheric general circulation models. Our AES will enable researchers to specify their custom, numeric event criteria using a user-friendly web interface to search the reanalysis data sets. Moreover, we have included a social component to enable dynamic formation of collaboration groups for researchers to cooperate on event definitions of common interest and for the analysis of these events. An Earth Science event (ES event) is defined here as an episode of an Earth Science phenomenon (ES phenomenon). A cumulus cloud, a thunderstorm shower, a rogue wave, a tornado, an earthquake, a tsunami, a hurricane, or an El Nino, is each an episode of a named ES phenomenon, and, from the small and insignificant to the large and potent, all are examples of ES events. An ES event has a duration (often finite) and an associated geo-location as a function of time; it's therefore an entity embedded in four-dimensional (4D) spatiotemporal space. Earth Science phenomena with the potential to cause massive economic disruption or loss of life often rivet the attention of researchers. But, broader scientific curiosity also drives the study of phenomena that pose no immediate danger, such as land/sea breezes. 
    Due to the Earth System's intricate dynamics, we are continuously discovering novel ES phenomena. We generally gain understanding of a given phenomenon by observing and studying individual events. This process usually begins by identifying the occurrences of these events. Once representative events are identified or found, we must locate associated observed or simulated data prior to commencing analysis and concerted studies of the phenomenon. Knowledge concerning the phenomenon can accumulate only after analysis has started. However, as mentioned previously, comprehensive records only exist for a very limited set of high-impact phenomena; aside from these, finding events and locating associated data currently may take a prohibitive amount of time and effort on the part of an individual investigator. The lack of comprehensive records for most ES phenomena is mainly due to the perception that they do not pose immediate and/or severe threats to life and property. Thus they are not consistently tracked, monitored, and catalogued. Many phenomena even lack precise and/or commonly accepted criteria for definition. Moreover, various Earth Science observations and data have accumulated to a previously unfathomable volume; the NASA Earth Observing System Data Information System (EOSDIS) alone archives several petabytes (PB) of satellite remote sensing data, and this volume is steadily increasing. All of these factors contribute to the difficulty of methodically identifying events corresponding to a given phenomenon and significantly impede systematic investigations. We have not only envisioned AES as an environment for identifying custom-defined events but also aspired for it to be an interactive environment with quick turnaround time for revisions of query criteria and results, as well as a collaborative environment where geographically distributed experts may work together on the same phenomena. 
A Big Data technology is thus required for the realization of such a system. In the following, we first introduce the technology selected for AES in the next section. We then demonstrate the utility of AES using a use case, Blizzard, before we conclude.
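
The "custom, numeric event criteria" idea can be illustrated with a toy scan for contiguous runs of time steps exceeding a threshold at a single grid cell. The criterion (value above a threshold for a minimum number of steps) and the data are invented; a real AES query would apply such a definition across 4D reanalysis fields.

```python
def find_events(series, threshold, min_steps):
    """Return (start, end) index pairs of runs where series > threshold
    for at least min_steps consecutive time steps."""
    events, start = [], None
    for t, v in enumerate(series):
        if v > threshold and start is None:
            start = t                          # run begins
        elif v <= threshold and start is not None:
            if t - start >= min_steps:
                events.append((start, t - 1))  # run long enough: keep it
            start = None
    if start is not None and len(series) - start >= min_steps:
        events.append((start, len(series) - 1))
    return events

# hypothetical daily snowfall-rate proxy at one grid cell:
cell = [0, 1, 5, 6, 7, 1, 0, 4, 4, 4, 0]
candidates = find_events(cell, threshold=3, min_steps=2)  # → [(2, 4), (7, 9)]
```

A blizzard-style query, as in the use case named below, would combine several such per-variable criteria (snowfall, wind speed, visibility) before an episode qualifies as an event.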

  12. Simulating Glacial Outburst Lake Releases for Suicide Basin, Mendenhall Glacier, Juneau, Alaska

    NASA Astrophysics Data System (ADS)

    Jacobs, A. B.; Moran, T.; Hood, E. W.

    2017-12-01

    Glacial lake outbursts from Suicide Basin are a recent phenomenon, first characterized in 2011. The 2014 event resulted in record river stage and moderate flooding on the Mendenhall River in Juneau. Recognizing that these events can adversely impact residential areas of Juneau's Mendenhall Valley, the Alaska-Pacific River Forecast Center developed a real-time modeling technique capable of forecasting the timing and magnitude of the flood-wave crest due to releases from Suicide Basin. The 2014 event was estimated at about 37,000 acre-feet, with water levels cresting within 36 hours from the time the flood wave hit Mendenhall Lake. Given the magnitude of possible impacts to the public, accurate hydrological forecasting is essential for public safety and emergency managers. However, the data needed to effectively forecast the magnitudes of specific jökulhlaup events are limited. Estimating this event as related to river stage depended upon three variables: 1) the timing of the lag between Suicide Basin water level declines and the related rise of Mendenhall Lake, 2) continuous monitoring of Mendenhall Lake water levels, and 3) estimating the total water volume stored in Suicide Basin. Real-time modeling of the event utilized a time-of-concentration hydrograph with independent power equations representing the rising and falling limbs of the hydrograph. The model, as forecasted about 24 hours prior to crest, estimated a crest within 0.5 feet of the actual value, with a timing error of about six hours later than the actual crest.
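
The two-limb hydrograph can be sketched as a pair of power laws joined at the crest. The exponents, crest time, and peak discharge below are invented for illustration and are not the Forecast Center's fitted values.

```python
def outburst_hydrograph(t_hr, t_crest, q_peak, rise_exp=2.0, fall_exp=0.5):
    """Discharge at time t (hours): one power law rising to the crest,
    another receding after it (all coefficients illustrative)."""
    if t_hr <= 0.0:
        return 0.0
    if t_hr <= t_crest:
        return q_peak * (t_hr / t_crest) ** rise_exp   # rising limb
    return q_peak * (t_crest / t_hr) ** fall_exp       # falling limb

# a 36-hour rise to a hypothetical 12,000 cfs crest, sampled every 6 hours:
hydro = [outburst_hydrograph(t, 36.0, 12000.0) for t in range(0, 73, 6)]
```

Fitting the two exponents independently lets the model capture the sharp rise and slow recession typical of outburst floods, which a single symmetric curve would miss.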

  13. Analysis of event data recorder data for vehicle safety improvement

    DOT National Transportation Integrated Search

    2008-04-01

    The Volpe Center performed a comprehensive engineering analysis of Event Data Recorder (EDR) data supplied by the National Highway Traffic Safety Administration (NHTSA) to assess its accuracy and usefulness in crash reconstruction and improvement of ...

  14. The natural mathematics of behavior analysis.

    PubMed

    Li, Don; Hautus, Michael J; Elliffe, Douglas

    2018-04-19

    Models that generate event records have very general scope regarding the dimensions of the target behavior that we measure. From a set of predicted event records, we can generate predictions for any dependent variable that we could compute from the event records of our subjects. In this sense, models that generate event records permit us a freely multivariate analysis. To explore this proposition, we conducted a multivariate examination of Catania's Operant Reserve on single VI schedules in transition using a Markov Chain Monte Carlo scheme for Approximate Bayesian Computation. Although we found systematic deviations between our implementation of Catania's Operant Reserve and our observed data (e.g., mismatches in the shape of the interresponse time distributions), the general approach that we have demonstrated represents an avenue for modelling behavior that transcends the typical constraints of algebraic models. © 2018 Society for the Experimental Analysis of Behavior.
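
The general recipe, simulating event records from candidate parameters and keeping those whose summaries match the observed data, can be sketched with plain rejection ABC (the paper's Markov Chain Monte Carlo scheme is more sophisticated). The exponential inter-response-time model and all numbers here are illustrative assumptions, not Catania's Operant Reserve.

```python
import random

random.seed(1)
observed_mean_irt = 2.0            # observed mean inter-response time (s)

def simulate_mean_irt(rate, n=200):
    """Simulate n inter-response times from a candidate response rate."""
    return sum(random.expovariate(rate) for _ in range(n)) / n

accepted = []
while len(accepted) < 50:
    rate = random.uniform(0.05, 5.0)                    # draw from a flat prior
    if abs(simulate_mean_irt(rate) - observed_mean_irt) < 0.1:
        accepted.append(rate)                           # summary close enough: keep

posterior_mean = sum(accepted) / len(accepted)          # near 1 / 2.0 = 0.5
```

Because the simulator emits full event records, any summary computable from subjects' event records (interresponse-time distributions, response rates, bout structure) can serve as the matching statistic, which is the "freely multivariate" point made above.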

  15. Attributing Historical Changes in Probabilities of Record-Breaking Daily Temperature and Precipitation Extreme Events

    DOE PAGES

    Shiogama, Hideo; Imada, Yukiko; Mori, Masato; ...

    2016-08-07

    Here, we describe two unprecedentedly large (100-member), long-term (61-year) ensembles based on MRI-AGCM3.2, which were driven by historical and non-warming climate forcing. These ensembles comprise the "Database for Policy Decision making for Future climate change (d4PDF)". We compare these ensembles to large ensembles based on another climate model, as well as to observed data, to investigate the influence of anthropogenic activities on historical changes in the numbers of record-breaking events, including the annual coldest daily minimum temperature (TNn), the annual warmest daily maximum temperature (TXx) and the annual most intense daily precipitation event (Rx1day). These two climate model ensembles indicate that human activity has already had statistically significant impacts on the number of record-breaking extreme events worldwide, mainly over Northern Hemisphere land. Specifically, human activities have altered the likelihood that a wider area globally would suffer record-breaking TNn, TXx and Rx1day events than that observed over the 2001-2010 period by a factor of at least 0.6, 5.4 and 1.3, respectively. However, we also find that the estimated spatial patterns and amplitudes of anthropogenic impacts on the probabilities of record-breaking events are sensitive to the climate model and/or natural-world boundary conditions used in the attribution studies.
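
The record-breaking statistic used above can be computed directly from an annual series: a year sets a new record when it exceeds (or, for TNn, falls below) every earlier year. A sketch with an invented TXx series:

```python
def count_records(series, kind="high"):
    """Count new records set after the first year of an annual series."""
    count, best = 0, series[0]
    for v in series[1:]:
        if (kind == "high" and v > best) or (kind == "low" and v < best):
            count, best = count + 1, v   # new record: bump count, reset benchmark
    return count

txx = [34.1, 33.0, 35.2, 35.0, 36.4, 34.8, 36.9]   # hypothetical annual TXx, deg C
warm_records = count_records(txx)                   # three new warm records
cold_records = count_records(txx, kind="low")       # one new cold record
```

In a stationary climate the expected number of records after the first of n years is 1/2 + 1/3 + ... + 1/n; attribution studies like this one compare observed and forced-ensemble record counts against that kind of no-trend baseline.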

  16. South America Monsoon variability on millennial to multi-centennial time scale during the Holocene in central eastern Brazil

    NASA Astrophysics Data System (ADS)

    Strikis, N. M.; Cruz, F. W.; Cheng, H.; Karmann, I.; Vuille, M.; Edwards, R.; Wang, X.; Paula, M. S.; Novello, V. F.; Auler, A.

    2011-12-01

    A paleoprecipitation reconstruction based on high-resolution, well-dated speleothem oxygen isotope records shows that monsoon precipitation over central eastern Brazil underwent strong variations on millennial to multi-centennial time scales during the Holocene. This new record indicates that abrupt events of increased monsoon precipitation are correlated with Bond events 6, 5 and 4, and also with the 8.2 ky event, during the early and mid-Holocene, with a mean amplitude of 1.5‰ (PDB). The pacing and structure of such events are generally consistent with variations in solar activity suggested by atmospheric Δ14C records. In the late Holocene, abrupt events of increased monsoon precipitation peaking at 3.2, 2.7 and 2.3 ky B.P. are approximately synchronous with periods of low solar activity. The most prominent event of the late Holocene occurred at ~2.7 ky B.P. In addition, these positive precipitation anomalies recorded in central eastern Brazil are also in good agreement with variations in the level of Lake Titicaca. The good correspondence between the speleothem and marine records implies that variations in North Atlantic sea surface temperature are the main forcing for abrupt millennial to multi-centennial precipitation variations within the region under the influence of the South American Monsoon.

  17. Utilization of Subcutaneous Methotrexate in Rheumatoid Arthritis Patients After Failure or Intolerance to Oral Methotrexate: A Multicenter Cohort Study.

    PubMed

    Branco, Jaime C; Barcelos, Anabela; de Araújo, Filipe Pombo; Sequeira, Graça; Cunha, Inês; Patto, José Vaz; Oliveira, Margarida; Mateus, Margarida Pratas; Couto, Maura; Nero, Patrícia; Pinto, Patrícia; Monteiro, Paulo; Castelão, Walter; Félix, Jorge; Ferreira, Diana; Almeida, João; Silva, Maria João

    2016-01-01

    Low-dose weekly methotrexate (MTX) is the mainstay of therapy for rheumatoid arthritis (RA). It can be given via the oral, intramuscular or subcutaneous (SC) route. This study sought to determine the real-world pattern of treatment with SC MTX in Portuguese adult patients with active RA. Utilization of Metoject(®) in Rheumatoid Arthritis (UMAR) was a non-interventional, multicenter cohort study with retrospective data collection. Eligible patients had active RA, were at least 18 years of age, and started SC MTX treatment in 2009 or 2010 after failure of or intolerance to oral MTX. Data were collected from patients' clinical records. Both non-parametric and parametric survival methods were used to obtain a detailed understanding of SC MTX treatment duration. Fifty patients were included, of whom only 9 discontinued SC MTX during the study follow-up period. The probability of discontinuing SC MTX after 1, 2, and 3 years of treatment is estimated at 6.10%, 8.50%, and 23.20%, respectively. The extrapolated median duration of SC MTX using an exponential model was 106.4 months (8.87 years). The mean dose of SC MTX was 18.36 mg. The reasons for treatment discontinuation were the occurrence of adverse events in six patients and lack of efficacy in three. The long treatment duration of SC MTX highlights its excellent tolerability compared to oral MTX, especially concerning the frequent adverse gastrointestinal events of MTX. Furthermore, a long MTX treatment duration provides the opportunity to postpone or even avoid expensive therapies with biologics. The results obtained from the UMAR study provide important information for the utilization and public financing of SC MTX in Portugal.
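
Under the exponential survival model mentioned above, the discontinuation hazard λ is constant, so the median treatment duration is ln 2 / λ. A sketch of that back-calculation; the 8% input below is an invented example, not the study's fitted parameter.

```python
import math

def exponential_median(p_discontinued, t):
    """Median survival time given the fraction discontinued by time t,
    assuming S(t) = exp(-lambda * t)."""
    lam = -math.log(1.0 - p_discontinued) / t   # solve S(t) for the hazard
    return math.log(2.0) / lam                  # median = ln 2 / lambda

median_months = exponential_median(0.08, 12.0)  # ~100 months if 8% stop in year 1
```

This is how a median far beyond the observed follow-up (here, extrapolated to 106.4 months) can be reported from a cohort in which only 9 of 50 patients discontinued.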

  18. Results from a Nationwide Cohort Temporary Utilization Authorization (ATU) survey of patients in france treated with Pheburane(®) (Sodium Phenylbutyrate) taste-masked granules.

    PubMed

    Kibleur, Yves; Dobbelaere, Dries; Barth, Magalie; Brassier, Anaïs; Guffon, Nathalie

    2014-10-01

    The aim of this study was to describe a nationwide system for pre-marketing follow-up (cohort temporary utilization authorization [ATU] protocol; i.e., 'therapeutic utilization') of a new taste-masked formulation of sodium phenylbutyrate (NaPB) granules (Pheburane(®)) in France and to analyze safety and efficacy in this treated cohort of patients with urea cycle disease (UCD). In October 2012, a cohort ATU was established in France to monitor the use of Pheburane(®) on a named-patient basis. All treated UCD patients were included in a follow-up protocol developed by the Laboratory (Lucane Pharma) and the French Medicines Agency (ANSM), which recorded demographics, dosing characteristics of NaPB, concomitant medications, adverse events, and clinical outcome during the period of treatment. Following the granting of the Marketing Authorization in Europe, the cohort ATU was terminated approximately 1 year after its initiation, as the product was launched on the French market. The ease of administration and acceptability were much better with the new taste-masked formulation than with the previous treatment. No episodes of metabolic decompensation were observed over a treatment period ranging from 3 to 11 months with Pheburane(®) and the range of ammonia and glutamine plasma levels improved and remained within the normal range. In all, no adverse events were reported with Pheburane(®) treatment. The recently developed taste-masked formulation of NaPB granules improved the quality of life for UCD patients. This may translate into improved compliance, efficacy, and safety, which may be demonstrated either in further studies or in the post-marketing use of the product.

  19. 78 FR 17749 - Parts and Accessories Necessary for Safe Operation; Exemption Renewal for Greyhound Lines, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-22

    ... placement of video event recorders at the top of the windshields on its buses. Greyhound may continue to use the video event recorders to increase safety through (1) Identification and remediation of risky... applied for an exemption from 49 CFR 393.60(e)(1) to allow it to install video event [[Page 17750...

  20. Proxy records of Holocene storm events in coastal barrier systems: Storm-wave induced markers

    NASA Astrophysics Data System (ADS)

    Goslin, Jérôme; Clemmensen, Lars B.

    2017-10-01

    Extreme storm events in the coastal zone are one of the main forcing agents of short-term coastal system behavior. As such, storms represent a major threat to human activities concentrated along coasts worldwide. In order to better understand the frequency of extreme events like storms, climate science must rely on records longer than the century-scale instrumental weather record. Proxy records of storm-wave or storm-wind induced activity in coastal barrier system deposits have been widely used worldwide in recent years to document past storm events during the last millennia. This review provides a detailed state-of-the-art compilation of the proxies available from coastal barrier systems to reconstruct Holocene storm chronologies (paleotempestology). The present paper aims (i) to describe the erosional and depositional processes caused by storm-wave action in barrier and back-barrier systems (i.e. beach ridges, storm scarps and washover deposits), (ii) to understand how storm records can be extracted from barrier and back-barrier sedimentary bodies using stratigraphical, sedimentological, micro-paleontological and geochemical proxies and (iii) to show how to obtain chronological control on past storm events recorded in the sedimentary successions. The challenges that paleotempestology studies still face in the reconstruction of representative and reliable storm chronologies using these various proxies are discussed, and future research prospects are outlined.

  1. Spike detection, characterization, and discrimination using feature analysis software written in LabVIEW.

    PubMed

    Stewart, C M; Newlands, S D; Perachio, A A

    2004-12-01

    Rapid and accurate discrimination of single units from extracellular recordings is a fundamental process for the analysis and interpretation of electrophysiological recordings. We present an algorithm that performs detection, characterization, discrimination, and analysis of action potentials from extracellular recording sessions. The program was entirely written in LabVIEW (National Instruments), and requires no external hardware devices or a priori information about action potential shapes. Waveform events are detected by scanning the digital record for voltages that exceed a user-adjustable trigger. Detected events are characterized to determine nine different time and voltage levels for each event. Various algebraic combinations of these waveform features are used as axis choices for 2-D Cartesian plots of events. The user selects axis choices that generate distinct clusters. Multiple clusters may be defined as action potentials by manually generating boundaries of arbitrary shape. Events defined as action potentials are validated by visual inspection of overlain waveforms. Stimulus-response relationships may be identified by selecting any recorded channel for comparison to continuous and average cycle histograms of binned unit data. The algorithm includes novel aspects of feature analysis and acquisition, including higher acquisition rates for electrophysiological data compared to other channels. The program confirms that electrophysiological data may be discriminated with high speed and efficiency using algebraic combinations of waveform features derived from high-speed digital records.
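The trigger-scan-plus-features pipeline described in this abstract can be sketched as follows. This is not the LabVIEW program itself: the window length and the two features below (peak voltage, time-to-peak) are illustrative stand-ins for the nine time/voltage levels the program derives.

```python
def detect_events(samples, trigger, window=32):
    """Scan the digital record and collect a fixed-length snippet at each
    rising crossing of a user-adjustable trigger voltage."""
    events, i = [], 1
    while i < len(samples):
        if samples[i - 1] < trigger <= samples[i]:
            events.append(samples[i:i + window])
            i += window          # skip past this event before re-arming
        else:
            i += 1
    return events

def waveform_features(snippet):
    """Two example per-event features; algebraic combinations of such
    features serve as axis choices for the 2-D cluster plots."""
    peak = max(snippet)
    return {"peak_v": peak, "t_peak": snippet.index(peak)}

# Synthetic record: flat baseline with one spike
record = [0.0] * 10 + [0.2, 1.0, 0.6, 0.1] + [0.0] * 20
spikes = detect_events(record, trigger=0.5)
```

Plotting, say, `peak_v` against a width measure for many events would reproduce the cluster-cutting step, with the user drawing boundaries around the clusters deemed to be single units.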

  2. A comparison of earthquake backprojection imaging methods for dense local arrays

    NASA Astrophysics Data System (ADS)

    Beskardes, G. D.; Hole, J. A.; Wang, K.; Michaelides, M.; Wu, Q.; Chapman, M. C.; Davenport, K. K.; Brown, L. D.; Quiros, D. A.

    2018-03-01

    Backprojection imaging has recently become a practical method for local earthquake detection and location due to the deployment of densely sampled, continuously recorded, local seismograph arrays. While backprojection sometimes utilizes the full seismic waveform, the waveforms are often pre-processed and simplified to overcome imaging challenges. Real data issues include aliased station spacing, inadequate array aperture, inaccurate velocity model, low signal-to-noise ratio, large noise bursts and varying waveform polarity. We compare the performance of backprojection with four previously used data pre-processing methods: raw waveform, envelope, short-term averaging/long-term averaging and kurtosis. Our primary goal is to detect and locate events smaller than noise by stacking prior to detection to improve the signal-to-noise ratio. The objective is to identify an optimized strategy for automated imaging that is robust in the presence of real-data issues, has the lowest signal-to-noise thresholds for detection and for location, has the best spatial resolution of the source images, preserves magnitude, and considers computational cost. Imaging method performance is assessed using a real aftershock data set recorded by the dense AIDA array following the 2011 Virginia earthquake. Our comparisons show that raw-waveform backprojection provides the best spatial resolution, preserves magnitude and boosts signal to detect events smaller than noise, but is most sensitive to velocity error, polarity error and noise bursts. On the other hand, the other methods avoid polarity error and reduce sensitivity to velocity error, but sacrifice spatial resolution and cannot effectively reduce noise by stacking. Of these, only kurtosis is insensitive to large noise bursts while being as efficient as the raw-waveform method to lower the detection threshold; however, it does not preserve the magnitude information. 
For automatic detection and location of events in a large data set, we therefore recommend backprojecting kurtosis waveforms, followed by a second pass on the detected events using noise-filtered raw waveforms to achieve the best of all criteria.
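Two of the pre-processing transforms compared above can be sketched as characteristic functions of a waveform. This is a minimal illustration, not the study's implementation; the window lengths are arbitrary choices.

```python
import statistics

def sta_lta(x, ns=5, nl=50):
    """Short-term/long-term average ratio of absolute amplitude;
    rises when recent energy exceeds the background level."""
    out = [0.0] * nl
    for i in range(nl, len(x)):
        sta = sum(abs(v) for v in x[i - ns:i]) / ns
        lta = sum(abs(v) for v in x[i - nl:i]) / nl
        out.append(sta / lta if lta else 0.0)
    return out

def sliding_kurtosis(x, n=50):
    """Trailing-window sample kurtosis; spikes on impulsive onsets and,
    unlike the raw waveform, is insensitive to polarity."""
    out = [0.0] * n
    for i in range(n, len(x)):
        w = x[i - n:i]
        m = statistics.fmean(w)
        var = sum((v - m) ** 2 for v in w) / n
        m4 = sum((v - m) ** 4 for v in w) / n
        out.append(m4 / (var * var) if var else 0.0)
    return out
```

Backprojecting either function instead of the raw trace trades the spatial resolution and magnitude information of full waveforms for robustness to polarity flips and velocity-model error, which is the trade-off the comparison quantifies.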

  3. A pilot study of an mHealth application for healthcare workers: poor uptake despite high reported acceptability at a rural South African community-based MDR-TB treatment program.

    PubMed

    Chaiyachati, Krisda H; Loveday, Marian; Lorenz, Stephen; Lesh, Neal; Larkan, Lee-Megan; Cinti, Sandro; Friedland, Gerald H; Haberer, Jessica E

    2013-01-01

    As the South African province of KwaZulu-Natal addresses a growing multidrug-resistant tuberculosis (MDR-TB) epidemic by shifting care and treatment from trained specialty centers to community hospitals, delivering and monitoring MDR-TB therapy has presented new challenges. In particular, tracking and reporting adverse clinical events have been difficult for mobile healthcare workers (HCWs), trained health professionals who travel daily to patient homes to administer and monitor therapy. We designed and piloted a mobile phone application (Mobilize) for mobile HCWs that electronically standardized the recording and tracking of MDR-TB patients on low-cost, functional phones. We assess the acceptability and feasibility of using Mobilize to record and submit adverse events forms weekly during the intensive phase of MDR-TB therapy and evaluate mobile HCW perceptions throughout the pilot period. All five mobile HCWs at one site were trained and provided with phones. In a mixed-methods evaluation, mobile HCWs' usage patterns were tracked electronically for seven months and analyzed. Qualitative focus groups and questionnaires were designed to understand the impact of mobile phone technology on the work environment. Mobile HCWs submitted nine of 33 (27%) expected adverse events forms, conflicting with qualitative results in which mobile HCWs stated that Mobilize improved adverse events communication, helped their daily workflow, and could be successfully expanded to other health interventions. When presented with the conflict between their expressed views and actual practice, mobile HCWs cited forgetfulness and believed patients should take more responsibility for their own care. This pilot experience demonstrated poor uptake by HCWs despite positive responses to using mHealth. 
Though our results should be interpreted cautiously because of the small number of mobile HCWs and MDR-TB patients in this study, we recommend carefully exploring the motivations of HCWs and technologic enhancements prior to scaling new mHealth initiatives in resource poor settings.

  4. Processing circuitry for single channel radiation detector

    NASA Technical Reports Server (NTRS)

    Holland, Samuel D. (Inventor); Delaune, Paul B. (Inventor); Turner, Kathryn M. (Inventor)

    2009-01-01

    Processing circuitry is provided for a high voltage operated radiation detector. An event detector utilizes a comparator configured to produce an event signal based on a leading edge threshold value. A preferred event detector does not produce another event signal until a trailing edge threshold value is satisfied. The event signal can be utilized for counting the number of particle hits and also for controlling data collection operation for a peak detect circuit and timer. The leading edge threshold value is programmable such that it can be reprogrammed by a remote computer. A digital high voltage control is preferably operable to monitor and adjust high voltage for the detector.
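The comparator behavior described (trigger on a leading-edge threshold, no new event until a trailing-edge threshold is satisfied) is a standard hysteresis pattern. A software sketch of the same logic, with illustrative threshold values rather than the circuit's programmed ones:

```python
def hysteresis_events(samples, leading, trailing):
    """Emit one event per pulse: fire when the signal reaches the
    programmable leading-edge threshold, then re-arm only after it falls
    back to the trailing-edge threshold, so a noisy peak cannot
    double-count as multiple particle hits."""
    events, armed = [], True
    for i, v in enumerate(samples):
        if armed and v >= leading:
            events.append(i)     # particle hit: start peak-detect/timer capture
            armed = False
        elif not armed and v <= trailing:
            armed = True         # trailing-edge condition met; re-arm
    return events
```

In the hardware version the event signal simultaneously increments a hit counter and gates the peak-detect circuit and timer; here the event index stands in for both roles.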

  5. Finding the signal in the noise: Could social media be utilized for early hospital notification of multiple casualty events?

    PubMed Central

    Moore, Sara; Wakam, Glenn; Hubbard, Alan E.; Cohen, Mitchell J.

    2017-01-01

    Introduction Delayed notification and lack of early information hinder timely hospital-based activations in large-scale multiple casualty events. We hypothesized that Twitter real-time data would produce a unique and reproducible signal within minutes of multiple casualty events, and we investigated the timing of the signal compared with other hospital disaster notification mechanisms. Methods Using disaster-specific search terms, all relevant tweets from the event to 7 days post-event were analyzed for 5 recent US-based multiple casualty events (Boston Bombing [BB], SF Plane Crash [SF], Napa Earthquake [NE], Sandy Hook [SH], and Marysville Shooting [MV]). Quantitative and qualitative analyses of tweet utilization were compared across events. Results Over 3.8 million tweets were analyzed (SH 1.8M, BB 1.1M, SF 430k, MV 250k, NE 205k). Peak tweets per minute ranged from 209 to 3326. The mean followers per tweeter ranged from 3382 to 9992 across events. Retweets were tweeted a mean of 82 to 564 times per event. Tweets occurred very rapidly for all events (<2 min) and reached 1% of the total event-specific tweets within a median of 13 minutes of the first 911 calls. A 200 tweets/min threshold was reached fastest with NE (2 min), BB (7 min), and SF (18 min). If this threshold were utilized as a signaling mechanism to place local hospitals on standby for possible large-scale events, in all case studies this signal would have preceded patient arrival. Importantly, this signaling threshold would also have preceded traditional disaster notification mechanisms in SF and NE, and would have been simultaneous with them in BB and MV. Conclusions Social media data demonstrate that this mechanism is a powerful, predictable, and potentially important resource for optimizing disaster response. Further investigation is warranted to assess the utility of prospective signaling thresholds for hospital-based activation. PMID:28982201
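The 200 tweets/min standby trigger examined above amounts to a sliding one-minute count over tweet timestamps. A minimal sketch (the function name and return convention are illustrative, not the study's code):

```python
from collections import deque

def first_threshold_time(tweet_times_sec, threshold=200):
    """Return the earliest time (seconds since the event) at which the
    trailing one-minute tweet count reaches `threshold`, or None if it
    never does."""
    window = deque()
    for t in sorted(tweet_times_sec):
        window.append(t)
        while window[0] <= t - 60:   # drop tweets older than one minute
            window.popleft()
        if len(window) >= threshold:
            return t
    return None
```

A hospital-side monitor would compare this time against first patient arrival and against traditional notification channels, which is exactly the comparison the Results section reports.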

  6. Build your own low-cost seismic/bathymetric recorder annotator

    USGS Publications Warehouse

    Robinson, W.

    1994-01-01

    An inexpensive programmable annotator, completely compatible with at least three models of widely used graphic recorders (Raytheon LSR-1811, Raytheon LSR-1807 M, and EDO 550), has been developed to automatically write event marks and print up to sixteen numbers on the paper record. Event mark and character printout intervals, character height, and character position are all selectable with front-panel switches. Operation is completely compatible with recorders running in either continuous or start-stop mode. © 1994.

  7. Reduced high-density lipoprotein cholesterol: A valuable, independent prognostic marker in peripheral arterial disease.

    PubMed

    Martinez-Aguilar, Esther; Orbe, Josune; Fernández-Montero, Alejandro; Fernández-Alonso, Sebastián; Rodríguez, Jose A; Fernández-Alonso, Leopoldo; Páramo, Jose A; Roncal, Carmen

    2017-11-01

    The prognosis of patients with peripheral arterial disease (PAD) is characterized by an exceptionally high risk for myocardial infarction, ischemic stroke, and death; however, studies in search of new prognostic biomarkers in PAD are scarce. Even though low levels of high-density lipoprotein cholesterol (HDL-C) have been associated with higher risk of cardiovascular (CV) complications and death in different atherosclerotic diseases, recent epidemiologic studies have challenged its prognostic utility. The aim of this study was to test the predictive value of HDL-C as a risk factor for ischemic events or death in symptomatic PAD patients. Clinical and demographic parameters of 254 symptomatic PAD patients were recorded. Amputation, ischemic coronary disease, cerebrovascular disease, and all-cause mortality were recorded during a mean follow-up of 2.7 years. Multivariate analyses showed that disease severity (critical limb ischemia) was significantly reduced in patients with normal HDL-C levels compared with the group with low HDL-C levels (multivariate analysis odds ratio, 0.09; 95% confidence interval [CI], 0.03-0.24). A decreased risk for mortality (hazard ratio, 0.46; 95% CI, 0.21-0.99) and major adverse CV events (hazard ratio, 0.38; 95% CI, 0.16-0.86) was also found in patients with normal vs reduced levels of HDL-C in both Cox proportional hazards models and Kaplan-Meier estimates, after adjustment for confounding factors. Reduced HDL-C levels were significantly associated with higher risk for development of CV complications as well as with mortality in PAD patients. These findings highlight the usefulness of this simple test for early identification of PAD patients at high risk for development of major CV events. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  8. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength, and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests; thus, knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
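The radial-extrapolation step can be sketched as follows. The 1/r geometric-spreading term plus an exponential attenuation factor, and its length scale, are illustrative assumptions for the sketch, not the network's calibrated propagation model.

```python
import math

def source_strength(received, distance_km, atten_km=10000.0):
    """Undo geometric dispersion (1/r) and environmental attenuation
    (exp(-r/atten_km)) to estimate a strike's original signal strength
    from what one detector received."""
    return received * distance_km * math.exp(distance_km / atten_km)

def event_strength(readings):
    """Combine per-detector extrapolations; `readings` holds
    (received_signal, distance_km) pairs for the sites that saw the event."""
    return sum(source_strength(s, d) for s, d in readings) / len(readings)
```

The same propagation model, run forward from a hypothetical strike to each site's minimum detectable signal, is what bounds the detection efficiency of a sparse grid.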

  9. Resolution and Trade-offs in Finite Fault Inversions for Large Earthquakes Using Teleseismic Signals (Invited)

    NASA Astrophysics Data System (ADS)

    Lay, T.; Ammon, C. J.

    2010-12-01

    An unusually large number of widely distributed great earthquakes have occurred in the past six years, with extensive data sets of teleseismic broadband seismic recordings being available in near-real time for each event. Numerous research groups have implemented finite-fault inversions that utilize the rapidly accessible teleseismic recordings, and slip models are regularly determined and posted on websites for all major events. The source inversion validation project has already demonstrated that for events of all sizes there is often significant variability in models for a given earthquake. Some of these differences can be attributed to variations in data sets and procedures used for including signals with very different bandwidth and signal characteristics into joint inversions. Some differences can also be attributed to choice of velocity structure and data weighting. However, our experience is that some of the primary causes of solution variability involve rupture model parameterization and imposed kinematic constraints such as rupture velocity and subfault source time function description. In some cases it is viable to rapidly perform separate procedures such as teleseismic array back-projection or surface wave directivity analysis to reduce the uncertainties associated with rupture velocity, and it is possible to explore a range of subfault source parameterizations to place some constraints on which model features are robust. In general, many such tests are performed, but not fully described, with single model solutions being posted or published, with limited insight into solution confidence being conveyed. Using signals from recent great earthquakes in the Kuril Islands, Solomon Islands, Peru, Chile and Samoa, we explore issues of uncertainty and robustness of solutions that can be rapidly obtained by inversion of teleseismic signals. Formalizing uncertainty estimates remains a formidable undertaking and some aspects of that challenge will be addressed.

  10. Synchronous precipitation reduction in the American Tropics associated with Heinrich 2.

    PubMed

    Medina-Elizalde, Martín; Burns, Stephen J; Polanco-Martinez, Josué; Lases-Hernández, Fernanda; Bradley, Raymond; Wang, Hao-Cheng; Shen, Chuan-Chou

    2017-09-11

    During the last ice age, temperature in the North Atlantic oscillated in cycles known as Dansgaard-Oeschger (D-O) events. The magnitude of Caribbean hydroclimate change associated with D-O variability, and particularly with stadial intervals, remains poorly constrained by paleoclimate records. We present a 3300-year-long stalagmite δ18O record from the Yucatan Peninsula (YP) that spans the interval between 26.5 and 23.2 thousand years before present. We estimate quantitative precipitation variability, and the high resolution and dating accuracy of this record allow us to investigate how rainfall in the region responds to D-O events. Quantitative precipitation estimates are based on observed regional amount-effect variability, last glacial paleotemperature records, and estimates of the last glacial oxygen isotopic composition of precipitation based on global circulation models (GCMs). The new precipitation record suggests significant low-latitude hydrological responses to internal modes of climate variability and supports a role of Caribbean hydroclimate in helping Atlantic Meridional Overturning Circulation recovery during D-O events. Significant in-phase precipitation reduction across the equator in the tropical Americas associated with Heinrich event 2 is suggested by available speleothem oxygen isotope records.

  11. 24 CFR 2002.13 - Charges for interest and for unsuccessful searches; utilization of Debt Collection Act.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... unsuccessful searches; utilization of Debt Collection Act. 2002.13 Section 2002.13 Housing and Urban... interest and for unsuccessful searches; utilization of Debt Collection Act. (a) Charging interest. HUD will... time will be assessed when the records requested are not found or when the records located are withheld...

  12. Usefulness of syndromic data sources for investigating morbidity resulting from a severe weather event.

    PubMed

    Baer, Atar; Elbert, Yevgeniy; Burkom, Howard S; Holtry, Rekha; Lombardo, Joseph S; Duchin, Jeffrey S

    2011-03-01

    We evaluated emergency department (ED) data, emergency medical services (EMS) data, and public utilities data for describing an outbreak of carbon monoxide (CO) poisoning following a windstorm. Syndromic ED data were matched against previously collected chart abstraction data. We ran detection algorithms on selected time series derived from all 3 data sources to identify health events associated with the CO poisoning outbreak. We used spatial and spatiotemporal scan statistics to identify geographic areas that were most heavily affected by the CO poisoning event. Of the 241 CO cases confirmed by chart review, 190 (78.8%) were identified in the syndromic surveillance data as exact matches. Records from the ED and EMS data detected an increase in CO-consistent syndromes after the storm. The ED data identified significant clusters of CO-consistent syndromes, including zip codes that had widespread power outages. Weak temporal gastrointestinal (GI) signals, possibly resulting from ingestion of food spoiled by lack of refrigeration, were detected in the ED data but not in the EMS data. Spatial clustering of GI-based groupings in the ED data was not detected. Data from this evaluation support the value of ED data for surveillance after natural disasters. Enhanced EMS data may be useful for monitoring a CO poisoning event, if these data are available to the health department promptly. ©2011 American Medical Association. All rights reserved.

  13. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement.

    PubMed

    Fisher, Jason C; Kuenzler, Keith A; Tomita, Sandra S; Sinha, Prashant; Shah, Paresh; Ginsburg, Howard B

    2017-01-01

    Documenting surgical complications is limited by multiple barriers and is not fostered in the electronic health record. Tracking complications is essential for quality improvement (QI) and required for board certification. Current registry platforms do not facilitate meaningful complication reporting. We developed a novel web application that improves accuracy and reduces barriers to documenting complications. We deployed a custom web application that allows pediatric surgeons to maintain case logs. The program includes a module for entering complication data in real time. Reminders to enter outcome data occur at key postoperative intervals to optimize recall of events. Between October 1, 2014, and March 31, 2015, frequencies of surgical complications captured by the existing hospital reporting system were compared with data aggregated by our application. 780 cases were captured by the web application, compared with 276 cases registered by the hospital system. We observed an increase in the capture of major complications when compared to the hospital dataset (14 events vs. 4 events). This web application improved real-time reporting of surgical complications, exceeding the accuracy of administrative datasets. Custom informatics solutions may help reduce barriers to self-reporting of adverse events and improve the data that presently inform pediatric surgical QI. Diagnostic study/Retrospective study. Level III - case control study. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Resilience in Source to Sink Systems: A Millennial Record of Watershed Responses to Disturbance in Loon Lake, Umpqua River Basin, Oregon

    NASA Astrophysics Data System (ADS)

    Guerrero, F. J.; Richardson, K.; Hatten, J. A.

    2017-12-01

    Small mountainous watersheds are disproportionate sources of particulate organic matter (POM) to long-term sinks like lake bottoms and the ocean. Thus, alterations in sediment routing resulting from disturbances (e.g. earthquakes, fires, and timber harvesting) have profound consequences on watersheds' (biogeochemical) resilience. The assessment of these biogeochemical impacts is complicated by the episodic signal propagation along these source-to-sink systems and therefore is seldom attempted. We report on a 1500-year record of historical changes in Loon Lake, a local sedimentary sink (1.2 km2) for a 230 km2 watershed in the Oregon Coast Range. Particle size distributions and POM elemental composition (C, N) were sampled at high temporal resolution (~3 years). Stable isotopic composition and lignin biomarkers were sampled with varying temporal resolution depending on the period analyzed: 1939-2013 (3-year resolution); 515-1939 (15-year resolution). Disturbance history in Loon Lake catchment is recorded as a sequence of event beds deposited in sharp contrast within a matrix of background sedimentation. At least 8 out of 23 event beds were associated with >8.2 magnitude earthquakes (including the 9.0 megathrust earthquake in 1700). Forest fires in 1770 and 1890 were also recorded as event beds. After 1939, event beds record the impacts of landscape destabilization due to the interaction between intense storms and timber harvesting. At the onset of each event, %C, %N, and C:N ratios increased reflecting the input of coarse POM from surficial soil horizons. Top layers bracketing event beds are rich in clays and have low %C, suggesting a deep-soil sediment source. Isotopic signatures (i.e. δ13C, δ15N) confirm the allochthony of sediment inputs during events and lignin biomarkers suggest a replacement of riparian inputs by a strong gymnosperm signal, particularly after 1945. 
Thus, event beds record changes in the relative importance of different sediment sources within the catchment as they connect with their sink on the lake bottom. In contrast with continuous records of ecosystem changes from small watersheds, discontinuous records suggest the need for resilience assessments that go beyond the reconstruction of recovery paths to consider source to sink connectivity in small mountainous watersheds.

  15. Quantitative and Qualitative Analyses of the Explosive Cyclones that Reached the Antarctic Coast in the First Half of 2017

    NASA Astrophysics Data System (ADS)

    Pires, L. B. M.; Romao, M.; Freitas, A. C. V.

    2017-12-01

    An explosive cyclone is an extratropical cyclone whose central pressure drops by at least 24 hPa in 24 hours. These cyclones are usually intense and move rapidly, which hinders their predictability. It is likely that climate change is causing an increase in this type of event on the Antarctic coast and, if this increase is confirmed, the regime of winds and temperatures may be changing. If there are more incidences of explosive cyclones, the Antarctic winds are probably becoming more intense, and temperatures are becoming lower in some places and higher in others. In the northern portion of the Antarctic Peninsula a decrease in temperature has already been recorded over the last 15 years, while a higher incidence of explosive cyclones over the region has also been found during this period. Studies have also suggested that the drop in temperatures in the Antarctic may be associated with changes in wind direction, but the cause of these wind direction changes is unknown. Explosive cyclones, which change the wind patterns when they reach certain areas, therefore may be contributing to this change in the Antarctic climate. This study is part of the "Explosive Cyclones on the Antarctic Coast" (EXCANC) Project conducted by the World Environmental Conservancy organization. This project analyzes data from meteorological stations strategically scattered along the coast and operated by various international Antarctic programs, and it also utilizes satellite images. Results show that during the first half of 2017 the highest number of events was recorded at the Australian Casey station with 10 cases, followed by the French station Dumont d'Urville with 7 cases. The British Halley station recorded its first explosive cyclone this year. Intensity analyses are also shown.

  16. Assessing the Potential Impact of the 2015-2016 El Niño on the California Rim Fire Burn Scar Through Debris Flow Hazard Mapping

    NASA Astrophysics Data System (ADS)

    Larcom, S.; Grigsby, S.; Ustin, S.

    2015-12-01

    Wildfires are a perennial issue for California, and the current record-breaking drought is exacerbating the potential problems for the state. Fires leave behind burn scars characterized by diminished vegetative cover and abundant bare soil, and these areas are especially susceptible to storm events that pose an elevated risk of debris flows and sediment-rich sheet wash. This study focused on the 2013 Rim Fire that devastated significant portions of Stanislaus National Forest and Yosemite National Park, and utilized readily available NASA JPL SRTM elevation data and AVIRIS spectral imaging data to construct a debris flow hazard map that assesses mass wasting risk for the Rim Fire burn scar. This study consisted entirely of remotely sensed data, which was processed in software programs such as ENVI, GRASS GIS, ArcMap, and Google Earth. Parameters that were taken into consideration when constructing this map include hill slope (greater than 30 percent rise), burn severity (assessed by calculating NDVI), and erodibility of the soil (by comparing spectral reflectance of AVIRIS images with the reference spectra of illite). By calculating percent of total burn area, 6% was classified as low risk, 55% as medium risk, and 39% as high risk. In addition, this study assessed the importance of the 2015-2016 El Niño, which is projected to be one of the strongest on record, by studying historic rainfall records and storm events of past El Niño events. Hydrological and infrastructural problems that could be caused by short-term convective or long-term synoptic storms and subsequent debris flows were explored as well.
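The burn-severity and slope criteria above can be sketched per pixel. NDVI is the standard index named in the abstract; the two-factor rating and both cut-off values below are illustrative assumptions, not the study's classification rules.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectance; low post-fire values indicate diminished vegetative cover."""
    return (nir - red) / (nir + red)

def hazard_rating(slope_pct, ndvi_value, steep=30.0, burned=0.2):
    """Toy two-factor rating combining the map's slope criterion
    (>30 percent rise) with an NDVI burn-severity cut-off; both
    thresholds here are illustrative."""
    score = int(slope_pct > steep) + int(ndvi_value < burned)
    return ("low", "medium", "high")[score]
```

Run over co-registered SRTM slope and AVIRIS-derived NDVI rasters, a rating like this yields the low/medium/high percentages the study reports for the burn area.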

  17. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Centre (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
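The build-and-apply step for an empirical magnitude conversion can be sketched as below. This is a simplified stand-in: a plain least-squares line fitted on events common to two bulletins, whereas production homogenization often uses orthogonal or censored regressions to handle errors in both scales.

```python
def fit_conversion(native, target):
    """Ordinary least-squares fit target = a*native + b from magnitudes of
    events common to two bulletins (real homogenization work often prefers
    orthogonal regression, since both scales carry measurement error)."""
    n = len(native)
    mx, my = sum(native) / n, sum(target) / n
    sxx = sum((x - mx) ** 2 for x in native)
    sxy = sum((x - mx) * (y - my) for x, y in zip(native, target))
    a = sxy / sxx
    return a, my - a * mx

def homogenize(mags, a, b):
    """Map a catalogue's native-scale magnitudes onto the target scale."""
    return [a * m + b for m in mags]
```

Repeating the fit for each native scale (mb, ML, Ms, ...) against moment magnitude, then applying `homogenize` per source bulletin, is the basic recipe for assembling a single Mw-homogeneous catalogue.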

  18. Paleoseismic potential of sublacustrine landslide records in a high-seismicity setting (south-central Alaska)

    USGS Publications Warehouse

    Praet, Nore; Moernaut, Jasper; Van Daele, Maarten; Boes, Evelien; Haeussler, Peter J.; Strupler, Michael; Schmidt, Sabine; Loso, Michael G.; De Batist, Marc

    2017-01-01

Sublacustrine landslide stratigraphy is considered useful for quantitative paleoseismology in low-seismicity settings. However, as the recharging of underwater slopes with sediments is one of the factors that governs the recurrence of slope failures, it is not clear if landslide deposits can provide continuous paleoseismic records in settings of frequent strong shaking. To test this, we selected three lakes in south-central Alaska that experienced a strong historical megathrust earthquake (the 1964 Mw 9.2 Great Alaska Earthquake) and exhibit high sedimentation rates in their main basins (0.2 to 1.0 cm yr-1). We present high-resolution reflection seismic data (3.5 kHz) and radionuclide data from sediment cores in order to investigate factors that control the establishment of a reliable landslide record. Seismic stratigraphy analysis reveals the presence of several landslide deposits in the lacustrine sedimentary infill. Most of these landslide deposits can be attributed to specific landslide events, as multiple landslide deposits sourced from different lacustrine slopes occur on a single stratigraphic horizon. We identify numerous events in the lakes: Eklutna Lake proximal basin (14 events), Eklutna Lake distal basin (8 events), Skilak Lake (7 events) and Kenai Lake (7 events). The most recent event in each basin corresponds to the historic 1964 megathrust earthquake. All events are characterized by multiple landslide deposits, which hints at a regional trigger mechanism, such as an earthquake (the synchronicity criterion). This means that the landslide record in each basin represents a record of past seismic events. Based on extrapolation of sedimentation rates derived from radionuclide dating, we roughly estimate mean recurrence intervals in the Eklutna Lake proximal basin, Eklutna Lake distal basin, Skilak Lake and Kenai Lake of ~250 yr, ~450 yr, ~900 yr and ~450 yr, respectively.
This distinct difference in recording can be explained by variations in preconditioning factors such as slope angle, slope recharging (sedimentation rate) and the sediment source area: faster slope recharging and a predominance of delta and alluvial fan failures increase the sensitivity and lower the intensity threshold for slope instability. The seismotectonic setting of the lakes also has to be taken into account. This study demonstrates that sublacustrine landslides in several Alaskan lakes can be used as reliable recorders of strong earthquake shaking when a multi-lake approach is used, and can enhance the temporal and spatial resolution of the paleoseismic record of south-central Alaska.
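The extrapolation step above amounts to converting event-horizon depths to ages with a radionuclide-derived sedimentation rate and averaging the gaps between successive ages. The depths and rate below are illustrative values, not the cores' actual data.

```python
# Sketch: mean landslide recurrence interval from horizon depths and a
# constant sedimentation rate. Numbers are hypothetical.

def recurrence_interval(event_depths_cm, sed_rate_cm_per_yr):
    """Mean interval (yr) between events whose horizons lie at the given depths."""
    ages = sorted(d / sed_rate_cm_per_yr for d in event_depths_cm)
    gaps = [b - a for a, b in zip(ages, ages[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical horizons in a core with a 0.2 cm/yr sedimentation rate
depths = [10, 60, 110, 160]                      # cm below lake floor
print(round(recurrence_interval(depths, 0.2)))   # mean gap in years
```

A real application would also propagate the dating uncertainty and allow the rate to vary downcore.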

  19. Broadband Array Analysis of the 2005 Episodic Tremor and Slip Event in Northern Cascadia

    NASA Astrophysics Data System (ADS)

    Wech, A.; Creager, K.; McCausland, W.; Frassetto, A.; Qamar, A.; Derosier, S.; Carmichael, J.; Malone, S.; Johnson, D.

    2005-12-01

The region of Cascadia from the Olympic Mountains through southern Vancouver Island and down-dip of the subduction megathrust has repeatedly experienced episodes of slow slip. This episodic slip, which has been observed to take place over a period of two to several weeks, is accompanied by a seismic tremor signal. Based on the average recurrence interval of 14 months, the next episodic tremor and slip (ETS) event should occur within six weeks of mid-September, 2005. Indeed, it appears to have begun on September 3, as this abstract was being written. In order to record this anticipated event, we deployed an array of 11 three-component seismometers on the northern side of the Olympic Peninsula, augmenting Pacific Northwest Seismographic Network stations as well as the first few EarthScope BigFoot stations and Plate Boundary Observatory borehole seismometers. This seismic array comprised six short-period and five broadband instruments with spacings of 500 m and 2200 m, respectively. In conjunction with this EarthScope seismic deployment, we also installed a dense network of 29 temporary, continuous GPS stations across the entire Olympic Peninsula to integrate seismic and geodetic observations. One of the primary goals of this research is to utilize the broadband instrumentation in the array to investigate the possible correlation of low-frequency energy with the rest of the tremor activity. ETS has been carefully investigated at high frequency (seismic tremor at 2-6 Hz) and very low frequency (slip occurring over weeks, observed by GPS). An important goal of this experiment is to investigate the possibility that the tremor generates intermediate, low-frequency signals. Preliminary analysis of short-period array recordings of the July, 2004 ETS event suggests that the tremor displays signs of lower-frequency energy (~0.5 Hz) correlated with its higher-frequency activity.
Our array should enable us to distinguish low-frequency signals originating in the direction of high-frequency tremor from noise in other directions. We will present an analysis of the low-frequency energy associated with this slip event.
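The direction discrimination described above can be sketched as a delay-and-sum beamformer: steer the array over a grid of back-azimuths and pick the one that maximizes stacked power. The station geometry, slowness, frequency and plane-wave test signal below are all synthetic stand-ins, not the Olympic Peninsula deployment.

```python
# Sketch: delay-and-sum beamforming on a small synthetic array to recover the
# back-azimuth of a plane wave. All parameters are illustrative assumptions.
import math

STATIONS = [(0.0, 0.0), (500.0, 0.0), (0.0, 500.0), (500.0, 500.0)]  # metres
DT, N = 0.01, 512
SLOWNESS = 1 / 3500.0     # s/m, assumed apparent velocity

def plane_wave(baz_deg, freq=2.0):
    """Synthetic records of a plane wave arriving from back-azimuth baz_deg."""
    sx = -SLOWNESS * math.sin(math.radians(baz_deg))
    sy = -SLOWNESS * math.cos(math.radians(baz_deg))
    return [[math.sin(2 * math.pi * freq * (i * DT - (sx * x + sy * y)))
             for i in range(N)] for x, y in STATIONS]

def beam_power(records, baz_deg):
    """Mean squared amplitude of the delay-and-sum stack steered to baz_deg."""
    sx = -SLOWNESS * math.sin(math.radians(baz_deg))
    sy = -SLOWNESS * math.cos(math.radians(baz_deg))
    shifts = [int(round((sx * x + sy * y) / DT)) for x, y in STATIONS]
    lo, hi = max(0, -min(shifts)), N - max(0, max(shifts))
    stack = [sum(rec[i + s] for rec, s in zip(records, shifts)) / len(records)
             for i in range(lo, hi)]
    return sum(v * v for v in stack) / len(stack)

records = plane_wave(baz_deg=135.0)   # signal arriving from the southeast
grid = range(0, 360, 15)
print(max(grid, key=lambda baz: beam_power(records, baz)))
```

The beam is steered with integer-sample shifts for simplicity; a production beamformer would use fractional delays (frequency-domain phase shifts) and band-pass the data first.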

  20. Compilation of Earthquakes from 1850-2007 within 200 miles of the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Seth Carpenter

    2010-07-01

    An updated earthquake compilation was created for the years 1850 through 2007 within 200 miles of the Idaho National Laboratory. To generate this compilation, earthquake catalogs were collected from several contributing sources and searched for redundant events using the search criteria established for this effort. For all sets of duplicate events, a preferred event was selected, largely based on epicenter-network proximity. All unique magnitude information for each event was added to the preferred event records and these records were used to create the compilation referred to as “INL1850-2007”.
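The duplicate-search step in a compilation like this can be sketched as flagging two catalogue entries as the same event when their origin times and epicentres fall within chosen windows. The window values below are illustrative assumptions, not the criteria established for the INL effort.

```python
# Sketch: redundant-event detection across merged earthquake catalogs using
# time and distance windows. Tolerances are hypothetical.
import math

TIME_WINDOW_S = 30.0     # assumed origin-time tolerance
DIST_WINDOW_KM = 50.0    # assumed epicentral tolerance

def epicentral_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula (Earth radius 6371 km)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def is_duplicate(ev1, ev2):
    close_in_time = abs(ev1["t"] - ev2["t"]) <= TIME_WINDOW_S
    close_in_space = epicentral_km(ev1["lat"], ev1["lon"],
                                   ev2["lat"], ev2["lon"]) <= DIST_WINDOW_KM
    return close_in_time and close_in_space

a = {"t": 0.0,  "lat": 43.60, "lon": -112.00}
b = {"t": 12.0, "lat": 43.65, "lon": -112.10}   # same event, another network
print(is_duplicate(a, b))
```

For each duplicate set, one entry would then be kept as the preferred event (here, by epicenter-network proximity) while the others contribute only their magnitude information.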

  1. Real-world perceptions of emerging event data recorder (EDR) technologies

    DOT National Transportation Integrated Search

    2002-01-01

    This research focuses on what college-age motorists perceive to be the positive and negative aspects of implementing on-board Event Data Recorders (EDRs) in the highway mode of transport. The achievements and findings offer safety researchers insi...

  2. Video techniques and data compared with observation in emergency trauma care

    PubMed Central

    Mackenzie, C; Xiao, Y

    2003-01-01

    Video recording is underused in improving patient safety and understanding performance shaping factors in patient care. We report our experience of using video recording techniques in a trauma centre, including how to gain cooperation of clinicians for video recording of their workplace performance, identify strengths of video compared with observation, and suggest processes for consent and maintenance of confidentiality of video records. Video records are a rich source of data for documenting clinician performance which reveal safety and systems issues not identified by observation. Emergency procedures and video records of critical events identified patient safety, clinical, quality assurance, systems failures, and ergonomic issues. Video recording is a powerful feedback and training tool and provides a reusable record of events that can be repeatedly reviewed and used as research data. It allows expanded analyses of time critical events, trauma resuscitation, anaesthesia, and surgical tasks. To overcome some of the key obstacles in deploying video recording techniques, researchers should (1) develop trust with video recorded subjects, (2) obtain clinician participation for introduction of a new protocol or line of investigation, (3) report aggregated video recorded data and use clinician reviews for feedback on covert processes and cognitive analyses, and (4) involve multidisciplinary experts in medicine and nursing. PMID:14645896

  3. Biogeochemical Proxies in Scleractinian Corals used to Reconstruct Ocean Circulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guilderson, T.P.; Kashgarian, M.; Schrag, D.P.

We utilize monthly {sup 14}C data derived from coral archives in conjunction with ocean circulation models to address two questions: (1) how does the shallow circulation of the tropical Pacific vary on seasonal to decadal time scales and (2) which dynamic processes determine the mean vertical structure of the equatorial Pacific thermocline. Our results directly impact the understanding of global climate events such as the El Nino-Southern Oscillation (ENSO). To study changes in ocean circulation and water mass distribution involved in the genesis and evolution of ENSO and decadal climate variability, it is necessary to have records of climate variables several decades in length. Continuous instrumental records are limited because technology for continuous monitoring of ocean currents has only recently been available, and ships-of-opportunity archives such as COADS contain large spatial and temporal biases. In addition, temperature and salinity in surface waters are not conservative and thus cannot be independently relied upon to trace water masses, reducing the utility of historical observations. Radiocarbon ({sup 14}C) in sea water is a quasi-conservative water mass tracer and is incorporated into coral skeletal material, thus coral {sup 14}C records can be used to reconstruct changes in shallow circulation that would be difficult to characterize using instrumental data. High resolution {Delta}{sup 14}C time series such as these provide a powerful constraint on the rate of surface ocean mixing and hold great promise to augment one-time surveys such as GEOSECS and WOCE. These data not only provide fundamental information about the shallow circulation of the Pacific, but can be used as a benchmark for the next generation of high resolution ocean models used in prognosticating climate change.

  4. Integrating artificial intelligence with real-time intracranial EEG monitoring to automate interictal identification of seizure onset zones in focal epilepsy.

    PubMed

    Varatharajah, Yogatheesan; Berry, Brent; Cimbalnik, Jan; Kremen, Vaclav; Van Gompel, Jamie; Stead, Matt; Brinkmann, Benjamin; Iyer, Ravishankar; Worrell, Gregory

    2018-08-01

An ability to map seizure-generating brain tissue, i.e. the seizure onset zone (SOZ), without recording actual seizures could reduce the duration of invasive EEG monitoring for patients with drug-resistant epilepsy. A widely-adopted practice in the literature is to compare the incidence (events/time) of putative pathological electrophysiological biomarkers associated with epileptic brain tissue with the SOZ determined from spontaneous seizures recorded with intracranial EEG, primarily using a single biomarker. Clinical translation of these previous efforts suffers from their inability to generalize across multiple patients because of (a) the inter-patient variability and (b) the temporal variability in the epileptogenic activity. Here, we report an artificial intelligence-based approach for combining multiple interictal electrophysiological biomarkers and their temporal characteristics as a way of accounting for the above barriers and show that it can reliably identify seizure onset zones in a study cohort of 82 patients who underwent evaluation for drug-resistant epilepsy. Our investigation provides evidence that utilizing the complementary information provided by multiple electrophysiological biomarkers and their temporal characteristics can significantly improve the localization potential compared to previously published single-biomarker incidence-based approaches, resulting in an average area under the ROC curve (AUC) value of 0.73 in a cohort of 82 patients. Our results also suggest that recording durations between 90 min and 2 h are sufficient to localize SOZs with accuracies that may prove clinically relevant. The successful validation of our approach on a large cohort of 82 patients warrants future investigation of the feasibility of utilizing intra-operative EEG monitoring and artificial intelligence to localize epileptogenic brain tissue.
Broadly, our study demonstrates the use of artificial intelligence coupled with careful feature engineering in augmenting clinical decision making.
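The AUC metric used above can be sketched with the Mann-Whitney rank formulation: the probability that a randomly chosen SOZ channel scores higher than a randomly chosen non-SOZ channel. The per-channel scores below are invented for illustration, not the study's biomarker outputs.

```python
# Sketch: area under the ROC curve for SOZ vs non-SOZ channel scores,
# computed as P(score_SOZ > score_nonSOZ) with ties counted as 0.5.

def auc(scores_pos, scores_neg):
    """AUC over all positive/negative score pairs."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

soz_scores  = [0.9, 0.7, 0.8]         # hypothetical combined-biomarker scores
nsoz_scores = [0.2, 0.4, 0.7, 0.1]
print(round(auc(soz_scores, nsoz_scores), 3))
```

An AUC of 0.5 would mean the scores carry no localizing information; the study's reported 0.73 sits between chance and perfect separation.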

  5. Biogeochemical Proxies in Scleractinian Corals used to Reconstruct Ocean Circulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guilderson, T P; Kashgarian, M; Schrag, D P

    2001-02-23

We utilize monthly {sup 14}C data derived from coral archives in conjunction with ocean circulation models to address two questions: (1) how does the shallow circulation of the tropical Pacific vary on seasonal to decadal time scales and (2) which dynamic processes determine the mean vertical structure of the equatorial Pacific thermocline. Our results directly impact the understanding of global climate events such as the El Nino-Southern Oscillation (ENSO). To study changes in ocean circulation and water mass distribution involved in the genesis and evolution of ENSO and decadal climate variability, it is necessary to have records of climate variables several decades in length. Continuous instrumental records are limited because technology for continuous monitoring of ocean currents has only recently been available, and ships-of-opportunity archives such as COADS contain large spatial and temporal biases. In addition, temperature and salinity in surface waters are not conservative and thus cannot be independently relied upon to trace water masses, reducing the utility of historical observations. Radiocarbon ({sup 14}C) in sea water is a quasi-conservative water mass tracer and is incorporated into coral skeletal material, thus coral {sup 14}C records can be used to reconstruct changes in shallow circulation that would be difficult to characterize using instrumental data. High resolution {Delta}{sup 14}C time series such as these provide a powerful constraint on the rate of surface ocean mixing and hold great promise to augment one-time surveys such as GEOSECS and WOCE. These data not only provide fundamental information about the shallow circulation of the Pacific, but can be used as a benchmark for the next generation of high resolution ocean models used in prognosticating climate change.

  6. Observations of premonitory acoustic emission and slip nucleation during a stick slip experiment in smooth faulted Westerly granite

    USGS Publications Warehouse

    Thompson, B.D.; Young, R.P.; Lockner, D.A.

    2005-01-01

To investigate laboratory earthquakes, stick-slip events were induced on a saw-cut Westerly granite sample by triaxial loading at 150 MPa confining pressure. Acoustic emissions (AE) were monitored using an innovative continuous waveform recorder. The first motion of each stick slip was recorded as a large-amplitude AE signal. These events locate onto the saw-cut fault plane, implying that they represent the nucleation sites of the dynamic stick-slip failure events. The precise location of nucleation varied between events and was probably controlled by heterogeneity of stress or surface conditions on the fault. The initial nucleation diameter of each dynamic instability was inferred to be less than 3 mm. A small number of AE were recorded prior to each macro slip event. For the second and third slip events, premonitory AE source mechanisms mimic the large-scale fault plane geometry. Copyright 2005 by the American Geophysical Union.

  7. Collection and Utilization of Animal Carcasses Associated with zoonotic Disease in Tshuapa District, the Democratic Republic of the Congo, 2012.

    PubMed

    Monroe, Benjamin P; Doty, Jeffrey B; Moses, Cynthia; Ibata, Saturnin; Reynolds, Mary; Carroll, Darin

    2015-07-01

    The collection and consumption of animal carcasses is a common activity in forested areas of the Congo River basin and creates sustainability, conservation, and health concerns. Residents of the Tshuapa District reported collecting the remains of 5,878 animals from >30 species when surveyed about their wildlife consumption habits. Carcasses were discovered in varying degrees of decomposition and were often consumed at home or sold in local markets. The most commonly collected animals were Cricetomys gambianus (Northern giant pouched rat), Cercopithecus ascanius (red-tailed monkey), and Heliosciurus rufobrachium (red-legged sun squirrel). Many of the species recorded may be hosts of zoonotic pathogens, creating concern for spillover events.

  8. Nuclear Test Depth Determination with Synthetic Modelling: Global Analysis from PNEs to DPRK-2016

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Stachnik, Joshua; Baker, Ben; Epiphansky, Alexey; Bobrov, Dmitry

    2016-04-01

Seismic event depth determination is critical for the event screening process at the International Data Center, CTBTO. A thorough determination of the event depth can be conducted mostly through additional special analysis because the IDC's Event Definition Criteria is based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have depth constrained to the surface, making the depth screening criterion not applicable. Further, it may result in a heavier workload to manually distinguish between subsurface and deeper crustal events. Since the shape of the first few seconds of signal of very shallow events is very sensitive to the depth phases, cross-correlation between observed and theoretical seismograms can provide a basis for the event depth estimation, and thus an extension to the screening process. We applied this approach mostly to events at teleseismic and partially regional distances. The approach was found efficient for the seismic event screening process, with certain caveats related mostly to poorly defined source and receiver crustal models, which can shift the depth estimate. An adjustable teleseismic attenuation model (t*) for synthetics was used since this characteristic is not known for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNE) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with the hudson96 program, and the regional modelling was done with the generalized ray technique by Vlastislav Cerveny, modified to account for the complex source topography. The software prototype is designed to be used for the Expert Technical Analysis at the IDC. With this, the design effectively reuses the NDC-in-a-Box code and can be comfortably utilized by the NDC users.
The package uses Geotool as a front-end for data retrieval and pre-processing. After the event database is compiled, the control is passed to the driver software, running the external processing and plotting toolboxes, which controls the final stage and produces the final result. The modules are mostly Python coded, with C-coded (Raysynth3D complex topography regional synthetics) and FORTRAN-coded synthetics from the CPS330 software package by Robert Herrmann of Saint Louis University. The extension of this single-station depth determination method is under development and uses joint information from all stations participating in processing. It is based on simultaneous depth and moment tensor determination for both short- and long-period seismic phases. A novel approach recently developed for microseismic event location, utilizing only phase waveform information, was migrated to a global scale. It should provide faster computation, as it does not require intensive synthetic modelling, and may benefit the processing of noisy signals. A consistent depth estimate for all recent nuclear tests was produced for the vast number of IMS stations (primary and auxiliary) used in processing.
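The depth-phase idea above can be sketched as correlating the observed first seconds of signal against synthetics computed for a grid of trial depths and picking the best match. The "synthetics" here are toy P + surface-reflected pP waveforms with a delay of roughly 2·depth/velocity, not hudson96 output; all parameters are illustrative.

```python
# Sketch: grid search over trial depths using zero-lag normalized
# cross-correlation between an "observed" trace and toy synthetics.
import math

def synthetic(depth_km, n=200, dt=0.05, v=6.0):
    """Toy P pulse plus a surface-reflected pP lagging by ~2*depth/v seconds."""
    pp_delay = 2 * depth_km / v
    def pulse(t, t0):
        return math.exp(-((t - t0) / 0.1) ** 2)
    return [pulse(i * dt, 1.0) - 0.7 * pulse(i * dt, 1.0 + pp_delay)
            for i in range(n)]

def ncc(a, b):
    """Zero-lag normalized cross-correlation of two equal-length traces."""
    num = sum(x * y for x, y in zip(a, b))
    den = (math.sqrt(sum(x * x for x in a))
           * math.sqrt(sum(y * y for y in b)))
    return num / den

def best_depth(observed, trial_depths):
    return max(trial_depths, key=lambda d: ncc(observed, synthetic(d)))

obs = synthetic(2.0)                # pretend this trace was recorded
print(best_depth(obs, [0.5, 1.0, 2.0, 5.0]))
```

A real implementation would also search over alignment lags and attenuation (t*), since both trade off against the apparent pP delay.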

  9. Forensic Seismology: constraints on terrorist bombings

    NASA Astrophysics Data System (ADS)

    Wallace, T. C.; Koper, K. D.

    2002-05-01

Seismology has long been used as a tool to monitor and investigate explosions, both accidental and intentional. Seismic records can be used to provide a precise chronology of events, estimate the energy release in explosions and produce constraints to test various scenarios for the explosions. Truck bombs are a popular tool of terrorists, and at least two such attacks have been recorded seismically. On August 7, 1998 a truck bomb was detonated near the US embassy in Nairobi, Kenya. The bomb seriously damaged a dozen buildings, injured more than 4000 people and caused 220 fatalities. The explosion was recorded on a short-period seismometer located north of the blast site; the blast seismogram contained body waves, Rayleigh waves and vibrations associated with the air blast. Modeling of the body and surface waves allowed an estimate of the origin time of the bombing, which in turn could be used to constrain the timing of the air blast. The speed of the air waves from an explosion depends on the air temperature and the size, or yield, of the explosion. In an effort to fully utilize the seismic recordings from such attacks, we analyzed the seismic records from a series of controlled truck bomb explosions carried out at White Sands Missile Range in New Mexico. We developed a new set of scaling laws that relate seismic and acoustic observations directly to the explosive mass (yield). These relationships give a yield of approximately 3000 kg of TNT equivalent for the Nairobi bomb. The terrorist bombing of the Murrah Federal Building in Oklahoma City in 1995 was also recorded on seismometers. One of these records showed 2 discrete surface wavetrains separated by approximately 10 seconds. Some groups seized on the seismic recordings as evidence that there were 2 explosions, and that the US government was actually behind the bombing.
However, the USGS monitored the demolition of the remainder of the Murrah Building and showed that the collapse also produced 2 surface waves. The interpretation is that one group was the fundamental-mode Rayleigh wave while the other was either a higher-mode surface wave or a scattered S-wave (Lg-like) packet (Holzer et al., 1996). This example illustrates the utility of forensic seismology for testing various hypotheses about explosions. As the number of permanent and temporary seismometer installations increases in the next decade, the number of "exotic" sources recorded and investigated will grow dramatically. These studies can be very useful for investigating terrorist attacks and developing scenarios for the crimes.
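A yield estimate of the kind described above comes from inverting an empirical amplitude-yield scaling law fitted to calibration shots. The power-law form and the coefficients below are hypothetical placeholders, not the published relation.

```python
# Sketch: inverting an assumed seismic amplitude-yield scaling law
# A = K * W**B for explosive mass W. K and B are invented values.

K, B = 0.002, 0.75   # assumed: A in nm of ground motion, W in kg TNT equivalent

def yield_from_amplitude(amplitude_nm):
    """Invert the assumed scaling law for explosive mass W (kg TNT)."""
    return (amplitude_nm / K) ** (1.0 / B)

# An amplitude consistent with ~3000 kg under the assumed law:
a = K * 3000 ** B
print(round(yield_from_amplitude(a)))
```

Fitting K and B to the controlled White Sands shots (log-log regression of amplitude against charge mass) is what makes such an inversion meaningful for a new event.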

  10. Two Extreme Climate Events of the Last 1000 Years Recorded in Himalayan and Andean Ice Cores: Impacts on Humans

    NASA Astrophysics Data System (ADS)

    Thompson, L. G.; Mosley-Thompson, E. S.; Davis, M. E.; Kenny, D. V.; Lin, P.

    2013-12-01

In the last few decades numerous studies have linked pandemic influenza, cholera, malaria, and viral pneumonia, as well as droughts, famines and global crises, to the El Niño-Southern Oscillation (ENSO). Two annually resolved ice core records, one from Dasuopu Glacier in the Himalaya and one from the Quelccaya Ice Cap in the tropical Peruvian Andes, provide an opportunity to investigate these relationships on opposite sides of the Pacific Basin for the last 1000 years. The Dasuopu record provides an annual history from 1440 to 1997 CE and a decadally resolved record from 1000 to 1440 CE, while the Quelccaya ice core provides annual resolution over the last 1000 years. Major ENSO events are often recorded in the oxygen isotope, insoluble dust, and chemical records from these cores. Here we investigate outbreaks of diseases, famines and global crises during two of the largest events recorded in the chemistry of these cores, particularly large peaks in the concentrations of chloride (Cl-) and fluoride (F-). One event is centered on 1789 to 1800 CE and the second begins abruptly in 1345 and tapers off after 1360 CE. These Cl- and F- peaks represent major droughts and reflect the abundance of continental atmospheric dust, derived in part from dried lake beds in drought-stricken regions upwind of the core sites. For Dasuopu the likely sources are in India, while for Quelccaya the sources would be the Andean Altiplano. Both regions are subject to drought conditions during the El Niño phase of the ENSO cycle. These two events persist longer (10 to 15 years) than today's typical ENSO events in the Pacific Ocean Basin. The 1789 to 1800 CE event was associated with a very strong El Niño event and was coincidental with the Boji Bara famine resulting from extended droughts that led to over 600,000 deaths in central India by 1792. Similarly extensive droughts are documented in Central and South America.
Likewise, the 1345 to 1360 CE event, although poorly documented historically in South America, is concomitant with major droughts in India, the collapse of the Yuan Dynasty and the Black Death that eliminated roughly one third of the global population. Understanding the characteristics and drivers of these 'natural' events is critical to design adaptive measures for a world with over seven billion people and a climate system now influenced by human activities.
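Identifying multi-year excursions like the two above can be sketched as simple anomaly detection on an annually resolved ion series: flag years whose concentration exceeds a z-score threshold. The series below is synthetic and only mimics the shape of the 1789-1800 excursion; it is not the cores' data.

```python
# Sketch: flagging drought intervals as z-score excursions in an annually
# resolved Cl- record. Concentrations are invented for illustration.

def zscores(series):
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    return [(x - mean) / sd for x in series]

def drought_years(years, conc, threshold=1.0):
    """Years whose concentration exceeds `threshold` standard deviations."""
    return [y for y, z in zip(years, zscores(conc)) if z > threshold]

years = list(range(1780, 1810))
conc = [10.0] * 30
for i, y in enumerate(years):
    if 1789 <= y <= 1800:        # synthetic excursion mimicking the event
        conc[i] = 40.0

print(drought_years(years, conc))
```

Because a 10-15 yr excursion inflates the series' own standard deviation, a modest threshold (here 1 sigma) is used; a robust baseline (e.g. a running median of non-event years) would be preferable in practice.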

  11. Unusual Signals Recorded by Ocean Bottom Seismometers in the Caldera of Deception Island Volcano: Biological Activity or Hydrothermally Generated Seismicity?

    NASA Astrophysics Data System (ADS)

    Bowman, D. C.; Wilcock, W. S.

    2011-12-01

As part of an active source land-sea tomography experiment, ocean bottom seismometers (OBSs) were deployed at Deception Island Volcano, Antarctica, in January 2005. Following the tomography study, three OBSs were left for a month inside the flooded caldera and ten on the outer slopes of the volcano to record seismo-volcanic signals. The OBS sensor package included three orthogonal 1-Hz geophones but no hydrophone. The OBSs were deployed in water depths of 125 to 143 m inside the caldera and at depths of 119 to 475 m on the volcano's flanks. Only two volcano-tectonic earthquakes and three long-period events were recorded by the network. However, the OBSs inside the caldera recorded over 4,500 unusual seismic events. These were detected by only one station at a time and were completely absent from OBSs on the flank of the volcano and from land stations deployed on the island. The signals had a dominant frequency of 5 Hz and were one to ten seconds long. Event activity in the caldera was variable, with the number of events per hour ranging from 0 up to 60 and the level of activity decreasing slightly over the study period. We categorize the signals into three types based on waveform characteristics. Type 1 events have an impulsive onset and last 1 to 2 s with characteristics that are consistent with the impulse response of a poorly coupled OBS. Type 2 events typically last 2 to 4 s and comprise a low amplitude initial arrival followed less than a second later by a more energetic second phase that looks like a Type 1 event. Type 3 events last up to 10 s and have more complex waveforms that appear to comprise several arrivals of varying amplitudes. Type 1 events are similar to the 'fish-bump' signals reported from previous studies that attributed them to biological activity. The consistent timing and relative amplitudes of the two arrivals for Type 2 events are difficult to explain by animals randomly touching the OBSs.
Type 3 events are quite similar in frequency, duration, and signal characteristics to long-period seismic events recorded by an onshore seismic array deployed in an earlier study at Deception Island. Particle motions suggest that Type 3 events may be surface waves while the particle motions for Type 1 and Type 2 events are ambiguous and unlike any signals recorded by land arrays at the volcano. Binomial tests of the event distribution show no significant changes in the rate of events with time of day that would be indicative of a biological source. Since the events are entirely absent in biologically productive waters outside the caldera, we postulate that they may be volcanic signals related to hydrothermal flow across the seafloor in the flooded caldera of Deception Island. Future OBS deployments at Deception Island should include a hydrophone to discriminate unambiguously between biological and volcanic signals.
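The diurnal-pattern check mentioned above can be sketched as an exact binomial test: if events were biologically (daylight) driven, the day-time fraction should differ from chance. The counts below are invented for illustration, not the deployment's event statistics.

```python
# Sketch: two-sided exact binomial test of day vs night event counts.
# Counts are hypothetical.
from math import comb

def binom_two_sided(k, n, p=0.5):
    """Two-sided exact p-value: sum of outcomes no more likely than observing k."""
    pk = comb(n, k) * p**k * (1 - p)**(n - k)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n + 1)
               if comb(n, i) * p**i * (1 - p)**(n - i) <= pk + 1e-12)

day, night = 52, 48                    # hypothetical counts in 12-h bins
p_value = binom_two_sided(day, day + night)
print(p_value > 0.05)                  # True: no significant diurnal preference
```

Binning by hour of day and testing each bin (or a chi-square over all 24) would follow the same logic at finer resolution.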

  12. Electrocorticographic high gamma activity versus electrical cortical stimulation mapping of naming.

    PubMed

    Sinai, Alon; Bowers, Christopher W; Crainiceanu, Ciprian M; Boatman, Dana; Gordon, Barry; Lesser, Ronald P; Lenz, Frederick A; Crone, Nathan E

    2005-07-01

    Subdural electrocorticographic (ECoG) recordings in patients undergoing epilepsy surgery have shown that functional activation is associated with event-related broadband gamma activity in a higher frequency range (>70 Hz) than previously studied in human scalp EEG. To investigate the utility of this high gamma activity (HGA) for mapping language cortex, we compared its neuroanatomical distribution with functional maps derived from electrical cortical stimulation (ECS), which remains the gold standard for predicting functional impairment after surgery for epilepsy, tumours or vascular malformations. Thirteen patients had undergone subdural electrode implantation for the surgical management of intractable epilepsy. Subdural ECoG signals were recorded while each patient verbally named sequentially presented line drawings of objects, and estimates of event-related HGA (80-100 Hz) were made at each recording site. Routine clinical ECS mapping used a subset of the same naming stimuli at each cortical site. If ECS disrupted mouth-related motor function, i.e. if it affected the mouth, lips or tongue, naming could not be tested with ECS at the same cortical site. Because naming during ECoG involved these muscles of articulation, the sensitivity and specificity of ECoG HGA were estimated relative to both ECS-induced impairments of naming and ECS disruption of mouth-related motor function. When these estimates were made separately for 12 electrode sites per patient (the average number with significant HGA), the specificity of ECoG HGA with respect to ECS was 78% for naming and 81% for mouth-related motor function, and equivalent sensitivities were 38% and 46%, respectively. When ECS maps of naming and mouth-related motor function were combined, the specificity and sensitivity of ECoG HGA with respect to ECS were 84% and 43%, respectively. 
This study indicates that event-related ECoG HGA during confrontation naming predicts ECS interference with naming and mouth-related motor function with good specificity but relatively low sensitivity. Its favourable specificity suggests that ECoG HGA can be used to construct a preliminary functional map that may help identify cortical sites of lower priority for ECS mapping. Passive recordings of ECoG gamma activity may be done simultaneously at all electrode sites without the risk of after-discharges associated with ECS mapping, which must be done sequentially at pairs of electrodes. We discuss the relative merits of these two functional mapping techniques.
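The sensitivity and specificity figures above reduce to per-site confusion counts with ECS as the reference standard. The site labels below are invented to illustrate the bookkeeping, not the study's electrode data.

```python
# Sketch: sensitivity/specificity of HGA-positive sites against ECS-positive
# sites over a set of electrodes. Labels are hypothetical.

def sens_spec(hga_positive, ecs_positive, all_sites):
    tp = len(hga_positive & ecs_positive)
    fn = len(ecs_positive - hga_positive)
    ecs_negative = all_sites - ecs_positive
    tn = len(ecs_negative - hga_positive)
    fp = len(ecs_negative & hga_positive)
    return tp / (tp + fn), tn / (tn + fp)

sites = {f"e{i}" for i in range(12)}    # 12 sites, matching the study's average
ecs = {"e0", "e1", "e2", "e3"}          # ECS-positive (naming disrupted)
hga = {"e0", "e1", "e9"}                # sites with significant HGA

sens, spec = sens_spec(hga, ecs, sites)
print(round(sens, 2), round(spec, 2))
```

With these toy labels the pattern matches the study's qualitative finding: specificity is high (few false HGA sites) while sensitivity is lower (ECS-positive sites missed by HGA).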

  13. Calculation of average landslide frequency using climatic records

    Treesearch

    L. M. Reid

    1998-01-01

Abstract - Aerial photographs are used to develop a relationship between the number of debris slides generated during a hydrologic event and the size of the event, and the long-term average debris-slide frequency is calculated from climate records using this relation.
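The calculation described in the abstract amounts to combining an empirical event-size to slide-count relation (from air photos) with event frequencies taken from climate records. Both the relation and the storm regime below are hypothetical illustrations.

```python
# Sketch: long-term average debris-slide frequency from a photo-derived
# size-response relation and climate-record event frequencies. All numbers
# are invented.

def slides_per_event(peak_flow):
    """Hypothetical relation: slides triggered as a function of event size."""
    return max(0.0, 4.0 * (peak_flow - 1.0))   # no slides below a threshold flow

# Hypothetical climate record summary: (event size, events per century)
storm_regime = [(1.5, 20.0), (2.0, 8.0), (3.0, 2.0)]

avg_per_century = sum(slides_per_event(size) * freq for size, freq in storm_regime)
print(avg_per_century / 100.0)   # long-term average debris slides per year
```

The same sum-over-event-sizes structure works with any fitted relation and any binning of the climate record.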

  14. Development of requirements and functional specifications for crash event data recorders : final report

    DOT National Transportation Integrated Search

    2004-12-01

    The U.S. DOT has conducted research on the requirements for a Crash Event Data Recorder to facilitate the reconstruction of commercial motor vehicle crashes. This report documents the work performed on the Development of Requirements and Functiona...

  15. Enriching Great Britain's National Landslide Database by searching newspaper archives

    NASA Astrophysics Data System (ADS)

    Taylor, Faith E.; Malamud, Bruce D.; Freeborough, Katy; Demeritt, David

    2015-11-01

    Our understanding of where landslide hazard and impact will be greatest is largely based on our knowledge of past events. Here, we present a method to supplement existing records of landslides in Great Britain by searching an electronic archive of regional newspapers. In Great Britain, the British Geological Survey (BGS) is responsible for updating and maintaining records of landslide events and their impacts in the National Landslide Database (NLD). The NLD contains records of more than 16,500 landslide events in Great Britain. Data sources for the NLD include field surveys, academic articles, grey literature, news, public reports and, since 2012, social media. We aim to supplement the richness of the NLD by (i) identifying additional landslide events, (ii) acting as an additional source of confirmation of events existing in the NLD and (iii) adding more detail to existing database entries. This is done by systematically searching the Nexis UK digital archive of 568 regional newspapers published in the UK. In this paper, we construct a robust Boolean search criterion by experimenting with landslide terminology for four training periods. We then apply this search to all articles published in 2006 and 2012. This resulted in the addition of 111 records of landslide events to the NLD over the 2 years investigated (2006 and 2012). We also find that we were able to obtain information about landslide impact for 60-90% of landslide events identified from newspaper articles. Spatial and temporal patterns of additional landslides identified from newspaper articles are broadly in line with those existing in the NLD, confirming that the NLD is a representative sample of landsliding in Great Britain. This method could now be applied to more time periods and/or other hazards to add richness to databases and thus improve our ability to forecast future events based on records of past events.
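
The kind of Boolean search criterion described can be sketched as below. The include/exclude terms are assumptions for illustration; the paper's actual criterion was developed empirically over four training periods:

```python
# Illustrative Boolean criterion for flagging landslide-related articles.
# Terms are assumptions, not the criterion constructed in the paper.
INCLUDE = ["landslide", "landslip", "mudslide", "rockfall"]
EXCLUDE = ["election landslide", "landslide victory", "landslide win"]

def matches_criterion(article_text):
    text = article_text.lower()
    if any(phrase in text for phrase in EXCLUDE):
        return False                       # metaphorical uses rejected
    return any(term in text for term in INCLUDE)

articles = [
    "A landslide blocked the A83 after heavy rain.",
    "The candidate won by a landslide victory on Thursday.",
]
hits = [a for a in articles if matches_criterion(a)]
```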

  16. A substitution method to improve completeness of events documentation in anesthesia records.

    PubMed

    Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis

    2015-12-01

    Anesthesia information management systems (AIMS) are optimized to find and display data and curves for one specific intervention, but not for retrospective analysis across a large volume of interventions. Such systems present two main limitations: (1) the transactional database architecture and (2) the completeness of documentation. To solve the architectural problem, data warehouses were developed to provide an architecture suitable for analysis. However, incompleteness of documentation remains unsolved. In this paper, we describe a method for determining substitution rules in order to detect missing anesthesia events in an anesthesia record. Our method is based on the principle that a missing event can be detected using a substitute event, defined as the nearest documented event. As an example, we focused on automatic detection of the start and end of the anesthesia procedure when these events were not documented by the clinicians. We applied our method to a set of records in order to evaluate (1) the event detection accuracy and (2) the improvement in valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia. We also increased data completeness by 21.1% (from 80.3% to 97.2% of the total database) for the start and end of anesthesia events. This method appears efficient for replacing missing "start and end of anesthesia" events, and could also be used to replace other missing time events in this particular data warehouse as well as in other kinds of data warehouses.
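
A minimal sketch of the substitution principle, assuming hypothetical event names and a zero-offset rule (the paper derives its substitution rules empirically from the data warehouse):

```python
from datetime import datetime, timedelta

# Hypothetical substitution rules: if an event is missing, take the
# timestamp of its nearest documented substitute event, plus an offset.
# Event names and offsets here are assumptions for illustration.
SUBSTITUTES = {
    "start_anesthesia": ("induction", timedelta(minutes=0)),
    "end_anesthesia": ("extubation", timedelta(minutes=0)),
}

def fill_missing(record):
    """Return a copy of the record with missing events replaced by
    their substitute event's timestamp plus the rule's offset."""
    filled = dict(record)
    for event, (substitute, offset) in SUBSTITUTES.items():
        if event not in filled and substitute in filled:
            filled[event] = filled[substitute] + offset
    return filled

rec = {"induction": datetime(2012, 3, 1, 8, 5),
       "extubation": datetime(2012, 3, 1, 10, 40)}
completed = fill_missing(rec)
```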

  17. Gradual onset and recovery of the Younger Dryas abrupt climate event in the tropics.

    PubMed

    Partin, J W; Quinn, T M; Shen, C-C; Okumura, Y; Cardenas, M B; Siringan, F P; Banner, J L; Lin, K; Hu, H-M; Taylor, F W

    2015-09-02

    Proxy records of temperature from the Atlantic clearly show that the Younger Dryas was an abrupt climate change event during the last deglaciation, but records of hydroclimate are underutilized in defining the event. Here we combine a new hydroclimate record from Palawan, Philippines, in the tropical Pacific, with previously published records to highlight a difference between hydroclimate and temperature responses to the Younger Dryas. Although the onset and termination are synchronous across the records, tropical hydroclimate changes are more gradual (>100 years) than the abrupt (10-100 years) temperature changes in the northern Atlantic Ocean. The abrupt recovery of Greenland temperatures likely reflects changes in regional sea ice extent. Proxy data and transient climate model simulations support the hypothesis that freshwater forced a reduction in the Atlantic meridional overturning circulation, thereby causing the Younger Dryas. However, changes in ocean overturning may not produce the same effects globally as in Greenland.

  18. Event-recording devices with identification codes

    NASA Technical Reports Server (NTRS)

    Watters, David G. (Inventor); Huestis, David L. (Inventor); Bahr, Alfred J. (Inventor); Vidmar, Robert J. (Inventor)

    2003-01-01

    A recording device allows wireless interrogation to determine its identity and its state. The state indicates whether one or more physical or chemical events have taken place. In effect, the one or more physical or chemical events are recorded by the device. The identity of the device allows it to be distinguished from a number of similar devices. The recording device may be used in an array of devices that allows wireless probing by an interrogation unit. When probed, each device tells the interrogator who it is and what state it is in. The devices allow multiple use and the interrogator may use a logical reset to determine the state of each device. The interrogator can thus easily identify particular items in an array that have reached a particular condition. The interrogator may record the status of each device in a database to maintain a history for each.
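
A toy software model of the interrogation scheme described (class and method names are assumptions; the patent describes hardware, not this API):

```python
# Hypothetical model of an array of passive event recorders: each device
# reports its identity and whether its recorded event has occurred.
class EventRecorder:
    def __init__(self, device_id):
        self.device_id = device_id
        self.triggered = False   # set when a physical/chemical event occurs

    def record_event(self):
        self.triggered = True

    def interrogate(self):
        """Wireless probe: 'who it is and what state it is in'."""
        return (self.device_id, self.triggered)

array = [EventRecorder(i) for i in range(4)]
array[2].record_event()
tripped = [dev_id for dev_id, state in (d.interrogate() for d in array) if state]
```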

  19. Consumer-based technology for distribution of surgical videos for objective evaluation.

    PubMed

    Gonzalez, Ray; Martinez, Jose M; Lo Menzo, Emanuele; Iglesias, Alberto R; Ro, Charles Y; Madan, Atul K

    2012-08-01

    The Global Operative Assessment of Laparoscopic Skill (GOALS) is one validated metric utilized to grade laparoscopic skills and has been utilized to score recorded operative videos. To facilitate easier viewing of these recorded videos, we are developing novel techniques to enable surgeons to view them. The objective of this study is to determine the feasibility of utilizing widespread current consumer-based technology to assist in distributing appropriate videos for objective evaluation. Videos from residents were recorded by connecting the camera processor's S-video output through a hub to a standard laptop computer's universal serial bus (USB) port. A standard consumer-based video editing program was utilized to capture the video and record it in an appropriate format. We utilized the mp4 format, and depending on the size of the file, the videos were scaled down (compressed), converted in format (using a standard video editing program), or sliced into multiple videos. Standard consumer-based programs were utilized to convert the video into a format more appropriate for handheld personal digital assistants. In addition, the videos were uploaded to a social networking website and video sharing websites. Recorded cases of laparoscopic cholecystectomy in a porcine model were utilized. Compression was required for all formats. All formats were accessed from home computers, work computers, and iPhones without difficulty. Qualitative analyses by four surgeons demonstrated quality appropriate for grading in all formats. Our preliminary results show promise that, utilizing consumer-based technology, videos can be easily distributed by various methods to surgeons for grading via GOALS. Easy accessibility may help make evaluation of resident videos less complicated and cumbersome.

  20. New technological developments provide deep-sea sediment density flow insights: the Monterey Coordinated Canyon Experiment

    NASA Astrophysics Data System (ADS)

    O'Reilly, T. C.; Kieft, B.; Chaffey, M. R.; Wolfson-Schwehr, M.; Herlien, R.; Bird, L.; Klimov, D.; Paull, C. K.; Gwiazda, R.; Lundsten, E. M.; Anderson, K.; Caress, D. W.; Sumner, E. J.; Simmons, S.; Parsons, D. R.; Talling, P.; Rosenberger, K. J.; Xu, J.; Maier, K. L.; Gales, J. A.

    2017-12-01

    The Monterey Coordinated Canyon Experiment (CCE) deployed an array of instruments along the Monterey Canyon floor to characterize the structure, velocity and frequency of sediment flows. CCE utilized novel technologies developed at MBARI to capture sediment flow data in unprecedented detail. 1. The Seafloor Instrument Node (SIN) at 1850 meters depth housed three ADCPs at three different frequencies, a CTD, a current meter, an oxygen optode, and a fluorometer/backscatter sensor, and logged data at 10 second intervals or faster. The SIN included an acoustic modem for communication with shore through a Wave Glider relay, and provided high-resolution measurements of three flow events during three successive deployments over 1.5 years. 2. Beachball-sized Benthic Event Detectors (BEDs) were deployed on or under the seafloor to measure the characteristics of sediment density flows. Each BED recorded data from a pressure sensor and a 3-axis accelerometer and gyro to characterize motions during transport events (e.g. tumble vs. rotation). An acoustic modem capable of operating through more than a meter of sediment enabled communications with a ship or autonomous surface vehicle. Multiple BEDs were deployed at various depths in the canyon during CCE, detecting and measuring many transport events; one BED moved 9 km down canyon in 50 minutes during one event. 3. A Wave Glider Hot Spot (HS), equipped with acoustic and RF modems, acted as a data relay between the SIN, BEDs and shore, and acoustically located BEDs after sediment density flows. In some cases the HS relayed BED motion data to shore within a few hours of the event. The HS also provided an acoustic console to the SIN, allowing shore-based users to check SIN health and status, perform maintenance, etc. 4. Mapping operations were conducted four times at the SIN site to quantify depositional and erosional patterns, utilizing a prototype ultra-high-resolution mapping system on the ROV Doc Ricketts. 
The system consists of a 400-kHz Reson 7125 multibeam sonar, a 3D at Depth SL1 subsea LiDAR, two stereo color cameras, and a Kearfott SeaDevil INS. At a survey altitude of 3 m above the bed, the mapping system provides 5-cm resolution multibeam bathymetry, 1-cm resolution lidar bathymetry, and 2-mm resolution photomosaics. We will describe the design and full capabilities of these novel systems.

  1. Passive acoustic monitoring to detect spawning in large-bodied catostomids

    USGS Publications Warehouse

    Straight, Carrie A.; Freeman, Byron J.; Freeman, Mary C.

    2014-01-01

    Documenting timing, locations, and intensity of spawning can provide valuable information for conservation and management of imperiled fishes. However, deep, turbid or turbulent water, or occurrence of spawning at night, can severely limit direct observations. We have developed and tested the use of passive acoustics to detect distinctive acoustic signatures associated with spawning events of two large-bodied catostomid species (River Redhorse Moxostoma carinatum and Robust Redhorse Moxostoma robustum) in river systems in north Georgia. We deployed a hydrophone with a recording unit at four different locations on four different dates when we could both record and observe spawning activity. Recordings captured 494 spawning events that we acoustically characterized using dominant frequency, 95% frequency, relative power, and duration. We similarly characterized 46 randomly selected ambient river noises. Dominant frequency did not differ between redhorse species and ranged from 172.3 to 14,987.1 Hz. Duration of spawning events ranged from 0.65 to 11.07 s, River Redhorse having longer durations than Robust Redhorse. Observed spawning events had significantly higher dominant and 95% frequencies than ambient river noises. We additionally tested software designed to automate acoustic detection. The automated detection configurations correctly identified 80–82% of known spawning events, and falsely identified spawns 6–7% of the time when none occurred. These rates were combined over all recordings; rates were more variable among individual recordings. Longer spawning events were more likely to be detected. Combined with sufficient visual observations to ascertain species identities and to estimate detection error rates, passive acoustic recording provides a useful tool to study spawning frequency of large-bodied fishes that displace gravel during egg deposition, including several species of imperiled catostomids.
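
A detector of this kind reduces, at its simplest, to thresholds on the measured acoustic features. The ranges below are loose assumptions inspired by the reported statistics, not the software's actual configuration:

```python
# Illustrative detector: flag a sound event as a candidate spawning event
# if its dominant frequency and duration fall in assumed ranges.
def is_candidate_spawn(dominant_freq_hz, duration_s):
    return 1000.0 <= dominant_freq_hz <= 15000.0 and 0.5 <= duration_s <= 12.0

events = [
    (8200.0, 2.3),   # gravel-displacement-like broadband burst
    (180.0, 4.0),    # low-frequency ambient river noise
    (9500.0, 0.1),   # too brief to be a spawning event
]
flags = [is_candidate_spawn(f, d) for f, d in events]
```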

  2. New Perspectives on the Frequency and Importance of Tambora-like Events

    NASA Astrophysics Data System (ADS)

    Verosub, K. L.

    2011-12-01

    The 1815 Tambora eruption is generally accepted as having had a significant impact on global climate. What is not clear is whether any earlier volcanic eruptions of about the same Volcanic Explosivity Index had similar impacts. The tree-ring record suggests that the 1600 eruption of Huaynaputina volcano in Peru may have been one such event. Although the instrumental record for this eruption is minimal, historical sources provide a wealth of data about the climatic impacts of this eruption. Famines in Russia and Estonia, late harvests in central Europe, and the early onset of winter conditions for lakes in Japan and harbors on the Baltic Sea document that 1601 was indeed a particularly cold and fairly wet year. Extensive Spanish and Jesuit archives covering the Americas and parts of Asia, plus records of the imperial court in China and the shogunate in Japan, should make it possible to obtain a global record of the human and social impacts of the 1600 eruption. The historical record can also be used to determine whether volcanic eruptions produced anomalously cold conditions in 1258 and 1453. Taken together, these four events imply that the return period for Tambora-like events is actually on the order of 200 years, a figure that is in agreement with estimates from ice core records. Since it has been 196 years since the last such event, it is worth considering what the impacts would be if another event were to occur within the next ten years. In particular, the current global agricultural economy may be less resilient to a very cold year than more regionally-based agriculture was in 1816.
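
The closing question can be given a rough number: if Tambora-like events follow a Poisson process with the ~200-year return period suggested here, the chance of one in the next decade is about 5 percent. A back-of-envelope check:

```python
import math

# Poisson assumption: events arrive at a constant mean rate of one per
# return period, so P(at least one in a window) = 1 - exp(-window/period).
return_period_years = 200.0
window_years = 10.0
p_at_least_one = 1.0 - math.exp(-window_years / return_period_years)
# roughly 0.049, i.e. about a 5% chance in the next ten years
```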

  3. EHR Big Data Deep Phenotyping

    PubMed Central

    Lenert, L.; Lopez-Campos, G.

    2014-01-01

    Summary Objectives Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods As the vast stores of genomic information increase with next generation sequencing, the importance of deep phenotyping increases. The growth of genomic data and adoption of Electronic Health Records (EHR) in medicine provides a unique opportunity to integrate phenotype and genotype data into medical records. The method by which collections of clinical findings and other health related data are leveraged to form meaningful phenotypes is an active area of research. Longitudinal data stored in EHRs provide a wealth of information that can be used to construct phenotypes of patients. We focus on a practical problem around data integration for deep phenotype identification within EHR data. Big data approaches are described that enable scalable markup of EHR events that can be used for semantic and temporal similarity analysis to support the identification of phenotype and genotype relationships. Conclusions Stead and colleagues' 2005 concept of using light standards to increase the productivity of software systems by riding on the wave of hardware/processing power is described as a harbinger for designing future healthcare systems. The big data solution, using flexible markup, provides a route to improved utilization of processing power for organizing patient records in genotype and phenotype research. PMID:25123744

  4. Main shock and aftershock records of the 1999 Izmit and Duzce, Turkey earthquakes

    USGS Publications Warehouse

    Celebi, M.; Akkar, Sinan; Gulerce, U.; Sanli, A.; Bundock, H.; Salkin, A.

    2001-01-01

    The August 17, 1999 Izmit (Turkey) earthquake (Mw=7.4) will be remembered as one of the largest earthquakes of recent times that affected a large urban environment (U.S. Geological Survey, 1999). This significant event was followed by many significant aftershocks and another main event (Mw=7.2) that occurred on November 12, 1999 near Duzce (Turkey). The shaking that caused the widespread damage and destruction was recorded by a handful of accelerographs (~30) in the earthquake area operated by different networks. The characteristics of these records show that the recorded peak accelerations, shown in Figure 1, even those from near-field stations, are smaller than expected (Çelebi, 1999, 2000). Following this main event, several organizations from Turkey, Japan, France and the USA deployed temporary accelerographs and other aftershock recording hardware. Thus, the number of recording stations in the earthquake-affected area was quadrupled (~130). As a result, as seen in Figure 2, smaller-magnitude aftershocks yielded larger peak accelerations, indicating that because of the sparse networks, recordings of larger motions during the main shock of August 17, 1999 were possibly missed.

  5. Combined adaptive multiple subtraction based on optimized event tracing and extended wiener filtering

    NASA Astrophysics Data System (ADS)

    Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo

    2017-06-01

    The surface-related multiple elimination (SRME) method is based on a feedback formulation and has become one of the most widely used multiple suppression methods. However, differences remain between the predicted multiples and those in the source seismic records, which may leave conventional adaptive multiple subtraction methods unable to suppress multiples effectively in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and a short-time-window FK filtering method. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on the optimized event tracing method and the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples while minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.
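
As background, adaptive subtraction in its simplest (one-coefficient) Wiener form works as sketched below. The paper's extended Wiener filter generalizes this to a short convolutional filter applied across channels; this single-trace sketch only shows the core least-squares idea:

```python
import math

def adaptive_subtract(data, predicted):
    """One-coefficient least-squares (Wiener) adaptive subtraction:
    find the scalar f minimizing ||data - f*predicted||^2, then subtract
    f*predicted from the data. A minimal sketch of the general case."""
    num = sum(d * p for d, p in zip(data, predicted))
    den = sum(p * p for p in predicted)
    f = num / den if den else 0.0
    return [d - f * p for d, p in zip(data, predicted)]

# Synthetic trace: a weak primary plus a scaled predicted multiple
n = 200
predicted_multiple = [math.sin(0.1 * i) for i in range(n)]
primary = [0.05 * math.cos(0.37 * i) for i in range(n)]
data = [p + 0.8 * m for p, m in zip(primary, predicted_multiple)]
residual = adaptive_subtract(data, predicted_multiple)
```

After subtraction the residual is dominated by the primary: most of the multiple energy is removed because the least-squares scalar recovers the 0.8 scaling.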

  6. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1991-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  7. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1989-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  8. 77 FR 59566 - Event Data Recorders

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-28

    ... DEPARTMENT OF TRANSPORTATION National Highway Traffic Safety Administration 49 CFR Part 563 [Docket No. NHTSA-2012-0099] RIN 2127-AL14 Event Data Recorders Correction In rule document 2012-19580.... 563.8 Data format [Corrected] On page 47557 in the table titled ``Table III--Reported Data Element...

  9. Quantifying the probability of record-setting heat events in the historical record and at different levels of climate forcing

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.

    2017-12-01

    Severe heat provides one of the most direct, acute, and rapidly changing impacts of climate on people and ecosystems. Theory, historical observations, and climate model simulations all suggest that global warming should increase the probability of hot events that fall outside of our historical experience. Given the acute impacts of extreme heat, quantifying the probability of historically unprecedented hot events at different levels of climate forcing is critical for climate adaptation and mitigation decisions. However, in practice that quantification presents a number of methodological challenges. This presentation will review those methodological challenges, including the limitations of the observational record and of climate model fidelity. The presentation will detail a comprehensive approach to addressing these challenges. It will then demonstrate the application of that approach to quantifying uncertainty in the probability of record-setting hot events in the current climate, as well as periods with lower and higher greenhouse gas concentrations than the present.
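
One of the simplest results in this area is that, under stationarity, n exchangeable past years give the next year a 1/(n+1) chance of setting a record; forcing raises that probability. A Monte Carlo sketch under an assumed Gaussian climate (an illustration, not the presentation's method):

```python
import random

random.seed(42)

def record_probability(n_years, shift, trials=10000):
    """Estimate the chance that next year exceeds the n-year record,
    with next year's mean shifted by `shift` standard deviations."""
    hits = 0
    for _ in range(trials):
        history = [random.gauss(0.0, 1.0) for _ in range(n_years)]
        new_year = random.gauss(shift, 1.0)   # mean shifted by forcing
        hits += new_year > max(history)
    return hits / trials

p_stationary = record_probability(50, shift=0.0)   # near 1/51 ~ 0.02
p_forced = record_probability(50, shift=1.0)       # substantially larger
```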

  10. Environmental forcing of terrestrial carbon isotope excursion amplification across five Eocene hyperthermals

    NASA Astrophysics Data System (ADS)

    Bowen, G. J.; Abels, H.

    2015-12-01

    Abrupt changes in the isotope composition of exogenic carbon pools accompany many major episodes of global change in the geologic record. The global expression of this change in substrates that reflect multiple carbon pools provides important evidence that many events reflect persistent, global redistribution of carbon between reduced and oxidized stocks. As the diversity of records documenting any event grows, however, discrepancies in the expression of carbon isotope change among substrates are almost always revealed. These differences in magnitude, pace, and pattern of change can complicate interpretations of global carbon redistribution, but under ideal circumstances can also provide additional information on changes in specific environmental and biogeochemical systems that accompanied the global events. Here we evaluate possible environmental influences on new terrestrial records of the negative carbon isotope excursions (CIEs) associated with multiple hyperthermals of the Early Eocene, which show a common pattern of amplified carbon isotope change in terrestrial paleosol carbonate records relative to that recorded in marine substrates. Scaling relationships between climate and carbon-cycle proxies suggest that the climatic (temperature) impact of each event scaled proportionally with the magnitude of its marine CIE, likely implying that all events involved release of reduced carbon with a similar isotopic composition. Amplification of the terrestrial CIEs, however, does not scale with event magnitude, being proportionally less for the first, largest event (the PETM). We conduct a sensitivity test of a coupled plant-soil carbon isotope model to identify conditions that could account for the observed CIE scaling. 
At least two possibilities consistent with independent lines of evidence emerge: first, varying effects of pCO2 change on photosynthetic carbon isotope discrimination under changing background pCO2, and second, contrasting changes in regional hydroclimate during the PETM and subsequent hyperthermals. These mechanisms have very different implications for the reconstruction of environmental conditions, and resolving the correct interpretation will require new, complementary records of plant and soil conditions associated with the Early Eocene hyperthermals.

  11. A media player causes clinically significant telemetry interference with implantable loop recorders.

    PubMed

    Thaker, Jay P; Patel, Mehul B; Shah, Ashok J; Liepa, Valdis V; Jongnarangsin, Krit; Thakur, Ranjan K

    2009-03-01

    The implantable loop recorder is a useful diagnostic tool for intermittent cardiovascular symptoms because it can automatically record arrhythmias as well as a patient-triggered ECG. Media players have been shown to cause telemetry interference with pacemakers. Telemetry interference may be important in patients with implantable loop recorders because capturing a patient-triggered ECG requires a telemetry link between a hand-held activator and the implanted device. The purpose of this study was to determine if a media player causes interference with implantable loop recorders. Fourteen patients with implantable loop recorders underwent evaluation for interference with a 15 GB third-generation iPod (Apple, Inc.) media player. All patients had the Reveal Plus (Medtronic, Inc.) implantable loop recorder. We tested for telemetry interference on the programmer by first establishing a telemetry link with the loop recorder and then placing the media player next to it, first turned off and then on. We evaluated telemetry interference between the activator and the implanted device by placing the activator over the device (normal use) and the media player next to it, first turned off and then on. We made 5 attempts to capture a patient-triggered ECG by depressing the activator switch 5 times while the media player was off or on. Telemetry interference on the programmer screen, consisting of either high-frequency spikes or blanking of the ECG channel, was seen in all patients. Telemetry interference with the activator resulted in failure to capture an event in 7 patients. In one of these patients, a green indicator light on the activator suggested that a patient-triggered event was captured, but loop recorder interrogation did not show a captured event. In the remaining 7 patients, an event was captured and appropriately recognized by the device at least 1 out of 5 times. 
A media player playing in close proximity to an implanted loop recorder may interfere with capture of a patient-triggered event. Patients should be advised to keep media players away from their implanted loop recorder.

  12. Drop Axis Ratio Distributions in Stratiform and Convective Rain

    NASA Technical Reports Server (NTRS)

    Thurai, M.; Bringi, V. N.; Petersen, W. A.; Schultz, C.

    2010-01-01

    A fully calibrated low profile 2D video disdrometer (2DVD) has been recording many different rainfall events in Northern Alabama (USA) since June 2007. An earlier publication reported drop shapes and axis ratio distributions determined for some of the events. For one of the cases examined, a noticeable shift in the 3.5 - 3.75 mm drop axis ratio distribution was noted. In this paper, we extend the earlier work by separating the 2DVD measurements into stratiform and convective rain. The separation is made possible by using the minute-by-minute drop size distribution (DSD) measured by the 2DVD. The 1-minute DSDs are fitted to a gamma distribution, and using a simple indexing technique which involves two of the fitted parameters, periods of convective and stratiform rain are separated for a given event. The output of the DSD indexing technique is qualitatively confirmed by comparing with simultaneous time series observations from a co-located UHF profiler which continuously records height profiles of reflectivity, Doppler mean and spectral width, all of which enable the identification of bright-band periods and, furthermore, periods of moderate and deep convection. Excellent consistency is found between the output of the DSD-based separation method and the profiler observations. Next, we utilize the output of DSD index-based separation method to flag the periods of severe convection for a given event. Drop axis ratios during the flagged periods are derived and compared with those during stratiform rain periods. Five cases have been considered. Axis ratio distributions do not show appreciable differences between stratiform and convective periods for four of the cases. The fifth case (the same case as reported earlier) shows a shift in the 3.5 - 3.75 mm drop axis ratios during a prolonged period of convection. 
The contoured shapes for these drops determined from the 2DVD camera data indicate the possibility of non-axisymmetric oscillations, compared with the contoured images for other events, which fit well to our reference drop shapes. For all of the above cases, observations from a C-band polarimetric radar situated 15 km away are examined. The variations between the co-polar radar reflectivity and the differential reflectivity, as well as the specific differential phase, are compared with 2DVD-based scattering calculations for the 5 events. The implications will be discussed.
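
For reference, the moment-based DSD fitting step can be sketched in its simplest form. This uses the exponential (mu = 0) special case of the gamma distribution the paper fits, with an illustrative binned DSD; the paper's actual indexing uses two parameters of the full gamma fit:

```python
def moment(diams_mm, counts, order):
    """n-th moment of a binned drop size distribution."""
    return sum(c * d ** order for d, c in zip(diams_mm, counts))

def fit_exponential_dsd(diams_mm, counts):
    """Moment fit of N(D) = N0 * exp(-lam * D), the mu = 0 special case
    of the gamma DSD. For the exponential, M_n = N0 * n! / lam^(n+1),
    so Dm = M4/M3 = 4/lam and N0 = M3 * lam^4 / 6."""
    m3 = moment(diams_mm, counts, 3)
    m4 = moment(diams_mm, counts, 4)
    dm = m4 / m3                 # mass-weighted mean diameter (mm)
    lam = 4.0 / dm
    n0 = m3 * lam ** 4 / 6.0
    return n0, lam, dm

# Illustrative 1-minute DSD (bin centers in mm, concentrations per bin)
diams = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
counts = [800, 400, 150, 50, 15, 4]
n0, lam, dm = fit_exponential_dsd(diams, counts)
```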

  13. A plate boundary earthquake record from a wetland adjacent to the Alpine fault in New Zealand refines hazard estimates

    NASA Astrophysics Data System (ADS)

    Cochran, U. A.; Clark, K. J.; Howarth, J. D.; Biasi, G. P.; Langridge, R. M.; Villamor, P.; Berryman, K. R.; Vandergoes, M. J.

    2017-04-01

    Discovery and investigation of millennial-scale geological records of past large earthquakes improve understanding of earthquake frequency, recurrence behaviour, and likelihood of future rupture of major active faults. Here we present a ∼2000 year-long, seven-event earthquake record from John O'Groats wetland adjacent to the Alpine fault in New Zealand, one of the most active strike-slip faults in the world. We linked this record with the 7000 year-long, 22-event earthquake record from Hokuri Creek (20 km along strike to the north) to refine estimates of earthquake frequency and recurrence behaviour for the South Westland section of the plate boundary fault. Eight cores from John O'Groats wetland revealed a sequence that alternated between organic-dominated and clastic-dominated sediment packages. Transitions from a thick organic unit to a thick clastic unit that were sharp, involved a significant change in depositional environment, and were basin-wide, were interpreted as evidence of past surface-rupturing earthquakes. Radiocarbon dates of short-lived organic fractions either side of these transitions were modelled to provide estimates for earthquake ages. Of the seven events recognised at the John O'Groats site, three post-date the most recent event at Hokuri Creek, two match events at Hokuri Creek, and two events at John O'Groats occurred in a long interval during which the Hokuri Creek site may not have been recording earthquakes clearly. The preferred John O'Groats-Hokuri Creek earthquake record consists of 27 events since ∼6000 BC for which we calculate a mean recurrence interval of 291 ± 23 years, shorter than previously estimated for the South Westland section of the fault and shorter than the current interseismic period. The revised 50-year conditional probability of a surface-rupturing earthquake on this fault section is 29%. The coefficient of variation is estimated at 0.41. 
We suggest the low recurrence variability is likely to be a feature of other strike-slip plate boundary faults similar to the Alpine fault.
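The recurrence statistics quoted above (mean interval, coefficient of variation, 50-year conditional probability) can be sketched in a few lines. The interval values below are hypothetical, and the lognormal renewal model and elapsed time are illustrative assumptions; this sketch will not reproduce the paper's exact 29% figure, which comes from the full 27-event record and the authors' own model.

```python
import math

# Hypothetical inter-event intervals (years); NOT the paper's 27-event record.
intervals = [250, 310, 280, 340, 260, 300, 290]

mean_ri = sum(intervals) / len(intervals)
var = sum((x - mean_ri) ** 2 for x in intervals) / (len(intervals) - 1)
cov = math.sqrt(var) / mean_ri  # coefficient of variation of recurrence

def lognorm_cdf(t, mean, cv):
    """CDF of a lognormal renewal distribution parameterised by mean and CV."""
    sigma = math.sqrt(math.log(1.0 + cv * cv))
    mu = math.log(mean) - 0.5 * sigma * sigma
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(elapsed, window, mean, cv):
    """P(rupture within `window` yr | quiescent for `elapsed` yr)."""
    f_e = lognorm_cdf(elapsed, mean, cv)
    f_w = lognorm_cdf(elapsed + window, mean, cv)
    return (f_w - f_e) / (1.0 - f_e)

# 50-yr probability assuming ~300 yr elapsed since the last rupture,
# using the paper's reported mean (291 yr) and CV (0.41).
p50 = conditional_prob(300.0, 50.0, 291.0, 0.41)
```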

  14. The INTIMATE event stratigraphy and recommendations for its use

    NASA Astrophysics Data System (ADS)

    Rasmussen, Sune O.

    2014-05-01

    The North Atlantic INTIMATE (INtegration of Ice-core, MArine and TErrestrial records) group has previously recommended an Event Stratigraphy approach for the synchronisation of records of the Last Termination using the Greenland ice core records as the regional stratotypes. A key element of these protocols has been the formal definition of numbered Greenland Stadials (GS) and Greenland Interstadials (GI) within the past glacial period as the Greenland expressions of the characteristic Dansgaard-Oeschger events that represent cold and warm phases of the North Atlantic region, respectively. Using a recent synchronization of the NGRIP, GRIP, and GISP2 ice cores that allows the parallel analysis of all three records on a common time scale, we here present an extension of the GS/GI stratigraphic template to the entire glacial period. In addition to the well-known sequence of Dansgaard-Oeschger events that were first defined and numbered in the ice core records more than two decades ago, a number of short-lived climatic oscillations have been identified in the three synchronized records. Some of these events have been observed in other studies, but we here propose a consistent scheme for discriminating and naming all the significant climatic events of the last glacial period that are represented in the Greenland ice cores. In addition to presenting the updated event stratigraphy, we make a series of recommendations on how to refer to these periods in a way that promotes unambiguous comparison and correlation between different proxy records, providing a more secure basis for investigating the dynamics and fundamental causes of these climatic perturbations. The work presented is a part of a manuscript under review for publication in Quaternary Science Reviews. Author team: S.O. Rasmussen, M. Bigler, S.P.E. Blockley, T. Blunier, S.L. Buchardt, H.B. Clausen, I. Cvijanovic, D. Dahl-Jensen, S.J. Johnsen, H. Fischer, V. Gkinis, M. Guillevic, W.Z. Hoek, J.J. Lowe, J. 
Pedro, T. Popp, I.K. Seierstad, J.P. Steffensen, A.M. Svensson, P. Vallelonga, B.M. Vinther, M.J.C. Walker, J.J. Wheatley, and M. Winstrup (deceased).

  15. Paired charcoal and tree-ring records of high-frequency Holocene fire from two New Mexico bog sites

    USGS Publications Warehouse

    Allen, Craig D.; Anderson, R. Scott; Jass, R.B.; Toney, J.L.; Baisan, C.H.

    2008-01-01

Two primary methods for reconstructing paleofire occurrence include dendrochronological dating of fire scars and stand ages from live or dead trees (extending back centuries into the past) and sedimentary records of charcoal particles from lakes and bogs, providing perspectives on fire history that can extend back for many thousands of years. Studies using both proxies have become more common in regions where lakes are present and fire frequencies are low, but are rare where high-frequency surface fires dominate and sedimentary deposits are primarily bogs and wetlands. Here we investigate sedimentary and fire-scar records of fire in two small watersheds in northern New Mexico, in settings recently characterised by relatively high-frequency fire where bogs and wetlands (Chihuahueños Bog and Alamo Bog) are more common than lakes. Our research demonstrates that: (1) essential features of the sedimentary charcoal record can be reproduced between multiple cores within a bog deposit; (2) evidence from both fire-scarred trees and charcoal deposits documents an anomalous lack of fire since ∼1900, compared with the remainder of the Holocene; (3) sedimentary charcoal records probably underestimate the recurrence of fire events at these high-frequency fire sites; and (4) the sedimentary records from these bogs are complicated by factors such as burning and oxidation of these organic deposits, diversity of vegetation patterns within watersheds, and potential bioturbation by ungulates. We consider a suite of particular challenges in developing and interpreting fire histories from bog and wetland settings in the Southwest. The identification of these issues and constraints with interpretation of sedimentary charcoal fire records does not diminish their essential utility in assessing millennial-scale patterns of fire activity in this dry part of North America. © IAWF 2008.

  16. Depth Discrimination Using Rg-to-Sg Spectral Amplitude Ratios for Seismic Events in Utah Recorded at Local Distances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tibi, Rigobert; Koper, Keith D.; Pankow, Kristine L.

Short-period fundamental-mode Rayleigh waves (Rg) are commonly observed on seismograms of anthropogenic seismic events and shallow, naturally occurring tectonic earthquakes (TEs) recorded at local distances. In the Utah region, strong Rg waves traveling with an average group velocity of about 1.8 km/s are observed at ~1 Hz on waveforms from shallow events (depth < 10 km) recorded at distances up to about 150 km. At these distances, Sg waves, which are direct shear waves traveling in the upper crust, are generally the dominant signals for TEs. In this study, we leverage the well-known observation that Rg amplitude decreases dramatically with increasing event depth to propose a new depth discriminant based on Rg-to-Sg spectral amplitude ratios. The approach is successfully used to discriminate shallow events (both earthquakes and anthropogenic events) from deeper TEs in the Utah region recorded at local distances (<150 km) by the University of Utah Seismograph Stations (UUSS) regional seismic network. Using Mood's median test, we obtained probabilities of nearly zero that the median Rg-to-Sg spectral amplitude ratios are the same between shallow events (both shallow TEs and anthropogenic events) on the one hand and deeper earthquakes on the other, suggesting a statistically significant difference in the estimated Rg-to-Sg ratios between the two populations. We also observed consistent disparities between different types of shallow events (e.g., mining blasts vs. mining-induced earthquakes), implying that it may be possible to separate the subpopulations that make up this group. Lastly, this suggests that, using local-distance Rg-to-Sg spectral amplitude ratios, one can not only discriminate shallow events from deeper events but may also be able to discriminate among different populations of shallow events.
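The core of the discriminant is a ratio of spectral amplitudes measured in two phase windows. A minimal sketch with synthetic windows and an assumed ~1 Hz Rg band follows; a real workflow would window actual seismograms by group velocity and calibrate the decision threshold on events of known depth.

```python
import math

def band_amplitude(signal, fs, f_lo, f_hi):
    """Mean spectral amplitude of `signal` between f_lo and f_hi (Hz), via a naive DFT."""
    n = len(signal)
    amps = []
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            amps.append(math.hypot(re, im) / n)
    return sum(amps) / len(amps)

fs = 20.0  # sample rate (Hz)
t = [i / fs for i in range(400)]  # 20 s windows

# Hypothetical windows: a shallow event carries strong ~1 Hz Rg energy,
# while the Sg window is dominated by higher-frequency shear energy.
rg_window = [math.sin(2 * math.pi * 1.0 * x) for x in t]
sg_window = [0.5 * math.sin(2 * math.pi * 4.0 * x) for x in t]

ratio = band_amplitude(rg_window, fs, 0.5, 1.5) / band_amplitude(sg_window, fs, 0.5, 1.5)
is_shallow = ratio > 1.0  # the threshold would be calibrated, not fixed at 1.0
```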

  17. Tropical Pacific climate variability over the last 6000 years as recorded in Bainbridge Crater Lake, Galápagos

    NASA Astrophysics Data System (ADS)

    Thompson, Diane M.; Conroy, Jessica L.; Collins, Aaron; Hlohowskyj, Stephan R.; Overpeck, Jonathan T.; Riedinger-Whitmore, Melanie; Cole, Julia E.; Bush, Mark B.; Whitney, H.; Corley, Timothy L.; Kannan, Miriam Steinitz

    2017-08-01

Finely laminated sediments within Bainbridge Crater Lake, Galápagos, provide a record of El Niño-Southern Oscillation (ENSO) events over the Holocene. Despite the importance of this sediment record, hypotheses for how climate variability is preserved in the lake sediments have not been tested. Here we present results of long-term monitoring of the local climate and limnology and a revised interpretation of the sediment record. Brown-green, organic-rich, siliciclastic laminae reflect warm, wet conditions typical of El Niño events, whereas carbonate and gypsum precipitate during cool, dry La Niña events and persistent dry periods, respectively. Applying this new interpretation, we find that ENSO events of both phases were generally less frequent during the mid-Holocene (∼6100-4000 calendar years B.P.) relative to the last 1500 calendar years. Abundant carbonate laminations between 3500 and 3000 calendar years B.P. imply that conditions in the Galápagos region were cool and dry during this period when the tropical Pacific E-W sea surface temperature (SST) gradient likely strengthened. The frequency of El Niño and La Niña events then intensified dramatically around 1750-2000 calendar years B.P., consistent with a weaker SST gradient and an increased frequency of ENSO events in other regional records. This strong interannual variability persisted until 700 calendar years B.P., when ENSO-related variability at the lake decreased as the SST gradient strengthened. Persistent, dry conditions then dominated between 300 and 50 calendar years B.P. (A.D. 1650-1900, ± 100 years), whereas wetter conditions and frequent El Niño events dominated in the most recent century.

  18. Correlating carbon and oxygen isotope events in early to middle Miocene shallow marine carbonates in the Mediterranean region using orbitally tuned chemostratigraphy and lithostratigraphy

    PubMed Central

    Piller, Werner E.; Reuter, Markus; Harzhauser, Mathias

    2015-01-01

During the Miocene, prominent oxygen isotope events (Mi-events) reflect major changes in glaciation, while carbon isotope maxima (CM-events) reflect changes in organic carbon burial, particularly during the Monterey carbon isotope excursion. However, despite their importance to global climate history, they have never been recorded in shallow marine carbonate successions. The Decontra section on the Maiella Platform (central Apennines, Italy) allows them to be resolved for the first time in such a setting during the early to middle Miocene. The present study improves the stratigraphic resolution of parts of the Decontra section via orbital tuning of high-resolution gamma ray (GR) and magnetic susceptibility data to the 405 kyr eccentricity metronome. The tuning allows, within the established biostratigraphic, sequence stratigraphic, and isotope stratigraphic frameworks, a precise correlation of the Decontra section with pelagic records of the Mediterranean region, as well as the global paleoclimatic record and the global sea level curve. Spectral series analyses of GR data further indicate that the 405 kyr orbital cycle is particularly well preserved during the Monterey Event. Since GR is a direct proxy for authigenic uranium precipitation during increased burial of organic carbon in the Decontra section, it follows the same long-term orbital pacing as observed in the carbon isotope records. The 405 kyr GR beat is thus correlated with the carbon isotope maxima observed during the Monterey Event. Finally, the Mi-events can now be recognized in the δ18O record and coincide with plankton-rich, siliceous, or phosphatic horizons in the lithology of the section. PMID:27546980

  19. Correlating carbon and oxygen isotope events in early to middle Miocene shallow marine carbonates in the Mediterranean region using orbitally tuned chemostratigraphy and lithostratigraphy

    NASA Astrophysics Data System (ADS)

    Auer, Gerald; Piller, Werner E.; Reuter, Markus; Harzhauser, Mathias

    2015-04-01

During the Miocene, prominent oxygen isotope events (Mi-events) reflect major changes in glaciation, while carbon isotope maxima (CM-events) reflect changes in organic carbon burial, particularly during the Monterey carbon isotope excursion. However, despite their importance to global climate history, they have never been recorded in shallow marine carbonate successions. The Decontra section on the Maiella Platform (central Apennines, Italy) allows them to be resolved for the first time in such a setting during the early to middle Miocene. The present study improves the stratigraphic resolution of parts of the Decontra section via orbital tuning of high-resolution gamma ray (GR) and magnetic susceptibility data to the 405 kyr eccentricity metronome. The tuning allows, within the established biostratigraphic, sequence stratigraphic, and isotope stratigraphic frameworks, a precise correlation of the Decontra section with pelagic records of the Mediterranean region, as well as the global paleoclimatic record and the global sea level curve. Spectral series analyses of GR data further indicate that the 405 kyr orbital cycle is particularly well preserved during the Monterey Event. Since GR is a direct proxy for authigenic uranium precipitation during increased burial of organic carbon in the Decontra section, it follows the same long-term orbital pacing as observed in the carbon isotope records. The 405 kyr GR beat is thus correlated with the carbon isotope maxima observed during the Monterey Event. Finally, the Mi-events can now be recognized in the δ18O record and coincide with plankton-rich, siliceous, or phosphatic horizons in the lithology of the section.
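The orbital-tuning step rests on detecting a dominant 405 kyr cycle in the proxy series. A toy spectral check on a synthetic proxy illustrates the idea; the values, sampling interval, and record length below are assumptions, not the Decontra data.

```python
import math

# Synthetic gamma-ray proxy sampled every 5 kyr over 2025 kyr (five 405-kyr cycles),
# standing in for a tuned GR log (illustrative values only).
dt = 5.0   # kyr per sample
n = 405    # samples -> 2025 kyr of record
gr = [10.0 + 2.0 * math.cos(2 * math.pi * i * dt / 405.0) for i in range(n)]

# Remove the mean, then scan DFT bins for the strongest periodic component.
mean = sum(gr) / n
detrended = [g - mean for g in gr]

best_k, best_amp = 0, 0.0
for k in range(1, n // 2):
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(detrended))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(detrended))
    amp = math.hypot(re, im)
    if amp > best_amp:
        best_k, best_amp = k, amp

dominant_period = n * dt / best_k  # kyr
```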

  20. A causal relationship between cough and gastroesophageal reflux disease (GERD) has been established: a pro/con debate.

    PubMed

    Kahrilas, Peter J; Smith, Jaclyn A; Dicpinigaitis, Peter V

    2014-02-01

    Along with upper airway cough syndrome (formerly, postnasal drip syndrome) and eosinophilic airway inflammation (asthma, nonasthmatic eosinophilic bronchitis), gastroesophageal reflux disease (GERD) is generally considered among the most common etiologies of chronic cough. Indeed, cough management guidelines published by numerous respiratory societies worldwide recommend evaluation and treatment of GERD as an integral component of the diagnostic/therapeutic algorithm for the management of chronic cough. However, a significant number of patients with chronic cough presumed due to GERD do not report improvement despite aggressive acid-suppressive therapy. Some of these refractory cases may be due to the recently appreciated entity of nonacid or weakly acidic reflux. Further contributing to the controversy are recent studies that demonstrate that patients with chronic cough do not have excessive reflux events relative to healthy volunteers. Although a temporal relationship between cough and reflux events has been suggested by studies utilizing impedance-pH monitoring of reflux events and objective cough recording, consensus is lacking in terms of whether this temporal relationship proves a causal link between reflux and cough. The fourth American Cough Conference (New York, June 2013) provided an ideal forum for the debate of this issue between two internationally recognized experts in the field of reflux and chronic cough.

  1. A Causal Relationship Between Cough and Gastroesophageal Reflux Disease (GERD) Has Been Established: a Pro/Con Debate

    PubMed Central

    Kahrilas, Peter J.; Smith, Jaclyn A.; Dicpinigaitis, Peter V.

    2014-01-01

    Along with upper airway cough syndrome (formerly, postnasal drip syndrome) and eosinophilic airway inflammation (asthma, non-asthmatic eosinophilic bronchitis), gastroesophageal reflux disease (GERD) is generally considered among the most common etiologies of chronic cough. Indeed, cough management guidelines published by numerous respiratory societies worldwide recommend evaluation and treatment of GERD as an integral component of the diagnostic/therapeutic algorithm for the management of chronic cough. However, a significant number of patients with chronic cough presumed due to GERD do not report improvement despite aggressive acid-suppressive therapy. Some of these refractory cases may be due to the recently appreciated entity of non-acid or weakly acidic reflux. Further contributing to the controversy are recent studies demonstrating that patients with chronic cough do not have excessive reflux events relative to healthy volunteers. Although a temporal relationship between cough and reflux events has been suggested by studies utilizing impedance-pH monitoring of reflux events and objective cough recording, consensus is lacking in terms of whether this temporal relationship proves a causal link between reflux and cough. The 4th American Cough Conference, held in New York in June, 2013, provided an ideal forum for the debate of this issue between two internationally recognized experts in the field of reflux and chronic cough. PMID:24221340

  2. Physiological reactivity to nonidiographic virtual reality stimuli in veterans with and without PTSD

    PubMed Central

    Webb, Andrea K; Vincent, Ashley L; Jin, Alvin B; Pollack, Mark H

    2015-01-01

Background Post-traumatic stress disorder (PTSD) currently is diagnosed via clinical interview, in which subjective self-reports of traumatic events and associated experiences are discussed with a mental health professional. The reliability and validity of diagnoses can be improved with the use of objective physiological measures. Methods In this study, physiological activity was recorded from 58 male veterans (PTSD Diagnosis: n = 16; Trauma Exposed/No PTSD Diagnosis: n = 23; No Trauma/No PTSD Diagnosis: n = 19) with and without PTSD and combat trauma exposure in response to emotionally evocative non-idiographic virtual reality stimuli. Results Statistically significant differences among the Control, Trauma, and PTSD groups were present during the viewing of two virtual reality videos. Skin conductance and interbeat interval features were extracted for each of ten video events (five events of increasing severity per video). These features were submitted to three stepwise discriminant function analyses to assess classification accuracy for the Control versus Trauma, Control versus PTSD, and Trauma versus PTSD pairings of participant groups. Leave-one-out cross-validation classification accuracy was between 71 and 94%. Conclusions These results are promising and suggest the utility of objective physiological measures in assisting with PTSD diagnosis. PMID:25642387
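Leave-one-out cross-validation of the kind reported can be illustrated with a stand-in classifier. The nearest-centroid rule and the feature values below are assumptions for the sketch, not the paper's stepwise discriminant functions or data.

```python
import math

def loo_accuracy(features, labels):
    """Leave-one-out accuracy of a nearest-centroid classifier: hold out one
    sample, fit class centroids on the rest, classify the held-out sample."""
    correct = 0
    for i in range(len(features)):
        train = [(f, l) for j, (f, l) in enumerate(zip(features, labels)) if j != i]
        centroids = {}
        for lab in set(l for _, l in train):
            pts = [f for f, l in train if l == lab]
            centroids[lab] = [sum(c) / len(pts) for c in zip(*pts)]
        pred = min(centroids, key=lambda lab: math.dist(features[i], centroids[lab]))
        correct += pred == labels[i]
    return correct / len(features)

# Hypothetical 2-D features: [mean skin conductance response, mean interbeat interval (ms)]
control = [[0.2, 900], [0.3, 880], [0.25, 910], [0.22, 905]]
ptsd = [[0.9, 700], [1.1, 720], [0.95, 690], [1.0, 710]]
acc = loo_accuracy(control + ptsd, ["control"] * 4 + ["ptsd"] * 4)
```

With these well-separated toy groups every held-out sample is classified correctly; real physiological data overlap far more, which is why the paper reports 71-94% rather than 100%.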

  3. PURPA and Photovoltaics: A Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaim, T.

On May 16, 1983, the U.S. Supreme Court struck down the last major challenge to the Public Utility Regulatory Policies Act (PURPA) and its implementing regulations. In so doing, the Supreme Court upheld the right of photovoltaic and other qualifying investors to interconnect with electric utilities and to sell power at rates equal to the utility's full avoided cost. To appreciate the significance of this event for U.S. markets, it is necessary to review the recent five-year history of PURPA-related events.

  4. Violence-related Versus Terror-related Stabbings: Significant Differences in Injury Characteristics.

    PubMed

    Rozenfeld, Michael; Givon, Adi; Peleg, Kobi

    2018-05-01

To demonstrate the gap between the injury epidemiology of terror-related stabbings (TRS) and non-terror-related intentional stabbings. Terror attacks with sharp instruments have multiplied recently, with many victims of these incidents presenting to hospitals with penetrating injuries. Because most of surgeons' practical experience with intentional stabbing injuries comes from treating victims of interpersonal violence, potential gaps in knowledge may exist if injuries from TRS differ significantly from interpersonal stabbings (IPS). A retrospective study of 1615 patients from intentional stabbing events recorded in the Israeli National Trauma Registry during the period of the "Knife Intifada" (January 2013-March 2016). All stabbings were divided into TRS and IPS. The two categories were compared in terms of sustained injuries, utilization of hospital resources, and clinical outcomes. TRS patients were older, comprised more females, and were ethnically homogeneous. Most IPS incidents happened on weekdays and at night, whereas TRS events peaked midweek during morning and afternoon hours. TRS patients had more injuries of the head, face, and neck, and more severe head and neck injuries. IPS patients had more abdominal injuries; however, the respective injuries in the TRS group were more severe. The greater injury severity of the TRS patients was reflected in their higher utilization of hospital resources and greater in-hospital mortality. Victims of terror stabbings are profoundly different in their characteristics: they sustain injuries of a different profile and greater severity, require more hospital resources, and have worse clinical outcomes, emphasizing the need for healthcare systems to adjust appropriately to deal successfully with future terror attacks.

  5. Evaluating the Performance of the IEEE Standard 1366 Method for Identifying Major Event Days

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; LaCommare, Kristina Hamachi; Sohn, Michael D.

IEEE Standard 1366 offers a method for segmenting reliability performance data to isolate the effects of major events from the underlying year-to-year trends in reliability. Recent analysis by the IEEE Distribution Reliability Working Group (DRWG) has found that the reliability performance of some utilities differs from the expectations that helped guide the development of the Standard 1366 method. This paper proposes quantitative metrics to evaluate the performance of the Standard 1366 method in identifying major events and in reducing year-to-year variability in utility reliability. The metrics are applied to a large sample of utility-reported reliability data to assess performance of the method with alternative specifications that have been considered by the DRWG. We find that none of the alternatives performs uniformly 'better' than the current Standard 1366 method. That is, none of the modifications uniformly lowers the year-to-year variability in the System Average Interruption Duration Index without major events. Instead, for any given alternative, while it may lower the value of this metric for some utilities, it also increases it for others (sometimes dramatically). Thus, we illustrate some of the trade-offs that must be considered in using the Standard 1366 method and highlight the usefulness of the metrics we have proposed in conducting these evaluations.
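The Standard 1366 method referenced here is the "2.5 beta" approach: fit a lognormal distribution to the history of non-zero daily SAIDI values and flag any day exceeding T_MED = exp(alpha + 2.5*beta) as a major event day. A minimal sketch follows; the standard specifies five years of daily history, and the values below are hypothetical.

```python
import math

def major_event_days(daily_saidi, k=2.5):
    """Standard 1366 "2.5 beta" method (sketch): log-transform non-zero daily
    SAIDI, estimate alpha (mean) and beta (sample std) of the logs, and flag
    days whose SAIDI exceeds T_MED = exp(alpha + k*beta)."""
    logs = [math.log(v) for v in daily_saidi if v > 0]
    alpha = sum(logs) / len(logs)
    beta = math.sqrt(sum((x - alpha) ** 2 for x in logs) / (len(logs) - 1))
    t_med = math.exp(alpha + k * beta)
    return t_med, [i for i, v in enumerate(daily_saidi) if v > t_med]

# Hypothetical daily SAIDI (minutes/day): ordinary days plus one storm day.
history = [1.0, 1.2, 0.9, 1.1, 1.3, 0.8, 1.0, 1.1, 0.95, 45.0]
threshold, med_days = major_event_days(history)
```

Only the storm day clears the threshold here; the DRWG alternatives debated in the paper amount to varying k or the fitting details, which shifts exactly this cut-off.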

  6. Wireless event-recording device with identification codes

    NASA Technical Reports Server (NTRS)

    Watters, David G. (Inventor); Huestis, David L. (Inventor); Bahr, Alfred J. (Inventor)

    2004-01-01

    A wireless recording device can be interrogated to determine its identity and its state. The state indicates whether a particular physical or chemical event has taken place. In effect, the physical or chemical event is recorded by the device. The identity of the device allows it to be distinguished from a number of similar devices. Thus the sensor device may be used in an array of devices that can be probed by a wireless interrogation unit. The device tells the interrogator who it is and what state it is in. The interrogator can thus easily identify particular items in an array that have reached a particular condition.
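The interrogation pattern described, each device in an array reporting an identity code plus a recorded state, can be sketched as below. The field names and the two-state model are illustrative assumptions, not the patent's protocol.

```python
# Hypothetical device array: each recorder carries an identification code and a
# state indicating whether its physical/chemical event has occurred.
devices = [
    {"id": "A1", "state": "triggered"},  # event has been recorded
    {"id": "A2", "state": "armed"},      # no event yet
    {"id": "B7", "state": "triggered"},
]

def interrogate(devices, condition="triggered"):
    """Return the identification codes of devices reporting the given state,
    letting the interrogator pick out items that reached a particular condition."""
    return [d["id"] for d in devices if d["state"] == condition]

triggered_ids = interrogate(devices)
```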

  7. Increasing use of high-speed digital imagery as a measurement tool on test and evaluation ranges

    NASA Astrophysics Data System (ADS)

    Haddleton, Graham P.

    2001-04-01

    In military research and development or testing there are various fast and dangerous events that need to be recorded and analysed. High-speed cameras allow the capture of movement too fast to be recognised by the human eye, and provide data that is essential for the analysis and evaluation of such events. High-speed photography is often the only type of instrumentation that can be used to record the parameters demanded by our customers. I will show examples where this applied cinematography is used not only to provide a visual record of events, but also as an essential measurement tool.

  8. First Quarter Hanford Seismic Report for Fiscal Year 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.

    2010-03-29

The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 81 local earthquakes during the first quarter of FY 2010. Sixty-five of these earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this quarter are a continuation of the swarm events observed during fiscal year 2009 and reported in previous quarterly and annual reports (Rohay et al. 2009a, 2009b, 2009c, and 2009d). Most of the events were considered minor (coda-length magnitude [Mc] less than 1.0), with only one event in the 2.0-3.0 range; the maximum magnitude event (2.5 Mc) occurred on December 22 at a depth of 2.1 km. The average depth of the Wooded Island events during the quarter was 1.4 km, with a maximum depth estimated at 3.1 km. This placed the Wooded Island events within the Columbia River Basalt Group (CRBG). The low magnitude of the Wooded Island events has made them undetectable to all but local area residents. The Hanford SMA network was triggered several times by these events, and the SMA recordings are discussed in section 6.0. During the last year, some Hanford employees working within a few miles of the swarm area and individuals living directly across the Columbia River from the swarm center have reported feeling many of the larger magnitude events. Strong motion accelerometer (SMA) units installed directly above the swarm area at ground surface measured peak ground accelerations approaching 15% g, the largest values recorded at Hanford. This corresponds to strong shaking of the ground, consistent with what people in the local area have reported. However, the duration and magnitude of these swarm events should not result in any structural damage to facilities.
The USGS performed a geophysical survey using satellite interferometry that detected approximately 1 inch of uplift in surface deformation along an east-west transect within the swarm area. The uplift is thought to be caused by the release of pressure that has built up in sedimentary layers, cracking the brittle basalt layers within the Columbia River Basalt Group (CRBG) and causing the earthquakes. Similar earthquake swarms were recorded near this same location in 1970, 1975, and 1988, but without SMA readings or satellite imagery. Prior to the 1970s, swarming may have occurred, but equipment was not in place to record those events. The Wooded Island swarm, due to its location and the limited magnitude of the events, does not appear to pose any significant risk to Hanford waste storage facilities. Since swarms of the past did not intensify in magnitude, seismologists do not expect that these events will persist or increase in intensity. However, Pacific Northwest National Laboratory (PNNL) will continue to monitor the activity. Outside of the Wooded Island swarm, sixteen earthquakes were recorded, all minor events. Seven earthquakes were located at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and nine earthquakes at depths greater than 9 km, within the basement. Geographically, seven earthquakes were located in known swarm areas and nine earthquakes were classified as random events.

  9. Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.

    PubMed

    Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng

    2016-12-08

This paper studies the problem of fault detection based on an adaptively adjusted event-triggering mechanism for a class of discrete-time networked control systems (NCSs), with applications to aircraft dynamics. By taking into account the fault occurrence detection progress and the fault occurrence probability, and by introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve efficient utilization of the communication network bandwidth. Both the sensor-to-control-station and the control-station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of the fault detection filter (FDF) and controller is presented. A new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism and of the simultaneous design of the FDF and controller.
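A generic relative-threshold event trigger with an adaptively adjusted parameter conveys the bandwidth-saving idea: transmit a sample only when it deviates enough from the last transmitted value. This sketch is illustrative only; the paper's mechanism additionally couples the parameter to fault-detection progress and fault occurrence probability.

```python
def run_trigger(samples, sigma0=0.1, gain=0.05):
    """Event-triggered transmission of a scalar state: send sample k only when
    the squared deviation from the last sent value exceeds sigma * x_k^2,
    where sigma is adapted online (raised after a send, lowered when quiet)."""
    sigma = sigma0
    last_sent = samples[0]
    transmissions = [0]  # the first sample is always transmitted
    for k, x in enumerate(samples[1:], start=1):
        err = (x - last_sent) ** 2
        if err > sigma * x ** 2:             # trigger condition satisfied
            transmissions.append(k)
            last_sent = x
            sigma = min(0.5, sigma + gain)   # after a send: relax the threshold
        else:
            sigma = max(0.01, sigma - gain)  # quiet period: grow more sensitive
    return transmissions

# A slowly drifting state: only a fraction of samples should need sending.
signal = [1.0 + 0.02 * k for k in range(50)]
sent = run_trigger(signal)
bandwidth_saving = 1 - len(sent) / len(signal)
```

The fraction of suppressed samples is the network-bandwidth saving the abstract refers to; the adaptation rule here is one plausible choice among many.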

  10. The nature, use and problems of historical archives for the temporal occurrence of landslides, with specific reference to the south coast of Britain, Ventnor, Isle of Wight

    NASA Astrophysics Data System (ADS)

    Ibsen, Maïa-Laura; Brunsden, Denys

    1996-04-01

The purpose of this paper is to describe and evaluate the nature of the European historical archives that are suitable for assessing temporal occurrence and forecasting in landslide studies, using the British south coast as an example. The paper is based upon the British contribution to the Environment programme EPOCH, 1991-1993. A primary requirement of a research programme on process occurrence is to determine event frequencies on as many time and space scales as possible. Thus, the analysis of archives is, potentially, an essential preliminary to the study of the temporal occurrence of landslide events. The range of such data sources extends from isolated, fortuitously dated sites from the Quaternary assemblage, through inferred event impacts using dendrochronology or lichenometric time series, to historical records of causal factors such as rainfall data and, more recently, deliberately recorded packages of cumulative or continuous data. Most countries have extensive historical sources which may be of considerable value in establishing the characteristics of geomorphological processes. These include narrative in literature, prints and other artwork, terrestrial and aerial photographs, remote sensing series, newspapers, incidental statements, and scientific journals and reports. There are numerous difficulties in accessing, extracting, organising, databasing and analysing such data because they are not usually collated for scientific use. Problems involve such incalculable errors as: the experience, training and conscientiousness of the observer; the editing and recording process; judging the validity of the data used; and the haphazard nature of recorded events in time and space. Despite these difficulties, such data do yield a record which adds to the representative temporal sample as a level above some threshold reporting position. It therefore has potential for specific statistical analysis.
An example of a reasonable temporal landslide record is the database of the Ventnor complex on the Isle of Wight, initially established in 1991 by Geomorphological Services Limited (GSL), now of Rendel Geotechnics, and supplemented by the collections of the first author. The record displays an increase in landslide events over the present century, due probably to increasing technology and awareness of hazard and the development of process geomorphology. However, the landslide record was subsequently correlated with the Ventnor precipitation series. This indicated that wet-year sequences usually gave rise to significant landslide events. The increasing variability and number of rainfall events predicted by various climate models (e.g., by the Hadley Centre) may therefore indicate a fundamental increase in landslide events in the future.

  11. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    NASA Astrophysics Data System (ADS)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic measures (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, sustained time-series record. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms increase the difficulty of automatic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through change point detection (CPD) on the seismic record. In this method, an anomalous signal (a seismic event) is treated as a change point in the time series (the seismic record): the statistical model of the signal changes in the neighborhood of an event because of the event's occurrence. This means that SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of past statistics on future data, which makes it a robust algorithm for non-stationary signal processing.
With these three advantages, SDAR can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
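    The sequentially discounting AR update described above can be sketched in a few lines. This is a simplified AR(1), Gaussian-noise version with an illustrative discounting parameter, not the authors' implementation:

```python
import math

def sdar_scores(x, r=0.05):
    """Change-point scores from a sequentially discounting AR(1) model.

    Simplified SDAR sketch: sufficient statistics are updated with
    exponential forgetting controlled by the discounting parameter r, and
    the score is the negative log predictive likelihood of each new sample
    (higher = more anomalous). Parameter names are illustrative.
    """
    mu, a, sigma2 = 0.0, 0.0, 1.0  # running mean, AR coefficient, noise variance
    c, v = 0.0, 1.0                # discounted lag-1 covariance and variance
    prev = x[0]
    scores = []
    for xt in x:
        pred = mu + a * (prev - mu)  # one-step AR(1) prediction
        err = xt - pred
        scores.append(0.5 * (math.log(2 * math.pi * sigma2) + err * err / sigma2))
        # discounted updates of the sufficient statistics
        mu = (1 - r) * mu + r * xt
        c = (1 - r) * c + r * (xt - mu) * (prev - mu)
        v = (1 - r) * v + r * (prev - mu) ** 2
        a = c / v if v > 1e-12 else 0.0
        sigma2 = (1 - r) * sigma2 + r * err * err
        prev = xt
    return scores
```

    On a synthetic series with an abrupt level shift, the score peaks at the change point, which is how an event would be flagged in a low-SNR record.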

  12. Association of single nucleotide polymorphisms in a glutamate receptor gene (GRM8) with theta power of event-related oscillations and alcohol dependence.

    PubMed

    Chen, Andrew C H; Tang, Yongqiang; Rangaswamy, Madhavi; Wang, Jen C; Almasy, Laura; Foroud, Tatiana; Edenberg, Howard J; Hesselbrock, Victor; Nurnberger, John; Kuperman, Samuel; O'Connor, Sean J; Schuckit, Marc A; Bauer, Lance O; Tischfield, Jay; Rice, John P; Bierut, Laura; Goate, Alison; Porjesz, Bernice

    2009-04-05

    Evidence suggests that the P3 amplitude of the event-related potential and its underlying superimposed event-related oscillations (EROs), primarily in the theta (4-5 Hz) and delta (1-3 Hz) frequencies, are endophenotypes for the risk of alcoholism and other disinhibitory disorders. Major neurochemical substrates contributing to theta and delta rhythms and P3 involve strong GABAergic, cholinergic and glutamatergic system interactions. The aim of this study was to test potential associations between single nucleotide polymorphisms (SNPs) in glutamate receptor genes and ERO quantitative traits. GRM8 was selected because it maps to chromosome 7q31.3-q32.1, under the peak region where we previously identified significant linkage (peak LOD = 3.5) in a genome-wide linkage scan of the same phenotype (event-related theta band for the target visual stimuli). Neural activity recorded from scalp electrodes during a visual oddball task, in which rare target stimuli elicited P3s, was analyzed in a subset of the Collaborative Study on the Genetics of Alcoholism (COGA) sample comprising 1,049 Caucasian subjects from 209 families (with 472 DSM-IV alcohol-dependent individuals). The family-based association test (FBAT) detected significant associations (P < 0.05) between multiple SNPs in the GRM8 gene and event-related theta power to target visual stimuli, and also with alcohol dependence, even after correction for multiple comparisons by false discovery rate (FDR). Our results suggest that variation in GRM8 may be involved in modulating event-related theta oscillations during information processing and also in vulnerability to alcoholism. These findings underscore the utility of electrophysiology and the endophenotype approach in the genetic study of psychiatric disorders. (c) 2008 Wiley-Liss, Inc.

  13. 42 CFR 456.212 - Records and reports.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Records and reports. 456.212 Section 456.212 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Mental Hospitals Ur Plan...

  14. 18 CFR 125.3 - Schedule of records and periods of retention.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and agreements. 4. Accountants' and auditors' reports. Information Technology Management 5. Automatic.... Vouchers. Insurance 12. Insurance records. Operations and Maintenance 13.1. Production—Public utilities and licensees (less nuclear). 13.2 Production—Nuclear. 14. Transmission and distribution—Public utilities and...

  15. The Feasibility of the Nationwide Health Information Network.

    PubMed

    Valle, Jazmine; Gomes, Christian; Godby, Tyler; Coustasse, Alberto

    2016-01-01

    The Nationwide Health Information Network (NHIN) use in health care facilities was examined for utilization and efficacy; although the advantages are abundant, health care facilities have been reluctant to adopt it because of associated costs. The purpose of this study was to analyze the feasibility of a US NHIN by exploring and determining the benefits of an NHIN and assessing the barriers to its implementation. The research methodology applied in examining the implementation of NHIN in the United States was a qualitative literature review, which followed the basic guidelines of a systematic literature review, partnered with a semistructured interview of a chief information officer of a private, nonprofit, 193-bed hospital located in Westminster, Maryland. A total of 33 sources were referenced. The results of this study suggest that implementation and utilization of NHIN by health care industry stakeholders lead to an increased quality of patient care, increased patient-provider communication, and cost-savings opportunities. Increased quality of care is achieved by reducing adverse drug events and medical errors. Cost-savings opportunities are generated by a reduction in spending and prices that is attributable to electronic health record systems' increased efficiency and effectiveness. Nevertheless, barriers to NHIN implementation and utilization still remain throughout the health care industry, the main one being concerns about interoperability.

  16. Characterizing the Kathmandu Valley sediment response through strong motion recordings of the 2015 Gorkha earthquake sequence

    USGS Publications Warehouse

    Rajaure, S.; Asimaki, Domniki; Thompson, Eric M.; Hough, Susan E.; Martin, Stacey; Ampuero, J.P.; Dhital, M.R.; Inbal, A; Takai, N; Shigefuji, M.; Bijukchhen, S; Ichiyanagi, M; Sasatani, T; Paudel, L

    2017-01-01

    We analyze strong motion records and high-rate GPS measurements of the M 7.8 Gorkha mainshock, the M 7.3 Dolakha aftershock, and two moderate aftershocks recorded at four stations on the Kathmandu basin sediments and one on rock outcrop. Recordings on soil from all four events show systematic amplification relative to the rock site at multiple frequencies in the 0.1–2.5 Hz range, and de-amplification at higher frequencies (>2.5–10 Hz). The soil-to-rock amplification ratios for the M 7.8 and M 7.3 events have lower-amplitude and lower-frequency peaks relative to the ratios of the two moderate events, effects that could be suggestive of nonlinear site response. Further, comparisons to ground motion prediction equations show that 1) both soil and rock mainshock recordings were severely depleted of high frequencies, and 2) this depletion at high frequencies is not present in the aftershocks. These observations indicate that the high-frequency deamplification is additionally related to characteristics of the source that are not captured by simplified ground motion prediction equations, and suggest that seismic hazard analysis models may need to be revised, possibly by treating isolated high-frequency radiation sources separately from long-period components, to capture large-magnitude near-source events such as the 2015 Gorkha mainshock.
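    The soil-to-rock amplification ratios discussed above are, in essence, ratios of Fourier amplitude spectra. A generic sketch follows; the smoothing choice is illustrative, and this is not the authors' processing chain:

```python
import numpy as np

def amplification_ratio(soil, rock, dt):
    """Soil-to-rock spectral amplification ratio (generic sketch).

    Returns frequencies and the ratio of lightly smoothed Fourier amplitude
    spectra of a soil-site and a rock-site recording sampled every dt seconds.
    """
    n = min(len(soil), len(rock))
    freqs = np.fft.rfftfreq(n, d=dt)
    a_soil = np.abs(np.fft.rfft(soil[:n]))
    a_rock = np.abs(np.fft.rfft(rock[:n]))
    kernel = np.ones(5) / 5.0  # crude moving-average smoothing of the spectra
    a_soil = np.convolve(a_soil, kernel, mode="same")
    a_rock = np.convolve(a_rock, kernel, mode="same")
    return freqs, a_soil / np.maximum(a_rock, 1e-12)
```

    A soil site resonating near 1 Hz would show a peak near 1 Hz in this ratio; nonlinear site response would shift and flatten such peaks between weak and strong events, as observed above.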

  17. Long-term records reveal decoupling of nitrogen and phosphorus cycles in a large, urban lake in response to an extreme rainfall event

    NASA Astrophysics Data System (ADS)

    Corman, J. R.; Loken, L. C.; Oliver, S. K.; Collins, S.; Butitta, V.; Stanley, E. H.

    2017-12-01

    Extreme events can play powerful roles in shifting ecosystem processes. In lakes, heavy rainfall can transport large amounts of particulates and dissolved nutrients into the water column and, potentially, alter biogeochemical cycling. However, the impacts of extreme rainfall events are often difficult to study due to a lack of long-term records. In this paper, we combine daily discharge records with long-term lake water quality information collected by the North Temperate Lakes Long-Term Ecological Research (NTL LTER) site to investigate the impacts of extreme events on nutrient cycling in lakes. We focus on Lake Mendota, an urban lake within the Yahara River Watershed in Madison, Wisconsin, USA, where nutrient data are available at least seasonally from 1995 - present. In June 2008, precipitation amounts in the Yahara watershed were 400% above normal values, triggering the largest discharge event on record for the 40 years of monitoring at the streamgage station; hence, we are able to compare water quality records before and after this event as a case study of how extreme rain events couple or decouple lake nutrient cycling. Following the extreme event, the lake-wide mass of nitrogen and phosphorus increased in the summer of 2008 by 35% and 21%, respectively, shifting lake stoichiometry by increasing N:P ratios (Figure 1). Nitrogen concentrations remained elevated longer than phosphorus, suggesting (1) that nitrogen inputs into the lake were sustained longer than phosphorus (i.e., a "smear" versus "pulse" loading of nitrogen versus phosphorus, respectively, in response to the extreme event) and/or (2) that in-lake biogeochemical processing was more efficient at removing phosphorus compared to nitrogen. 
While groundwater loading data are currently unavailable to test the former hypothesis, preliminary data from surficial nitrogen and phosphorus loading to Lake Mendota (available for 2011 - 2013) suggest that nitrogen removal efficiency is less than phosphorus, supporting the latter hypothesis. As climate change is expected to increase the frequency of extreme events, continued monitoring of lakes is needed to understand biogeochemical responses and when and how water quality threats may occur.

  18. Analysis of the geophysical data using a posteriori algorithms

    NASA Astrophysics Data System (ADS)

    Voskoboynikova, Gyulnara; Khairetdinov, Marat

    2016-04-01

    Monitoring, predicting, and preventing extraordinary natural and technogenic events are priority problems of our time. Such events include earthquakes, volcanic eruptions, lunar-solar tides, landslides, falling celestial bodies, explosions of stockpiled ammunition during disposal, and the numerous quarry blasts in open coal mines that provoke technogenic earthquakes. Monitoring proceeds through successive stages, including remote registration of event responses and measurement of the main parameters, such as seismic wave arrival times or the original waveforms. At the final stage, the inverse problems of determining the geographic location and time of the registered event are solved. Improving the accuracy of parameter estimation from the original records under high noise is therefore an important problem. The main measurement errors arise from external noise, differences between the real and model structures of the medium, imprecise timing at the event epicenter, and instrumental errors. We therefore propose and investigate a posteriori algorithms that are more accurate than known algorithms. They combine a discrete optimization method with a fractal approach for joint detection and estimation of arrival times in quasi-periodic waveform sequences, for geophysical monitoring with improved accuracy. Existing alternative approaches to these problems do not provide the required accuracy. The proposed algorithms are considered for vibration sounding of the Earth during lunar and solar tides, and for monitoring the location of a borehole seismic source during drilling operations.

  19. Grain-size analysis and sediment dynamics of hurricane-induced event beds in a coastal New England pond

    NASA Astrophysics Data System (ADS)

    Castagno, K. A.; Ruehr, S. A.; Donnelly, J. P.; Woodruff, J. D.

    2017-12-01

    Coastal populations have grown increasingly susceptible to the impacts of tropical cyclones as those populations grow in size, wealth, and infrastructure. Changes in tropical cyclone frequency and intensity, augmented by a changing climate, pose an increasing threat of property damage and loss of life. Reconstructions of intense-hurricane landfalls from a series of southeastern New England sediment cores identify a series of events spanning the past 2,000 years. Though the frequency of these landfalls is well constrained, the intensity of these storms, particularly those for which no historical record exists, is not. This study analyzes the grain-size distribution of major storm event beds along a transect of sediment cores from a kettle pond in Falmouth, MA. The grain-size distribution of each event is determined using an image-processing particle size and shape analyzer. The depositional patterns and changes in grain-size distribution in these fine-grained systems may reveal, both spatially and temporally, characteristics of storm intensity and of the nature of sediment deposition. An inverse-modeling technique using this kind of grain-size analysis to determine past storm intensity has been explored in back-barrier lagoon systems in the Caribbean, but limited research has assessed its utility for deposits from back-barrier ponds in the northeastern United States. Increases in hurricane intensity may be closely tied to increases in sea surface temperature. As such, research into these prehistoric intervals of increased frequency and/or intensity provides important insight into the current and future hurricane risks facing coastal communities in New England.

  20. Towards evidence-based management: creating an informative database of nursing-sensitive indicators.

    PubMed

    Patrician, Patricia A; Loan, Lori; McCarthy, Mary; Brosch, Laura R; Davey, Kimberly S

    2010-12-01

    The purpose of this paper is to describe the creation, evolution, and implementation of a database of nursing-sensitive and potentially nursing-sensitive indicators, the Military Nursing Outcomes Database (MilNOD). It discusses data quality, utility, and lessons learned. Prospective data collected each shift include direct staff hours by levels (i.e., registered nurse, other licensed and unlicensed providers), staff categories (i.e., military, civilian, contract, and reservist), patient census, acuity, and admissions, discharges, and transfers. Retrospective adverse event data (falls, medication errors, and needle-stick injuries) were collected from existing records. Annual patient satisfaction, nurse work environment, and pressure ulcer and restraint prevalence surveys were conducted. The MilNOD contains shift level data from 56 units in 13 military hospitals and is used to target areas for managerial and clinical performance improvement. This methodology can be modified for use in other healthcare systems. As standard tools for evidence-based management, databases such as MilNOD allow nurse leaders to track the status of nursing and adverse events in their facilities. No claim to original US government works.

  1. Integration of Laser Scanning and Three-dimensional Models in the Legal Process Following an Industrial Accident.

    PubMed

    Eyre, Matthew; Foster, Patrick; Speake, Georgina; Coggan, John

    2017-09-01

    In order to obtain a deeper understanding of an incident, it needs to be investigated to "peel back the layers" and examine both immediate and underlying failures that contributed to the event itself. One of the key elements of an effective accident investigation is recording the scene for future reference. In recent years, however, there have been major advances in survey technology, which have provided the ability to capture scenes in three dimensions to an unprecedented level of detail using laser scanners. A case study involving a fatal incident was surveyed using three-dimensional laser scanning, and subsequently recreated through virtual and physical models. The created models were then utilized in both the accident investigation and the legal process, to explore the technologies used in this setting. Benefits include explanation of the event and environment, incident reconstruction, preservation of evidence, reduced need for site visits, and testing of theories. Drawbacks include limited technology within courtrooms, confusion caused by models, cost, and issues of personal interpretation and acceptance of the data. Laser scanning surveys can be of considerable use in jury trials, for example when the location lends itself to a high-definition survey, or when an object that has a specific influence on the case must be altered after the accident and therefore needs to be recorded beforehand. However, consideration has to be given to its application and to ensuring a fair trial, with emphasis placed on the facts of the case and personal interpretation controlled.

  2. Climatological determinants of woody cover in Africa.

    PubMed

    Good, Stephen P; Caylor, Kelly K

    2011-03-22

    Determining the factors that influence the distribution of woody vegetation cover and resolving the sensitivity of woody vegetation cover to shifts in environmental forcing are critical steps necessary to predict continental-scale responses of dryland ecosystems to climate change. We use a 6-year satellite data record of fractional woody vegetation cover and an 11-year daily precipitation record to investigate the climatological controls on woody vegetation cover across the African continent. We find that, as opposed to a relationship with only mean annual rainfall, the upper limit of fractional woody vegetation cover is strongly influenced by both the quantity and intensity of rainfall events. Using a set of statistics derived from the seasonal distribution of rainfall, we show that areas with similar seasonal rainfall totals have higher fractional woody cover if the local rainfall climatology consists of frequent, less intense precipitation events. Based on these observations, we develop a generalized response surface between rainfall climatology and maximum woody vegetation cover across the African continent. The normalized local gradient of this response surface is used as an estimator of ecosystem vegetation sensitivity to climatological variation. A comparison between predicted climate sensitivity patterns and observed shifts in both rainfall and vegetation during 2009 reveals both the importance of rainfall climatology in governing how ecosystems respond to interannual fluctuations in climate and the utility of our framework as a means to forecast continental-scale patterns of vegetation shifts in response to future climate change.

  3. Tracking Temporal Hazard in the Human Electroencephalogram Using a Forward Encoding Model

    PubMed Central

    2018-01-01

    Abstract Human observers automatically extract temporal contingencies from the environment and predict the onset of future events. Temporal predictions are modeled by the hazard function, which describes the instantaneous probability for an event to occur given it has not occurred yet. Here, we tackle the question of whether and how the human brain tracks continuous temporal hazard on a moment-to-moment basis, and how flexibly it adjusts to strictly implicit variations in the hazard function. We applied an encoding-model approach to human electroencephalographic data recorded during a pitch-discrimination task, in which we implicitly manipulated temporal predictability of the target tones by varying the interval between cue and target tone (i.e. the foreperiod). Critically, temporal predictability either was driven solely by the passage of time (resulting in a monotonic hazard function) or was modulated to increase at intermediate foreperiods (resulting in a modulated hazard function with a peak at the intermediate foreperiod). Forward-encoding models trained to predict the recorded EEG signal from different temporal hazard functions were able to distinguish between experimental conditions, showing that implicit variations of temporal hazard bear tractable signatures in the human electroencephalogram. Notably, this tracking signal was reconstructed best from the supplementary motor area, underlining this area’s link to cognitive processing of time. Our results underline the relevance of temporal hazard to cognitive processing and show that the predictive accuracy of the encoding-model approach can be utilized to track abstract time-resolved stimuli. PMID:29740594
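    The hazard function that anchors this design follows directly from the foreperiod distribution. A minimal discrete-time sketch (not the study's analysis code):

```python
def hazard(pdf):
    """Discrete-time hazard from an event-time distribution (minimal sketch).

    pdf[t] is the probability that the event occurs in time bin t (the values
    sum to 1); the hazard is h[t] = pdf[t] / (1 - cdf[t-1]), the probability
    of the event occurring at t given that it has not occurred yet.
    """
    h, cum = [], 0.0
    for p in pdf:
        remaining = 1.0 - cum  # probability the event has not yet occurred
        h.append(p / remaining if remaining > 1e-12 else 0.0)
        cum += p
    return h
```

    A uniform foreperiod distribution, for example, yields a monotonically increasing hazard, as in the study's monotonic condition; shifting probability mass toward an intermediate foreperiod produces a modulated hazard with a peak there.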

  4. Increasing frequency and duration of Arctic winter warming events

    NASA Astrophysics Data System (ADS)

    Graham, R. M.; Cohen, L.; Petty, A.; Boisvert, L.; Rinke, A.; Hudson, S. R.; Nicolaus, M.; Granskog, M. A.

    2017-12-01

    Record low Arctic sea ice extents were observed during the last three winter seasons (March). During each of these winters, near-surface air temperatures close to 0°C were observed, in situ, over sea ice in the central Arctic. Recent media reports and scientific studies suggest that such winter warming events were unprecedented for the Arctic. Here we use in situ winter (December-March) temperature observations, such as those from Soviet North Pole drifting stations and ocean buoys, to determine how common Arctic winter warming events are. The earliest record we find of a winter warming event was in March 1896, where a temperature of -3.7˚C was observed at 84˚N during the Fram expedition. Observations of winter warming events exist over most of the Arctic Basin. Despite a limited observational network, temperatures exceeding -5°C were measured in situ during more than 30% of winters from 1954 to 2010, by either North Pole drifting stations or ocean buoys. Correlation coefficients between the atmospheric reanalysis, ERA-Interim, and these in-situ temperature records are shown to be on the order of 0.90. This suggests that ERA-Interim is a suitable tool for studying Arctic winter warming events. Using the ERA-Interim record (1979-2016), we show that the North Pole (NP) region typically experiences 10 warming events (T2m > -10°C) per winter, compared with only five in the Pacific Central Arctic (PCA). We find a positive trend in the overall duration of winter warming events for both the NP region (4.25 days/decade) and PCA (1.16 days/decade), due to an increased number of events of longer duration.
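    The event definition used here, maximal runs of days with T2m above a threshold, is straightforward to operationalize. A minimal sketch assuming one temperature sample per day (not the study's code):

```python
def warming_events(t2m, threshold=-10.0):
    """Winter warming events as maximal runs of warm days (minimal sketch).

    t2m is a daily near-surface temperature series in deg C; an event is a
    maximal run of days with t2m above the threshold. Returns a list of
    (start_index, duration_in_days) pairs.
    """
    events, start = [], None
    for i, t in enumerate(t2m):
        if t > threshold:
            if start is None:
                start = i  # a new warm run begins
        elif start is not None:
            events.append((start, i - start))  # warm run just ended
            start = None
    if start is not None:  # series ended while still warm
        events.append((start, len(t2m) - start))
    return events
```

    Counting events per winter and averaging their durations over the reanalysis record gives the per-decade trends reported above.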

  5. High-Resolution Modeling of ENSO-Induced Precipitation in the Tropical Andes: Implications for Proxy Interpretation.

    NASA Astrophysics Data System (ADS)

    Kiefer, J.; Karamperidou, C.

    2017-12-01

    Clastic sediment flux into high-elevation Andean lakes is controlled by glacial processes and soil erosion caused by high precipitation events, making these lakes suitable archives of past climate. To wit, sediment records from Laguna Pallcacocha in Ecuador have been interpreted as proxies of ENSO variability, owing to increased precipitation in the greater region during El Niño events. However, the location of the lake's watershed, the presence of glaciers, and the different impacts of ENSO on precipitation in the eastern vs western Andes have challenged the suitability of the Pallcacocha record as an ENSO proxy. Here, we employ WRF, a high-resolution regional mesoscale weather prediction model, to investigate the circulation dynamics, sources of moisture, and resulting precipitation response in the L. Pallcacocha region during different flavors of El Niño and La Niña events, and in the presence or absence of ice caps. In particular, we investigate Eastern Pacific (EP), Central Pacific (CP), coastal El Niño, and La Niña events. We validate the model simulations against spatially interpolated station measurements and reanalysis data. We find that during EP events, moisture is primarily advected from the Pacific, whereas during CP events, moisture primarily originates from the Atlantic. More moisture is available during EP events, which implies higher precipitation rates. Furthermore, we find that precipitation during EP events is mostly non-convective in contrast to primarily convective precipitation during CP events. Finally, a synthesis of the sedimentary record and the EP:CP ratio of accumulated precipitation and specific humidity in the L. Pallcacocha region allows us to assess whether past changes in the relative frequency of the two ENSO flavors may have been recorded in paleoclimate archives in this region.

  6. Paroxysmal non-epileptic events in infants and toddlers: A phenomenologic analysis.

    PubMed

    Chen, Li; Knight, Elia M Pestana; Tuxhorn, Ingrid; Shahid, Asim; Lüders, Hans O

    2015-06-01

    The aim of this study was to analyze in detail the clinical phenomenology of paroxysmal non-epileptic events (PNEE) in infants and toddlers. We studied all children aged ≤2 years who were diagnosed with PNEE based on video-electroencephalographic (VEEG) recordings. We analyzed the following four clinical domains of each clinical event: (i) motor manifestations (body/limb jerking, complex motor, and asymmetric limb posturing); (ii) oral/vocal (crying, vocalization, sighing); (iii) behavioral change (arrest of activity, staring); and (iv) autonomic (facial flushing, breath holding). Thirty-one of 81 (38.3%) infants and toddlers had 38 PNEE recorded during the study period (12 girls and 19 boys, mean age 10.5 months). The predominant clinical features were as follows: motor in 26/38 events, oral/vocal in 14/38 events, behavioral in 11/38 events, and autonomic in 8/38 events. Epileptic seizures and PNEE coexisted in four children (12.9%). Seventeen children (54.8%) had one or more risk factors suggestive of epilepsy. Twelve children (38.7%) had a normal neurologic examination, 10 (32.3%) had developmental delay, and eight (25.8%) had a family history of epilepsy or seizures. VEEG recorded PNEE in nearly 40% of the 81 infants and toddlers referred for unclear paroxysmal events in our cohort. Non-epileptic staring spells and benign sleep myoclonus were the most common events recorded, followed by shuddering attacks and infantile masturbation. In addition, more than one-half of the infants and toddlers had risk factors raising a concern for epilepsy in the family and prompting the VEEG evaluation, suggesting that paroxysmal non-epileptic events may frequently coexist in young children with epilepsy. © 2014 The Authors. Psychiatry and Clinical Neurosciences © 2014 Japanese Society of Psychiatry and Neurology.

  7. A Comparison of Reliability Measures for Continuous and Discontinuous Recording Methods: Inflated Agreement Scores with Partial Interval Recording and Momentary Time Sampling for Duration Events

    ERIC Educational Resources Information Center

    Rapp, John T.; Carroll, Regina A.; Stangeland, Lindsay; Swanson, Greg; Higgins, William J.

    2011-01-01

    The authors evaluated the extent to which interobserver agreement (IOA) scores, using the block-by-block method for events scored with continuous duration recording (CDR), were higher when the data from the same sessions were converted to discontinuous methods. Sessions with IOA scores of 89% or less with CDR were rescored using 10-s partial…
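    The block-by-block IOA method named above can be sketched as follows for duration events. The per-second 0/1 encoding and the convention of scoring 1.0 when both observers record zero are assumptions on our part, not the authors' exact procedure:

```python
def block_by_block_ioa(obs1, obs2, block_size=10):
    """Block-by-block interobserver agreement for duration events (sketch).

    obs1 and obs2 are per-second 0/1 vectors (1 = behavior occurring) from
    continuous duration recording by two observers. Within each block, the
    agreement score is the smaller recorded duration divided by the larger
    (1.0 if both are zero); IOA is the mean score across blocks, as a percent.
    """
    assert len(obs1) == len(obs2)
    scores = []
    for start in range(0, len(obs1), block_size):
        d1 = sum(obs1[start:start + block_size])  # observer 1 duration in block
        d2 = sum(obs2[start:start + block_size])  # observer 2 duration in block
        if d1 == 0 and d2 == 0:
            scores.append(1.0)
        else:
            scores.append(min(d1, d2) / max(d1, d2))
    return 100.0 * sum(scores) / len(scores)
```

    Converting the same sessions to discontinuous methods (partial interval recording, momentary time sampling) before scoring agreement is what inflates IOA in the authors' comparison.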

  8. Access and Use: Improving Digital Multimedia Consumer Health Information.

    PubMed

    Thomas, Alex

    2016-01-01

    This project enabled novel organisational insight into the comparative utility of a portfolio of consumer health information content, by measuring patterns of attrition (abandonment) in content use. The project used as a case study the event activity log of a fully automated digital information kiosk, located in a community health facility. Direct measurements of the duration of content use were derived from the user interface activity recorded in the kiosk log, thus avoiding issues in using other approaches to collecting this type of data, such as sampling and observer bias. The distribution patterns of 1,383 durations of observed abandonments of use for twenty-eight discrete modules of health information content were visualised using Kaplan-Meier survival plots. Clear patterns of abandonment of content use were exhibited. The method of analysis is cost-effective, scalable and provides deep insight into the utility of health promotion content. The impact on the content producers, platform operators and service users is to improve organisational learning and thus increase the confidence in stakeholders that the service is continuously delivering high quality health and wellbeing benefits.
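    The survival plots described above rest on the standard Kaplan-Meier estimator. A minimal pure-Python sketch follows; treating abandonment as the event and completed use as censoring is our assumption, not a detail from the paper:

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier survival curve (minimal sketch).

    durations are session durations; observed[i] is True if use was abandoned
    at that time (the event) and False if the session was censored (e.g., the
    content ran to completion). Returns (time, survival probability) pairs.
    """
    # distinct event times, in increasing order
    times = sorted({d for d, o in zip(durations, observed) if o})
    curve, s = [], 1.0
    for t in times:
        at_risk = sum(1 for d in durations if d >= t)
        events = sum(1 for d, o in zip(durations, observed) if o and d == t)
        s *= 1.0 - events / at_risk  # survival drops at each event time
        curve.append((t, s))
    return curve
```

    Plotting one curve per content module makes the attrition patterns directly comparable across the portfolio.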

  9. Perceived ownership impacts reward evaluation within medial-frontal cortex.

    PubMed

    Krigolson, Olave E; Hassall, Cameron D; Balcom, Lynsey; Turk, David

    2013-06-01

    Ownership is a powerful construct. Indeed, in a series of recent studies, perceived ownership has been shown to increase attentional capacity, facilitate a memorial advantage, and elicit positive attitudes. Here, we sought to determine whether self-relevance would bias reward evaluation systems within the brain. To accomplish this, we had participants complete a simple gambling task during which they could "win" or "lose" prizes for themselves or for someone else, while electroencephalographic data were recorded. Our results indicated that the amplitude of the feedback error-related negativity, a component of the event-related brain potential sensitive to reward evaluation, was diminished when participants were not gambling for themselves. Furthermore, our data suggest that the ownership cues that indicated who would win or lose a given gamble either were processed as a potential for an increase in utility (i.e., gain: self-gambles) or were processed in a nonutilitarian manner (other-gambles). Importantly, our results suggest that the medial-frontal reward system is sensitive to perceived ownership, to the extent that it may not process changes in utility when they are not directly relevant to self.

  10. Cognitive complexity of the medical record is a risk factor for major adverse events.

    PubMed

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
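    The paper does not publish the Complexity Ruler itself. As a loose, hypothetical illustration of measuring a record's data content in bits, one crude stand-in is compressed size, which discounts repeated boilerplate rather than counting raw characters:

```python
import zlib

def ccmr_bits(record_text):
    """Crude stand-in for measuring medical-record data content in bits.

    This is an illustration only, not the Complexity Ruler: the compressed
    size of the record text approximates its information content, since
    templated boilerplate that repeats across notes compresses away.
    """
    return 8 * len(zlib.compress(record_text.encode("utf-8")))
```

    In the study's terms, the measured value over a 24-hour or lifetime window would then be compared against empirically derived risk cutoffs.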

  11. Extreme event statistics in a drifting Markov chain

    NASA Astrophysics Data System (ADS)

    Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur

    2017-07-01

    We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
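    Counting record values in a drifting trajectory, as analyzed above, is straightforward. A minimal sketch with Gaussian steps and an illustrative drift (a stand-in for the experimental atomic traces, not the authors' analysis code):

```python
import random

def count_records(traj):
    """Number of upper record values in a trajectory: position i is a record
    if its value exceeds every earlier value (minimal sketch)."""
    records, best = 0, float("-inf")
    for x in traj:
        if x > best:
            records, best = records + 1, x
    return records

def random_walk(n, drift=0.0):
    """A Markov chain: a random walk with unit Gaussian steps plus a
    constant drift per step (illustrative parameters)."""
    x, traj = 0.0, []
    for _ in range(n):
        x += drift + random.gauss(0.0, 1.0)
        traj.append(x)
    return traj
```

    For a symmetric drift-free walk the expected number of records after n steps grows roughly like sqrt(4n/pi), so even a small positive drift shows up as an inflated record count, consistent with the sensitivity to drift reported above.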

  12. 49 CFR 563.11 - Information in owner's manual.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... law enforcement, could combine the EDR data with the type of personally identifying data routinely... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.11 Information in owner's... statement in English: This vehicle is equipped with an event data recorder (EDR). The main purpose of an EDR...

  13. 49 CFR 563.11 - Information in owner's manual.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... law enforcement, could combine the EDR data with the type of personally identifying data routinely... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.11 Information in owner's... statement in English: This vehicle is equipped with an event data recorder (EDR). The main purpose of an EDR...

  14. Development of Advanced Propagation Models and Application to the Study of Impulsive Infrasonic Events

    DTIC Science & Technology

    2007-09-01

    waveforms recorded at St. George, Utah, from the Texarkana event. Figure 6. Recorded infrasound waveforms at one of the SGAR array elements...along with its spectrogram, from the Texarkana underground nuclear explosion of February 10, 1989. Preliminary Analysis of Waveform Parameters Related

  15. 49 CFR 563.9 - Data capture.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Data capture. 563.9 Section 563.9 Transportation..., DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.9 Data capture. The EDR must capture and record the data elements for events in accordance with the following conditions and circumstances: (a) In a...

  16. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an event definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device called a posturograph is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
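
    The notion of events as regions of interest in a sensor time series can be illustrated with a deliberately simple threshold-and-duration detector; this is a stand-in for the idea, not the paper's definition language:

```python
def find_events(series, threshold, min_len=3):
    """Locate events: maximal runs of consecutive samples whose
    absolute value exceeds `threshold`, lasting at least `min_len`
    samples. Returns (start, end) index pairs, end exclusive."""
    events, start = [], None
    for i, x in enumerate(series):
        if abs(x) > threshold:
            if start is None:
                start = i
        elif start is not None:
            if i - start >= min_len:
                events.append((start, i))
            start = None
    if start is not None and len(series) - start >= min_len:
        events.append((start, len(series)))
    return events

# e.g. a posturograph pressure channel with two bursts of activity
signal = [0.1, 0.0, 2.5, 2.7, 2.6, 0.2, 0.1, 3.1, 3.0, 3.2, 3.3, 0.0]
print(find_events(signal, threshold=1.0))  # → [(2, 5), (7, 11)]
```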

  17. Geochemical Records of Bleaching Events and the Associated Stressors From the Great Barrier Reef

    NASA Astrophysics Data System (ADS)

    Roark, E. B.; McCulloch, M.; Ingram, B. L.; Marshall, J. F.

    2003-12-01

    The health of coral reefs world-wide is increasingly threatened by a wide array of stressors. On the Great Barrier Reef (GBR) these stressors include increased sediment flux associated with land use changes, increased sea surface temperatures (SST) and salinity changes due to large floods, the latter two of which are factors in an increased number of bleaching events. The ability to document long-term change in these stressors, along with changes in the number of bleaching events, would help discern what are natural and anthropogenic changes in this ecosystem. Here we present results of an initial calibration effort aimed at identifying bleaching events and the associated stressors using stable isotopic and trace element analysis in coral cores. Three ˜15-year time series of geochemical measurements (δ13C, δ18O, and Sr/Ca) on Porites coral cores obtained from Pandora Reef and the Keppel Islands on the GBR have been developed at near weekly resolution. Since the δ13C of the coral skeletal carbonate is known to be affected by both environmental factors (e.g., insolation and temperature) and physiological factors (e.g., photosynthesis, calcification, and the status of the symbiotic relationship between corals and zooxanthellae), it is the most promising proxy for reconstructing past bleaching events. The first record (PAN-98) comes from a coral head that had undergone bleaching and died shortly after the large-scale bleaching events on Pandora Reef in 1998. A second core (PAN-02) was collected from a living coral within 10 m of PAN-98 in 2002. Sr/Ca ratios in both cores tracked even the smallest details of an in situ SST record. The increase in SST that occurred three to four weeks prior to bleaching was faithfully recorded by a similar decrease in the Sr/Ca ratio in PAN-98, indicating that calcification continued despite the high SST of 30-31 °C. The δ13C values decreased by about 5‰ one week after the SST increase, and remained at this value for about 4 weeks until the coral died. In 1994 and 1995, there are decreases in the δ13C values of 3‰. In 1994, a flood plume from the Burdekin River reached Pandora Reef and bleaching was reported. In 1995 we note a 4-5 week period of elevated SST based on the Sr/Ca results, which may have been sufficient to cause stress or bleaching of the coral. No clear decreases in δ13C values associated with any bleaching event were evident in the PAN-02 record; however, there is a clear growth hiatus that lasted several months during the 1998 bleaching event. δ18O results in both records show many of the same details as the Sr/Ca and SST records, suggesting temperature changes as the dominant control. However, during flooding events (1996, 1997, and 1998), the δ18O values were decreased by increased freshwater input to the reef. The associated salinity changes were determined by subtracting the temperature component from the δ18O signal using Sr/Ca ratios and compared with the weekly average flow records from the Burdekin River and a Ba/Ca record (McCulloch et al. 2003) of sediment flux to the reef. Similar results were obtained in a third record from the Keppel Islands, which included one of the largest floods of the century and a bleaching event in 1991.

  18. Heinrich events and sea level changes: records from uplifted coral terraces and marginal seas

    NASA Astrophysics Data System (ADS)

    Yokoyama, Y.; Esat, T. M.; Suga, H.; Obrochta, S.; Ohkouchi, N.

    2017-12-01

    Repeated major ice discharge events spaced every ca. 7,000 years during the last ice age were first detected in deep sea sediments from the North Atlantic. Characterized as lithic layers, these Heinrich Events (Heinrich, 1988 QR) correspond to rapid climate changes attributed to weakened ocean circulation (e.g., Broecker, 1994 Nature; Alley, 1998 Nature), as shown by a number of different proxies. A better understanding of the overall picture of Heinrich events would benefit from determining the total amount of ice involved in each event, which is still under debate. Sea level records are the most direct means for that, and uranium series dated corals can constrain the timing precisely. However, averaged global sea level during the time of interest was around -70 m, hindering study from tectonically stable regions. Using uplifted coral terraces that extend 80 km along the Huon Peninsula, Papua New Guinea, the magnitude of sea level change during Heinrich Events was successfully reconstructed (Yokoyama et al., 2001 EPSL; Chappell et al., 1996 EPSL; Cutler et al., 2003). The H3 and H5 events are also well correlated with continuous sea level reconstructions using Red Sea oxygen isotope records (Siddall et al., 2003 Nature; Yokoyama and Esat, 2011 Oceanography). Global ice sheet growth after 30 ka complicates interpretation of the Huon Peninsula record. However, oxygen isotope data from the Japan Sea, a restricted marginal sea with a shallow sill depth similar to the Red Sea, clearly capture the episode of H2 sea level change. The timing of these sea level excursions correlates well with the DSDP Site 609 detrital layers that are anchored in the latest Greenland ice core chronology (Obrochta et al., 2012 QSR). In the presentation, Antarctic ice sheet behavior during the H2 event will also be discussed using marginal sea oxygen isotope records.

  19. External prolonged electrocardiogram monitoring in unexplained syncope and palpitations: results of the SYNARR-Flash study.

    PubMed

    Locati, E T; Moya, A; Oliveira, M; Tanner, H; Willems, R; Lunati, M; Brignole, M

    2016-08-01

    SYNARR-Flash study (Monitoring of SYNcopes and/or sustained palpitations of suspected ARRhythmic origin) is an international, multicentre, observational, prospective trial designed to evaluate the role of external 4-week electrocardiogram (ECG) monitoring in the clinical work-up of unexplained syncope and/or sustained palpitations of suspected arrhythmic origin. Consecutive patients were enrolled within 1 month after unexplained syncope or palpitations (index event) after being discharged from the emergency room or hospitalization without a conclusive diagnosis. A 4-week ECG monitoring was obtained by an external high-capacity loop recorder (SpiderFlash-T(®), Sorin) storing patient-activated and auto-triggered tracings. Diagnostic monitoring outcomes included (i) conclusive events, with recurrence of syncope or palpitation with concomitant ECG recording (with/without arrhythmias), and (ii) events with asymptomatic predefined significant arrhythmias (sustained supraventricular or ventricular tachycardia, advanced atrio-ventricular block, sinus bradycardia <30 b.p.m., pauses >6 s). The SYNARR-Flash study enrolled 395 patients (57.7% females, 56.9 ± 18.7 years, 28.1% with syncope, and 71.9% with palpitations) from 10 European centres. For syncope, the 4-week diagnostic yield was 24.5%, and predictors of diagnostic events were early start of recording (0-15 vs. >15 days after the index event) (OR 6.2, 95% CI 1.3-29.6, P = 0.021) and a previous history of supraventricular arrhythmias (OR 3.6, 95% CI 1.4-9.7, P = 0.018). For palpitations, the 4-week diagnostic yield was 71.6%, and predictors of diagnostic events were a history of recurrent palpitations (P < 0.001) and early start of recording (P = 0.001). The 4-week external ECG monitoring can be considered a first-line tool in the diagnostic work-up of syncope and palpitations. Early recorder use, a history of supraventricular arrhythmia, and frequent previous events increased the likelihood of diagnostic events during the 4-week external ECG monitoring. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Cardiology.

  20. Watershed erosion estimated from a high-resolution sediment core reveals a non-stationary frequency-magnitude relationship and importance of seasonal climate drivers

    NASA Astrophysics Data System (ADS)

    Gavin, D. G.; Colombaroli, D.; Morey, A. E.

    2015-12-01

    The inclusion of paleo-flood events greatly affects estimates of peak magnitudes (e.g., Q100) in flood-frequency analysis. Likewise, peak events are also associated with certain synoptic climatic patterns that vary on all time scales. Geologic records preserved in lake sediments have the potential to capture the non-stationarity in frequency-magnitude relationships, but few such records preserve a continuous history of event magnitudes. We present a 10-meter, 2000-yr record from Upper Squaw Lake, Oregon, that contains finely laminated silt layers that reflect landscape erosion events from the 40 km2 watershed. CT scans of the core (<1 mm resolution) and a 14C-dated chronology yielded a pseudo-annual time series of erosion magnitudes. The most recent 80 years of the record correlate strongly with annual peak stream discharge and road construction. We examined the frequency-magnitude relationship for the entire pre-road period and show that the seven largest events fall above a strongly linear relationship, suggesting a distinct process (e.g., severe fires or earthquakes) operating at low frequency to generate large-magnitude events. Expressing the record as cumulative sediment accumulation anomalies showed the importance of the large events in "returning the system" to the long-term mean rate. Applying frequency-magnitude analysis in a moving window showed that the Q100 and Q10 of watershed erosion varied by 1.7 and 1.0 orders of magnitude, respectively. The variations in watershed erosion are weakly correlated with temperature and precipitation reconstructions at the decadal to centennial scale. This suggests that dynamics both internal (i.e., sediment production) and external (i.e., earthquakes) to the system, as well as more stochastic events (i.e., single severe wildfires), can at least partially override external climate forcing of watershed erosion at decadal to centennial time scales.
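
    The return-level idea behind quantities such as Q100 and Q10 can be sketched with Weibull plotting positions; this is a textbook simplification under an annual-maximum assumption, not the study's moving-window method:

```python
def return_level(magnitudes, return_period, record_years):
    """Estimate the event magnitude with a given return period
    (e.g. Q100) from a series of event magnitudes using Weibull
    plotting positions: the event of rank m (largest = 1) in a
    record of N years has return period T = (N + 1) / m."""
    ranked = sorted(magnitudes, reverse=True)
    for m, _ in enumerate(ranked, start=1):
        if (record_years + 1) / m < return_period:
            # the previous rank was the last one meeting the period
            return ranked[m - 2] if m > 1 else None
    return ranked[-1]

# synthetic 2000-yr series of annual-maximum erosion magnitudes
import random
rng = random.Random(1)
series = [rng.paretovariate(2.0) for _ in range(2000)]
q100 = return_level(series, 100, 2000)
q10 = return_level(series, 10, 2000)
assert q100 > q10  # rarer events have larger magnitudes
```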

  1. 6-kyr record of flood frequency and intensity in the western Mediterranean Alps - Interplay of solar and temperature forcing

    NASA Astrophysics Data System (ADS)

    Sabatier, Pierre; Wilhelm, Bruno; Ficetola, Gentile Francesco; Moiroux, Fanny; Poulenard, Jérôme; Develle, Anne-Lise; Bichet, Adeline; Chen, Wentao; Pignol, Cécile; Reyss, Jean-Louis; Gielly, Ludovic; Bajard, Manon; Perrette, Yves; Malet, Emmanuel; Taberlet, Pierre; Arnaud, Fabien

    2017-08-01

    The high-resolution sedimentological and geochemical analysis of a sediment sequence from Lake Savine (Western Mediterranean Alps, France) led to the identification of 220 event layers over the last 6000 years: 200 were triggered by flood events and 20 by underwater mass movements, possibly related to earthquakes, that occurred in 5 clusters of increased seismicity. Because human activity could influence the flood chronicle, the presence of pastures was reconstructed through ancient DNA, which suggested that the flood chronicle was mainly driven by hydroclimate variability. Weather reanalysis of historical floods identifies mesoscale precipitation events called "East Return" events as the main triggers of floods recorded in Lake Savine. The first part of this palaeoflood record (6-4 kyr BP) was characterized by increases in flood frequency and intensity in phase with Northern Alpine palaeoflood records. By contrast, the second part of the record (i.e., since 4 kyr BP) was in phase with Southern Alpine palaeoflood records. These results suggest a palaeohydrological transition at approximately 4 kyr BP, as has been previously described for the Mediterranean region. This may have resulted in a change of flood-prone hydro-meteorological processes, i.e., in the balance between the occurrence and intensity of local convective climatic phenomena and their influence on Mediterranean mesoscale precipitation events in this part of the Alps. At a centennial timescale, increases in flood frequency and intensity corresponded to periods of solar minima, affecting climate through atmospheric changes in the Euro-Atlantic sector.

  2. Use of a handheld computer application for voluntary medication event reporting by inpatient nurses and physicians.

    PubMed

    Dollarhide, Adrian W; Rutledge, Thomas; Weinger, Matthew B; Dresselhaus, Timothy R

    2008-04-01

    To determine the feasibility of capturing self-reported medication events using a handheld computer-based Medication Event Reporting Tool (MERT). Handheld computers operating the MERT software application were deployed among volunteer physician (n = 185) and nurse (n = 119) participants on the medical wards of four university-affiliated teaching hospitals. Participants were encouraged to complete confidential reports on the handheld computers for medication events observed during the study period. Demographic variables including age, gender, education level, and clinical experience were recorded for all participants. Each MERT report included details on the provider, location, timing and type of medication event recorded. Over the course of 2,311 days of clinician participation, 76 events were reported; the median time for report completion was 231 seconds. The average event reporting rate for all participants was 0.033 reports per clinician shift. Nurses had a significantly higher reporting rate compared to physicians (0.045 vs 0.026 reports/shift, p = .02). Subgroup analysis revealed that attending physicians reported events more frequently than resident physicians (0.042 vs 0.021 reports/shift, p = .03), and at a rate similar to that of nurses (p = .80). Only 5% of MERT medication events were reported to require increased monitoring or treatment. A handheld-based event reporting tool is a feasible method to record medication events in inpatient hospital care units. Handheld reporting tools may hold promise to augment existing hospital reporting systems.

  3. Health care service utilization and associated factors among heroin users in northern Taiwan.

    PubMed

    Chen, Yi-Chih; Chen, Chih-Ken; Lin, Shih-Ku; Chiang, Shu-Chuan; Su, Lien-Wen; Wang, Liang-Jen

    2013-11-01

    Because of their medical care needs, heroin users have a high probability of using health care services. This cross-sectional study investigated the frequency and correlates of health service utilization among heroin users. From June to September 2006, 124 heroin users (110 males and 14 females, mean age: 34.2 ± 8.3 years) who entered two psychiatric hospitals (N = 83) and a detention center (N = 41) in northern Taiwan received a face-to-face interview. Socio-demographic characteristics, patterns of drug use, psychiatric comorbidities, blood-borne infectious diseases and health service utilization were recorded. Health service utilization was classified into the frequency of out-patient department visits and hospitalizations, as well as the purchase of over-the-counter drugs. During the 12 months prior to interview, 79.8% of the participants attended health care services at least once. The rates of any out-patient service visit, hospitalization, and over-the-counter drug purchase were 66.1%, 29.8% and 25.8%, respectively. The frequency of health service utilization was associated with numerous factors. Among these factors, recruitment from a hospital and having a mood disorder were conjoint predictors of out-patient department visits, hospitalization and purchase of over-the-counter drugs. According to the results of this study, social education and routine screening for mood disorders can help heroin users to obtain adequate health care services. The findings of this study are useful references for targeting the heroin users for whom a successful intervention represents the greatest cost benefit. © 2013 Elsevier Ltd. All rights reserved.

  4. Motivation and intention to integrate physical activity into daily school life: the JAM World Record event.

    PubMed

    Vazou, Spyridoula; Vlachopoulos, Symeon P

    2014-11-01

    Research on the motivation of stakeholders to integrate physical activity into daily school life is limited. The purpose was to examine the motivation of stakeholders to participate in a world record physical activity event and whether motivation was associated with future intention to use activity breaks during the daily school life and future participation in a similar event. After the 2012 JAM (Just-a-Minute) World Record event, 686 adults (591 women; 76.1% participated for children <10 years) completed measures of motivational regulations and future intention to (a) use the activity breaks and (b) participate in the event. High intrinsic motivation and low extrinsic motivation and amotivation for participation in the next event were reported. Hierarchical regression analysis, controlling for age, gender, and occupation, showed that intrinsic forms of motivation positively predicted, whereas amotivation negatively predicted, future intention to participate in the event and use the activity breaks. Multivariate analyses of variance revealed that school-related participants were more intrinsically motivated and intended to use the activity breaks and repeat the event more than those who were not affiliated with a school. Nonschool participants reported higher extrinsic motivation and amotivation than school-related participants. © 2014 Society for Public Health Education.

  5. SCADA data and the quantification of hazardous events for QMRA.

    PubMed

    Nilsson, P; Roser, D; Thorwaldsdotter, R; Petterson, S; Davies, C; Signor, R; Bergstedt, O; Ashbolt, N

    2007-01-01

    The objective of this study was to assess the use of on-line monitoring to support the QMRA at water treatment plants studied in the EU MicroRisk project. SCADA data were obtained from three Catchment-to-Tap Systems (CTS) along with system descriptions, diary records, grab sample data and deviation reports. Particular attention was paid to estimating hazardous event frequency, duration and magnitude. Using Shewhart and CUSUM control charts, we identified 'change-points' corresponding to events of between 10 min and >1 month duration in time series data. Our analysis confirmed it is possible to quantify hazardous event durations from turbidity, chlorine residual and pH records and distinguish them from non-hazardous variability in the time series dataset. The durations of most 'events' were short-term (0.5-2.3 h). These data were combined with QMRA to estimate the pathogen infection risk arising from events such as chlorination failure. While analysis of SCADA data alone could identify events provisionally, its interpretation was severely constrained in the absence of diary records and other system information. SCADA data analysis should only complement traditional water sampling, rather than replace it. More work on on-line data management, quality control and interpretation is needed before it can be used routinely for event characterization.
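
    A one-sided CUSUM change-point detector of the kind used in such studies can be sketched as follows; the target, slack, and threshold values here are illustrative, not taken from the paper:

```python
def cusum_alarm(samples, target, k, h):
    """One-sided CUSUM on a monitored quantity (e.g. chlorine
    residual). Accumulates deviations below the in-control `target`
    beyond slack `k`, and raises an alarm when the cumulative sum
    crosses threshold `h` (downward-shift form)."""
    s, alarms = 0.0, []
    for i, x in enumerate(samples):
        s = max(0.0, s + (target - x) - k)
        if s > h:
            alarms.append(i)
            s = 0.0  # reset after each alarm
    return alarms

# chlorine residual (mg/L): stable, then a short dosing failure
readings = [0.5, 0.52, 0.49, 0.51, 0.5, 0.1, 0.08, 0.12, 0.09, 0.5, 0.51]
print(cusum_alarm(readings, target=0.5, k=0.05, h=0.5))  # → [6, 8]
```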

  6. Emergence of heat extremes attributable to anthropogenic influences

    NASA Astrophysics Data System (ADS)

    King, Andrew D.; Black, Mitchell T.; Min, Seung-Ki; Fischer, Erich M.; Mitchell, Daniel M.; Harrington, Luke J.; Perkins-Kirkpatrick, Sarah E.

    2016-04-01

    Climate scientists have demonstrated that a substantial fraction of the probability of numerous recent extreme events may be attributed to human-induced climate change. However, it is likely that for temperature extremes occurring over previous decades a fraction of their probability was also attributable to anthropogenic influences. We identify the first record-breaking warm summers and years for which a discernible contribution can be attributed to human influence. We find a significant human contribution to the probability of record-breaking global temperature events as early as the 1930s. Since then, the last 16 record-breaking hot years globally all had an anthropogenic contribution to their probability of occurrence. Aerosol-induced cooling delays the timing of a significant human contribution to record-breaking events in some regions. Without human-induced climate change, recent hot summers and years would have been very unlikely to occur.

  7. Stochastic filtering for damage identification through nonlinear structural finite element model updating

    NASA Astrophysics Data System (ADS)

    Astroza, Rodrigo; Ebrahimian, Hamed; Conte, Joel P.

    2015-03-01

    This paper describes a novel framework that combines advanced mechanics-based nonlinear (hysteretic) finite element (FE) models and stochastic filtering techniques to estimate unknown time-invariant parameters of nonlinear inelastic material models used in the FE model. Using input-output data recorded during earthquake events, the proposed framework updates the nonlinear FE model of the structure. The updated FE model can be directly used for damage identification and further used for damage prognosis. To update the unknown time-invariant parameters of the FE model, two alternative stochastic filtering methods are used: the extended Kalman filter (EKF) and the unscented Kalman filter (UKF). A three-dimensional, 5-story, 2-by-1 bay reinforced concrete (RC) frame is used to verify the proposed framework. The RC frame is modeled using fiber-section displacement-based beam-column elements with distributed plasticity and is subjected to the ground motion recorded at the Sylmar station during the 1994 Northridge earthquake. The results indicate that the proposed framework accurately estimates the unknown material parameters of the nonlinear FE model. The UKF outperforms the EKF when the relative root-mean-square errors of the recorded responses are compared. In addition, the results suggest that the convergence of the estimates of the modeling parameters is smoother and faster when the UKF is utilized.
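
    The parameter-updating idea behind the framework can be reduced to a scalar sketch: an EKF estimating a single time-invariant "stiffness" parameter from noisy input-output data. The paper uses full nonlinear FE models; this toy linear measurement model is only illustrative:

```python
import random

def ekf_parameter_update(theta, P, u, y, Q, R):
    """One EKF step for a time-invariant parameter `theta` modeled
    as a random walk, with (linearized) measurement y = theta*u + v.
    The measurement Jacobian is H = d(theta*u)/d(theta) = u."""
    P = P + Q                      # predict: parameter random walk
    H = u                          # measurement Jacobian
    S = H * P * H + R              # innovation covariance
    K = P * H / S                  # Kalman gain
    theta = theta + K * (y - theta * u)
    P = (1 - K * H) * P
    return theta, P

# recover a true "stiffness" of 2.5 from noisy response measurements
rng = random.Random(0)
theta, P = 1.0, 1.0                # poor initial guess, wide covariance
for _ in range(300):
    u = rng.uniform(-1, 1)         # excitation
    y = 2.5 * u + rng.gauss(0, 0.1)  # noisy measured response
    theta, P = ekf_parameter_update(theta, P, u, y, Q=1e-6, R=0.01)
assert abs(theta - 2.5) < 0.1
```

The UKF replaces the Jacobian `H` with sigma-point propagation, which is what gives it the edge for strongly nonlinear (hysteretic) models.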

  8. Precision Monitoring of Water Level in a Salt Marsh with Low Cost Tilt Loggers

    NASA Astrophysics Data System (ADS)

    Sheremet, Vitalii A.; Mora, Jordan W.

    2016-04-01

    Several salt pannes and pools in the Sage Lot tidal marsh of the Waquoit Bay system, MA, were instrumented with newly developed Arm-and-Float water level gauges (utilizing an accelerometer tilt logger), permitting water level fluctuations to be recorded with an accuracy of 1 mm and submillimeter resolution. The methodology of the instrument calibration, deployment, and elevation control is described, and the instrument performance is evaluated. Deployments of several months allowed us to analyze the marsh flooding and draining processes and to study differences among the salt pannes. The open channel flow flooding-draining mechanism and slower seepage were distinguished. From the drain curve, the seepage rate can be quantified. The seepage rate remains approximately constant across all flooding-draining episodes but varies from panne to panne depending on bottom type and location. Seasonal differences due to the growth of vegetation are also recorded. The analysis of rain events allows us to estimate the catch area of subbasins in the marsh. The implications for marsh ecology and marsh accretion are discussed. The gradual sea level rise, coupled with monthly tidal datum variability and storm surges, results in migration and development of a salt marsh. The newly developed low cost instrumentation allows us to record and analyze these changes and may provide guidance for ecological management.
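
    The arm-and-float principle can be sketched as a tilt-to-level conversion: the float rides the water surface, the arm pivots, and the logger records the tilt angle via gravity components. The geometry below (arm length, pivot height, axis convention) is assumed for illustration, not taken from the paper:

```python
import math

def water_level_from_tilt(ax, az, arm_length_m, pivot_height_m):
    """Convert a float-arm accelerometer reading to water level.
    `ax` and `az` are the gravity components along and perpendicular
    to the arm, so tilt = atan2(ax, az); the float sits at height
    pivot_height + L * sin(tilt). Assumed geometry, illustrative."""
    tilt = math.atan2(ax, az)
    return pivot_height_m + arm_length_m * math.sin(tilt)

# 0.5 m arm, pivot 0.2 m above datum, arm tilted 45 degrees
level = water_level_from_tilt(0.707, 0.707, 0.5, 0.2)
assert abs(level - (0.2 + 0.5 * math.sin(math.pi / 4))) < 1e-9
```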

  9. Examining the utility of coral Ba/Ca as a proxy for river discharge and hydroclimate variability at Coiba Island, Gulf of Chiriquí, Panamá.

    PubMed

    Brenner, Logan D; Linsley, Braddock K; Dunbar, Robert B

    2017-05-15

    Panamá's extreme hydroclimate seasonality is driven by Intertropical Convergence Zone rainfall and resulting runoff. River discharge (Q) carries terrestrially-derived barium to coastal waters, where it can be recorded in coral. We present a Ba/Ca record (1996-1917) generated from a Porites coral colony in the Gulf of Chiriquí near Coiba Island (Panamá) to understand regional hydroclimate. Here coral Ba/Ca is correlated to instrumental Q (R=0.67, p<0.001), producing a seasonally-resolved Reduced Major Axis regression of Ba/Ca (μmol/mol) = 0.006 ± 0.001 (μmol/mol)(m³/s)⁻¹ × Q (m³/s) + 4.579 ± 0.151. Our results support work in the neighboring Gulf of Panamá that determined that seawater Ba/Ca, controlled by Q, is correlated to coral Ba/Ca (LaVigne et al., 2016). Additionally, the Coiba coral Ba/Ca record captures at least 5 El Niño events and identifies 22 of the 37 wet seasons with below average precipitation. These data corroborate the Q proxy and provide insight into the use of coral Ba/Ca as an El Niño and drought indicator. Copyright © 2017 Elsevier Ltd. All rights reserved.
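
    The reported calibration can be inverted to estimate discharge from a measured coral Ba/Ca value; this sketch uses the central slope and intercept and ignores their stated uncertainties:

```python
def discharge_from_ba_ca(ba_ca_umol_mol):
    """Invert the reported reduced-major-axis calibration
    Ba/Ca (umol/mol) = 0.006 * Q (m^3/s) + 4.579
    to estimate river discharge Q from coral Ba/Ca.
    Slope (+/-0.001) and intercept (+/-0.151) errors are ignored."""
    return (ba_ca_umol_mol - 4.579) / 0.006

# a Ba/Ca of 5.479 umol/mol implies roughly 150 m^3/s of discharge
q = discharge_from_ba_ca(5.479)
assert abs(q - 150.0) < 1e-6
```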

  10. Epileptic Seizure Detection with Log-Euclidean Gaussian Kernel-Based Sparse Representation.

    PubMed

    Yuan, Shasha; Zhou, Weidong; Wu, Qi; Zhang, Yanli

    2016-05-01

    Epileptic seizure detection plays an important role in the diagnosis of epilepsy and in reducing the massive workload of reviewing electroencephalography (EEG) recordings. In this work, a novel algorithm is developed to detect seizures employing log-Euclidean Gaussian kernel-based sparse representation (SR) in long-term EEG recordings. Unlike the traditional SR for vector data in Euclidean space, the log-Euclidean Gaussian kernel-based SR framework is proposed for seizure detection in the space of symmetric positive definite (SPD) matrices, which form a Riemannian manifold. Since the Riemannian manifold is nonlinear, the log-Euclidean Gaussian kernel function is applied to embed it into a reproducing kernel Hilbert space (RKHS) for performing SR. The EEG signals of all channels are divided into epochs, and the SPD matrices representing EEG epochs are generated by covariance descriptors. Then, the testing samples are sparsely coded over the dictionary composed of training samples utilizing log-Euclidean Gaussian kernel-based SR. The classification of testing samples is achieved by computing the minimal reconstruction residuals. The proposed method is evaluated on the Freiburg EEG dataset of 21 patients and shows notable performance on both epoch-based and event-based assessments. Moreover, this method handles multiple channels of EEG recordings synchronously, which is faster and more efficient than traditional seizure detection methods.
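
    The log-Euclidean Gaussian kernel at the core of the method, k(X, Y) = exp(-γ‖logm(X) − logm(Y)‖²_F), can be illustrated for 2x2 SPD matrices, where the matrix logarithm has a closed-form eigendecomposition. Real EEG covariance descriptors are larger, and this sketch covers only the kernel, not the sparse coding step:

```python
import math

def logm_spd2(a, b, c):
    """Matrix logarithm of the 2x2 SPD matrix [[a, b], [b, c]],
    returned as its (a, b, c) entries, via eigendecomposition."""
    if abs(b) < 1e-15:                 # diagonal matrix
        return (math.log(a), 0.0, math.log(c))
    mean = (a + c) / 2.0
    d = math.sqrt(((a - c) / 2.0) ** 2 + b * b)
    l1, l2 = mean + d, mean - d        # eigenvalues (both > 0 for SPD)
    vx, vy = b, l1 - a                 # eigenvector for l1
    n = math.hypot(vx, vy)
    vx, vy = vx / n, vy / n
    g1, g2 = math.log(l1), math.log(l2)
    # reconstruct V diag(log eigenvalues) V^T
    la = g1 * vx * vx + g2 * vy * vy
    lb = (g1 - g2) * vx * vy
    lc = g1 * vy * vy + g2 * vx * vx
    return (la, lb, lc)

def log_euclidean_gaussian_kernel(X, Y, gamma=1.0):
    """k(X, Y) = exp(-gamma * ||logm(X) - logm(Y)||_F^2) for 2x2 SPD
    matrices given as (a, b, c) = [[a, b], [b, c]]. This embeds the
    SPD manifold into an RKHS, as in the seizure detector."""
    la, lb, lc = logm_spd2(*X)
    ma, mb, mc = logm_spd2(*Y)
    d2 = (la - ma) ** 2 + 2 * (lb - mb) ** 2 + (lc - mc) ** 2
    return math.exp(-gamma * d2)

# identical covariance matrices have kernel value 1; distinct ones, less
assert abs(log_euclidean_gaussian_kernel((2.0, 0.3, 1.0), (2.0, 0.3, 1.0)) - 1.0) < 1e-12
assert log_euclidean_gaussian_kernel((2.0, 0.3, 1.0), (1.0, 0.0, 1.0)) < 1.0
```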

  11. Dynamic Statistical Characterization of Variation in Source Processes of Microseismic Events

    NASA Astrophysics Data System (ADS)

    Smith-Boughner, L.; Viegas, G. F.; Urbancic, T.; Baig, A. M.

    2015-12-01

    During a hydraulic fracture, water is pumped at high pressure into a formation. A proppant, typically sand, is later injected in the hope that it will make its way into a fracture, keep it open, and provide a path for the hydrocarbon to enter the well. This injection can create micro-earthquakes, generated by deformation within the reservoir during treatment. When these injections are monitored, thousands of microseismic events are recorded within several hundred cubic meters. For each well-located event, many source parameters are estimated, e.g., stress drop, Savage-Wood efficiency and apparent stress. However, because we are evaluating outputs from a power-law process, the extent to which the failure is impacted by fluid injection or stress triggering is not immediately clear. To better detect differences in source processes, we use a set of dynamic statistical parameters which characterize various force balance assumptions using the average distance to the nearest event, event rate, volume enclosed by the events, and cumulative moment and energy from a group of events. One parameter, the Fracability index, approximates the ratio of viscous to elastic forcing and highlights differences in the response time of a rock to changes in stress. These dynamic parameters are applied to a database of more than 90,000 events in a shale-gas play in the Horn River Basin to characterize spatial-temporal variations in the source processes. In order to resolve these differences, a moving window, nearest neighbour approach was used. First, the center of mass of the local distribution was estimated for several source parameters. Then, a set of dynamic parameters, which characterize the response of the rock, were estimated. These techniques reveal changes in seismic efficiency and apparent stress that often coincide with marked changes in the Fracability index and other dynamic statistical parameters. Utilizing these approaches allowed for the characterization of fluid-injection-related processes.

  12. Adverse Event Rates Associated with Transforaminal and Interlaminar Epidural Steroid Injections: A Multi-Institutional Study.

    PubMed

    El-Yahchouchi, Christine A; Plastaras, Christopher T; Maus, Timothy P; Carr, Carrie M; McCormick, Zachary L; Geske, Jennifer R; Smuck, Matthew; Pingree, Matthew J; Kennedy, David J

    2016-02-01

    Transforaminal epidural steroid injections (TFESI) have demonstrated efficacy and effectiveness in the treatment of radicular pain. Despite little evidence of efficacy/effectiveness, interlaminar epidural steroid injections (ILESI) are advocated by some as primary therapy for radicular pain due to purported greater safety. To assess immediate and delayed adverse event rates of TFESI and ILESI at three academic medical centers utilizing International Spine Intervention Society practice guidelines. Quality assurance databases from one radiology and two physical medicine and rehabilitation (PM&R) practices were interrogated. Medical records were reviewed, verifying immediate and delayed adverse events. There were no immediate major adverse events of neurologic injury or hemorrhage in 16,638 consecutive procedures in all spine segments (14,956 TFESI; 1,682 ILESI). Vasovagal reactions occurred in 1.2% of procedures, more frequently (P = 0.004) in TFESI (1.3%) than ILESI (0.5%). Dural punctures occurred in 0.06% of procedures, more commonly after ILESI (0.2% vs 0.04%, P = 0.006). Delayed follow-up of PM&R patients (92.5% and 78.5%, next business day) and radiology patients (63.1%, 2 weeks) identified no major adverse events of neurologic injury, hemorrhage, or infection. There were no significant differences in delayed minor adverse event rates. A central steroid response (sleeplessness, flushing, nonpositional headache) was seen in 2.6% of both TFESI and ILESI patients. Increased pain was reported by 2.1% of TFESI and 1.8% of ILESI patients. No long-term sequelae were seen from any immediate or delayed minor adverse event. Both TFESI and ILESI are safely performed with low immediate and delayed adverse event rates when informed by evidence-based procedural guidelines. By demonstrating comparable safety, this study suggests that the choice between ILESI and TFESI can be based on documented efficacy and effectiveness and need not be driven by safety concerns.
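    The route-by-route comparisons above (e.g. vasovagal reactions, P = 0.004) are, at heart, two-proportion tests. A minimal sketch, with counts reconstructed approximately from the reported percentages (illustrative, not the study's exact tallies or its actual statistical method):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Vasovagal reactions: ~1.3% of 14,956 TFESI vs ~0.5% of 1,682 ILESI
z, p = two_proportion_z(194, 14956, 8, 1682)
# p lands in the neighbourhood of the reported P = 0.004
```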

  13. Demonstration of quantum synchronization based on second-order quantum coherence of entangled photons

    PubMed Central

    Quan, Runai; Zhai, Yiwei; Wang, Mengmeng; Hou, Feiyan; Wang, Shaofeng; Xiang, Xiao; Liu, Tao; Zhang, Shougang; Dong, Ruifang

    2016-01-01

    Based on the second-order quantum interference between frequency-entangled photons generated by parametric down-conversion, a quantum strategic algorithm for synchronizing two spatially separated clocks has recently been presented. In the reference frame of a Hong-Ou-Mandel (HOM) interferometer, photon correlations are used to define simultaneous events. Once the HOM interferometer is balanced by use of an adjustable optical delay in one arm, the arrival times of simultaneously generated photons are recorded by each clock. The clock offset is determined by correlation measurement of the recorded arrival times. Utilizing this algorithm, we demonstrate a proof-of-principle experiment for synchronizing two clocks separated by a 4 km fiber link. A minimum timing stability of 0.44 ps at an averaging time of 16,000 s is achieved, with an absolute time accuracy of 73.2 ps. The timing stability is verified to be limited by the correlation measurement device and ideally can be better than 10 fs. These results point to the application of quantum clock synchronization in real high-accuracy timing systems. PMID:27452276
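    The offset-from-correlation step can be sketched in software: build a histogram of pairwise time differences between the two clocks' timestamp records and take the lag with the most coincidences. A minimal illustration with idealized, jitter-free timestamps, not a model of the actual correlation measurement device:

```python
from collections import Counter

def estimate_offset(times_a, times_b, bin_ps=10, window_ps=200_000):
    """Estimate the offset between two clocks from recorded arrival
    times (in picoseconds) of simultaneously generated photon pairs.

    Histograms pairwise time differences within a coincidence window
    and returns the bin centre with the most coincidences.
    """
    hist = Counter()
    for ta in times_a:
        for tb in times_b:
            d = tb - ta
            if abs(d) <= window_ps:
                hist[round(d / bin_ps)] += 1
    best_bin, _ = max(hist.items(), key=lambda kv: kv[1])
    return best_bin * bin_ps

# Clock B runs 73 ps ahead of clock A; pairs arrive every microsecond
a = [i * 1_000_000 for i in range(100)]
b = [t + 73 for t in a]
```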

  14. When leaving your ex, love yourself: observational ratings of self-compassion predict the course of emotional recovery following marital separation.

    PubMed

    Sbarra, David A; Smith, Hillary L; Mehl, Matthias R

    2012-03-01

    Divorce is a highly stressful event, and much remains to be learned about the factors that promote psychological resilience when marriages come to an end. In this study, divorcing adults (N = 109) completed a 4-min stream-of-consciousness recording about their marital separation at an initial laboratory visit. Four judges rated the degree to which participants exhibited self-compassion (defined by self-kindness, an awareness of one's place in shared humanity, and emotional equanimity) in their recordings. Judges evidenced considerable agreement in their ratings of participants' self-compassion, and these ratings demonstrated strong predictive utility: Higher levels of self-compassion at the initial visit were associated with less divorce-related emotional intrusion into daily life at the start of the study, and this effect persisted up to 9 months later. These effects held when we accounted for a number of competing predictors. Self-compassion is a modifiable variable, and if our findings can be replicated, they may have implications for improving the lives of divorcing adults.

  15. Digitizing Medicines for Remote Capture of Oral Medication Adherence Using Co‐encapsulation

    PubMed Central

    Peloquin, C; Santillo, F; Haubrich, R; Muttera, L; Moser, K; Savage, GM; Benson, CA; Blaschke, TF

    2017-01-01

    High‐resolution measurement of medication adherence is essential to personalized drug therapy. A US Food and Drug Administration (FDA)‐cleared device, using an edible ingestion sensor (IS), external wearable patch, and paired mobile device can detect and record ingestion events. Oral medications must be combined with an IS to generate precise “digitized‐medication” ingestion records. We developed a Good Manufacturing Practice protocol to repackage oral medications with the IS within certified Capsugel capsules, termed co‐encapsulation (CoE). A randomized bioequivalence study of CoE‐IS‐Rifamate (Isoniazid/Rifampin 150/300 mg) vs. native‐Rifamate was conducted in 12 patients with active Mycobacterium tuberculosis and demonstrated bioequivalence using the population method ratio test (95% confidence interval). Subsequently, CoE‐IS‐medications across all biopharmaceutical classes underwent in vitro dissolution testing utilizing USP and FDA guidelines. CoE‐IS medications tested met USP dissolution specifications and were equivalent to their native formulations. CoE combines oral medications with the IS without altering the quality of the native formulation, generating “digitized” medications for remote capture of dosing histories. PMID:28597911

  16. Building an Ontology for Identity Resolution in Healthcare and Public Health.

    PubMed

    Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P; Clyde, Stephen; Thornton, Sidney; Staes, Catherine

    2015-01-01

    Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Our objectives were to: 1) identify components of identity and events subsequent to birth that result in the creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage.

  17. A novel method for inferring RFID tag reader recordings into clinical events.

    PubMed

    Chang, Yung-Ting; Syed-Abdul, Shabbir; Tsai, Chung-You; Li, Yu-Chuan

    2011-12-01

    Nosocomial infections (NIs) are among the important indicators used for evaluating patient safety and hospital performance during accreditation of hospitals. The NI rate is higher in Intensive Care Units (ICUs) than in general wards because patients require intense care involving both invasive and non-invasive clinical procedures. The emergence of superbugs is motivating health providers to enhance infection control measures. Contact behavior between health caregivers and patients is one of the main causes of cross infections. In this technology-driven era, remote monitoring of patients and caregivers in the hospital setting can be performed reliably, and thus is in demand. Proximity sensing using radio frequency identification (RFID) technology can be helpful in capturing and keeping track of all contact history between health caregivers and patients. This study intended to extend the use of RFID proximity sensing by proposing a model for inferring RFID tag reader recordings into clinical events. The aims of the study are twofold. The first aim is to set up a Contact History Inferential Model (CHIM) between health caregivers and patients. The second is to verify CHIM with real-time observation done at the ICU ward. A pre-study was conducted, followed by two study phases. During the pre-study, RFID proximity sensing was tested and the system was deployed in the Clinical Skill Center of one of the medical centers in Taiwan. We simulated clinical events and developed CHIM using variables such as duration of time, frequency, and identity (tag) numbers assigned to caregivers. All clinical proximity events are classified into close-in events, contact events, and invasive events. During the first phase, three observers were recruited to make real-time recordings of all clinical events in the Clinical Skill Center with the deployed automated RFID interaction recording system.
The observations were used to verify the CHIM recordings. In the second phase, the first author conducted 40 h of participatory observation in the ICU, and the observed values were used as the gold standard to validate CHIM. There were a total of 193 events with which to validate the CHIM in the second phase. The sensitivity, specificity, and accuracy were 73.8%, 83.8%, and 81.6% for close-in events; 81.4%, 78.8%, and 80.7% for contact events; and 90.9%, 98.0%, and 97.5% for invasive events, respectively. The results of the study indicated that RFID proximity sensing detects proximity events effectively, and that the CHIM can infer proximity events accurately. RFID technology can be used for recording complete clinical contact history between caregivers and patients, thus assisting in tracing the causes of NIs. Since this model could infer ICU activities accurately, we are convinced that the CHIM can also be applied in other wards and used for additional purposes. 2011 Elsevier Ireland Ltd. All rights reserved.
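    The validation figures above are standard confusion-matrix metrics. For reference, a minimal sketch (the example counts are hypothetical, not the study's data):

```python
def validation_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from a confusion matrix,
    as used to validate inferred events against observed ones.

    tp/fn: events that occurred and were / were not inferred
    tn/fp: non-events correctly / incorrectly inferred as events
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical: 8 true positives, 1 false positive,
# 9 true negatives, 2 false negatives
sens, spec, acc = validation_metrics(8, 1, 9, 2)  # -> 0.8, 0.9, 0.85
```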

  18. News Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

    NASA Astrophysics Data System (ADS)

    2011-01-01

    Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events

  19. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    PubMed

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

    Recently, business intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain knowledge of processes, check compliance, and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE-based information systems.
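    A minimal sketch of such an audit-trail-to-XES transformation, grouping audit entries into per-case traces. The input field names (`patient`, `action`, `timestamp`) are illustrative, not the actual ATNA message schema:

```python
import xml.etree.ElementTree as ET

def audit_to_xes(audit_events):
    """Convert simplified audit-trail entries into a minimal XES log.

    audit_events : list of dicts with 'patient' (case id), 'action'
    (event name) and 'timestamp' (ISO 8601) keys.
    """
    log = ET.Element("log", {"xes.version": "1.0"})
    traces = {}
    for rec in audit_events:
        case = rec["patient"]
        if case not in traces:
            # One XES trace per case, named after the case id
            trace = ET.SubElement(log, "trace")
            ET.SubElement(trace, "string",
                          {"key": "concept:name", "value": case})
            traces[case] = trace
        event = ET.SubElement(traces[case], "event")
        ET.SubElement(event, "string",
                      {"key": "concept:name", "value": rec["action"]})
        ET.SubElement(event, "date",
                      {"key": "time:timestamp", "value": rec["timestamp"]})
    return ET.tostring(log, encoding="unicode")
```

The resulting XML can be fed to process-mining toolkits that read XES.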

  20. Some Physics Constraints on Ultimate Achievement in Track and Field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frohlich, Cliff

    2009-02-06

    World records in track and field have improved remarkably throughout the last 100 years; however, in several events physics places quite strict limitations on ultimate performance. For example, analysis suggests that records in the broad jump and pole vault have approached their optimum possible values. Physical constraints are more subtle for events such as the javelin, high jump, and the distance races, and thus there may be opportunities for “breakthroughs” in current records. Considering that there is enormous cultural interest and economic expenditure on sports, for most events the level of scientific analysis isn’t very high. This presents a research opportunity for fans who are engineers or physicists.

  1. University of Michigan lecture archiving and related activities of the U-M ATLAS Collaboratory Project

    NASA Astrophysics Data System (ADS)

    Herr, J.; Bhatnagar, T.; Goldfarb, S.; Irrer, J.; McKee, S.; Neal, H. A.

    2008-07-01

    Large scientific collaborations as well as universities have a growing need for multimedia archiving of meetings and courses. Collaborations need to disseminate training and news to their wide-ranging members, and universities seek to provide their students with more useful studying tools. The University of Michigan ATLAS Collaboratory Project has been involved in the recording and archiving of multimedia lectures since 1999. Our software and hardware architecture has been used to record events for CERN, ATLAS, many units inside the University of Michigan, Fermilab, the American Physical Society and the International Conference on Systems Biology at Harvard. Until 2006 our group functioned primarily as a tiny research/development team with special commitments to the archiving of certain ATLAS events. In 2006 we formed the MScribe project, using a larger-scale, highly automated recording system to record and archive eight University courses in a wide array of subjects. Several robotic carts are wheeled around campus by unskilled student helpers to automatically capture and post to the Web audio, video, slides and chalkboard images. The advances the MScribe project has made in automation of these processes, including a robotic camera operator and automated video processing, are now being used to record ATLAS Collaboration events, making them available more quickly than before and enabling the recording of more events.

  2. Regional Arctic and Hemispheric Teleconnections expressed in the paleoenvironmental record of El'gygytgyn Lake, NE Russia

    NASA Astrophysics Data System (ADS)

    Brigham-Grette, J.; Melles, M.; Deconto, R.; Koenig, S.

    2007-12-01

    The common goal of recovering long, high-resolution records is to test relevant questions of Earth system dynamics, as well as to document the drivers of regional- and global-scale change. Lake El'gygytgyn, located 100 km north of the Arctic Circle in NE Russia, is a target for deep drilling in Spring 2009 to recover a continuous record back to ~3.6 My. Pilot cores dating back 250 ka to 300 ka provide the impetus for evaluating the sensitivity of the Arctic to regional and global climate events on millennial timescales. A clear record of the Younger Dryas, rapid change within MIS 3, and events including interstadials 19 and 20, events within Stage 5, and events at the end of Stage 6 seen in Greenland and marine records suggests that oceanographic and atmospheric changes over the North Atlantic are reflected in hydrologic and seasonal temperature proxies. Rapid events are recorded despite demonstrated precessional influences and the suggested upwind influence of the Eurasian Ice Sheet and dramatic changes in continentality due to changes in sea level across the Bering/Chukchi shelves and the extent and seasonal persistence of sea ice in the Arctic Ocean and deeper Bering Sea. Regionally, lake cores throughout Beringia reflect patterns of precipitation and temperature that point to persistent zonal differences in the response of the landscape to environmental change.

  3. Tales from the South (and West) Pacific in the Common Era: A Climate Proxy Perspective (Invited)

    NASA Astrophysics Data System (ADS)

    Quinn, T. M.; Taylor, F. W.; Partin, J. W.; Maupin, C. R.; Hereid, K. A.; Gorman, M. K.

    2010-12-01

    The southwest Pacific is a major source of tropical climate variability through heat and moisture exchanges associated with the Western Pacific Warm Pool (WPWP) and the South Pacific Convergence Zone (SPCZ). These variations are especially significant at the annual, interannual (El Niño-Southern Oscillation, ENSO), and multi-decadal timescales. Gridded SST data products are available in the pre-satellite era in this region for the past ~130 years, although data density is a significant issue for the older half of these records. Time series of salinity (SSS) and rainfall from this region are exceedingly rare. Thus, climate proxy records must be used to reconstruct SST, SSS, and rainfall variations in the Common Era (CE) in the tropical Pacific. The analytical laboratory for paleoclimate studies at UT has focused its research efforts into producing climate proxy time series from southwest tropical Pacific using modern and fossil corals, and speleothems. Our most recent results are summarized in this presentation, although much of this work is still in progress. Coral climate records have been generated from Sabine Bank, Vanuatu (16°S, 166°E) and Misima Island, Papua New Guinea (10.6°S, 152.8°E). The Vanuatu coral record of monthly resolved Sr/Ca variations extends back to the late 18th century. All strong ENSO warm phase events of the 20th century observed in the instrumental record are also observed in the coral record. We note that several ENSO warm phase events in the 19th century portion of the coral record are comparable in size to that recorded in response to the 1982/1983 and 1997/1998 events. The Misima coral record of monthly resolved δ18O and Sr/Ca variations spans the interval ~1414-1645 CE — the heart of the Little Ice Age. Amplitude modulation of interannual variability is observed in this LIA record, much like what is observed during the relatively quiescent period of 1920-1950 in the 20th century instrumental and proxy records of ENSO. 
However, the amplitude of individual ENSO warm phase events in the LIA record is reduced, relative to that of the 1941/1942 ENSO warm phase events observed in a near modern coral record from Misima. Speleothem climate records have been generated from Espirito Santo, Vanuatu (15.5°S, 167°E) and Guadalcanal, Solomon Islands (~9°S, 160°E). The Vanuatu record of δ18O variations is from a fast-growing speleothem (~1-3 mm/year), which yields a record of rainfall variability spanning ~1670-2005 CE, as dated by U-Th disequilibrium techniques. Interannual changes in speleothem δ18O appear to capture ENSO events and subsequent reorganizations of the SPCZ. The Vanuatu speleothem δ18O record also exhibits concentrations of variance on the decadal scale. The Guadalcanal record of δ18O variations is also from a fast-growing speleothem (~1-4 mm/year), which yields a record of rainfall variability spanning ~1650-2010 CE, as dated by U-Th disequilibrium techniques. The δ18O records from both of these stalagmites provide evidence for changes in convection in the equatorial WPWP region of the SPCZ: the rising limb of the Pacific Walker Circulation.

  4. Utilizing the PREPaRE Model When Multiple Classrooms Witness a Traumatic Event

    ERIC Educational Resources Information Center

    Bernard, Lisa J.; Rittle, Carrie; Roberts, Kathy

    2011-01-01

    This article presents an account of how the Charleston County School District responded to an event by utilizing the PREPaRE model (Brock, et al., 2009). The acronym, PREPaRE, refers to a range of crisis response activities: P (prevent and prepare for psychological trauma), R (reaffirm physical health and perceptions of security and safety), E…

  5. Pathways to the Principalship: An Event History Analysis of the Careers of Teachers with Principal Certification

    ERIC Educational Resources Information Center

    Davis, Bradley W.; Gooden, Mark A.; Bowers, Alex J.

    2017-01-01

    Utilizing rich data on nearly 11,000 educators over 17 academic years in a highly diverse context, we examine the career paths of teachers to determine whether and when they transition into the principalship. We utilize a variety of event history analyses, including discrete-time hazard modeling, to determine how an individual's race, gender, and…
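    The discrete-time hazard idea can be sketched with a simple empirical (life-table) estimate: the hazard at year t is the share of teachers still "at risk" in year t who move into the principalship that year. A minimal illustration with hypothetical career tuples, not the authors' full discrete-time hazard model (which adds covariates via logistic regression on person-period data):

```python
def discrete_time_hazard(careers):
    """Empirical discrete-time hazard of becoming a principal.

    careers : list of (years_observed, became_principal) tuples, one
    per certified teacher; a teacher leaves the risk set either by
    transitioning or by censoring at the last observed year.
    """
    horizon = max(years for years, _ in careers)
    hazard = {}
    for t in range(1, horizon + 1):
        at_risk = [c for c in careers if c[0] >= t]
        transitions = sum(
            1 for years, moved in at_risk if moved and years == t
        )
        hazard[t] = transitions / len(at_risk) if at_risk else 0.0
    return hazard
```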

  6. Real-time data acquisition and control system for the measurement of motor and neural data

    PubMed Central

    Bryant, Christopher L.; Gandhi, Neeraj J.

    2013-01-01

    This paper outlines a powerful, yet flexible real-time data acquisition and control system for use in the triggering and measurement of both analog and digital events. Built using the LabVIEW development architecture (version 7.1) and freely available, this system provides precisely timed auditory and visual stimuli to a subject while recording analog data and timestamps of neural activity retrieved from a window discriminator. The system utilizes the most recent real-time (RT) technology to provide not only a guaranteed data acquisition rate of 1 kHz but also a guaranteed system response time of 1 ms, which is much more difficult to achieve. The system interface is Windows-based and easy to use, providing a host of configurable options for end-user customization. PMID:15698659

  7. Nanosecond time resolved x-ray diagnostics of relativistic electron beam initiated events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuswa, Glenn W.; Chang, James

    The dynamic behavior of a test sample during and shortly after it has been irradiated by an intense relativistic electron beam (REB) is of great interest to the study of beam energy deposition. Since the sample densities are far beyond the cutoff in the optical region, flash x-radiography techniques have been developed to diagnose the evolution of the samples. The conventional approach to analyzing the dynamic behavior of solid densities utilizes one or more short x-ray bursts to record images on photographic emulsion. This technique is not useful in the presence of the intense x-rays from the REB interacting with the sample. We report two techniques for isolating the film package from the REB x-ray pulse.

  8. Marshall Space Flight Center 1960-1985: 25th anniversary report

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Marshall Space Flight Center marks its 25th anniversary with a record of notable achievements. These accomplishments are the essence of the Marshall Center's history. Behind the scenes of the space launches and missions, however, lies the story of challenges faced and problems solved. The highlights of that story are presented. The story is organized not as a straight chronology but as three parallel reviews of the major assignments: propulsion systems and launch vehicles, space science research and technology, and manned space systems. The general goals were to reach space, to know and understand the space environment, and to inhabit and utilize space for the benefit of mankind. Also included is a chronology of major events, presented as a fold-out chart for ready reference.

  9. Spectrum slicer for snapshot spectral imaging

    NASA Astrophysics Data System (ADS)

    Tamamitsu, Miu; Kitagawa, Yutaro; Nakagawa, Keiichi; Horisaki, Ryoichi; Oishi, Yu; Morita, Shin-ya; Yamagata, Yutaka; Motohara, Kentaro; Goda, Keisuke

    2015-12-01

    We propose and demonstrate an optical component that overcomes critical limitations in our previously demonstrated high-speed multispectral videography, a method in which an array of periscopes placed in a prism-based spectral shaper is used to achieve snapshot multispectral imaging with the frame rate limited only by that of an image-recording sensor. The demonstrated optical component consists of a slicing mirror incorporated into a 4f-relaying lens system that we refer to as a spectrum slicer (SS). With its simple design, we can easily increase the number of spectral channels without adding fabrication complexity while preserving the capability of high-speed multispectral videography. We present a theoretical framework for the SS and demonstrate its experimental utility for spectral imaging by showing real-time monitoring of a dynamic colorful event through five different visible windows.

  10. The little ice age as recorded in the stratigraphy of the tropical quelccaya ice cap.

    PubMed

    Thompson, L G; Mosley-Thompson, E; Dansgaard, W; Grootes, P M

    1986-10-17

    The analyses of two ice cores from a southern tropical ice cap provide a record of climatic conditions over 1000 years for a region where other proxy records are nearly absent. Annual variations in visible dust layers, oxygen isotopes, microparticle concentrations, conductivity, and identification of the historical (A.D. 1600) Huaynaputina ash permit accurate dating and time-scale verification. The fact that the Little Ice Age (about A.D. 1500 to 1900) stands out as a significant climatic event in the oxygen isotope and electrical conductivity records confirms the worldwide character of this event.

  11. 78 FR 62605 - Privacy Act of 1974; System of Records-Office of Hearings and Appeals (OHA) Records System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ... entity's jurisdiction. (4) Enforcement Disclosure. In the event that information in this system of...) Litigation and Alternative Dispute Resolution (ADR) Disclosures. (a) Introduction. In the event that one of....S.C. chapter 71 when relevant and necessary to their duties of exclusive representation. (9) Freedom...

  12. Developmental Regression and Autism Reported to the Vaccine Adverse Event Reporting System

    ERIC Educational Resources Information Center

    Woo, Emily Jane; Ball, Robert; Landa, Rebecca; Zimmerman, Andrew W.; Braun, M. Miles

    2007-01-01

    We report demographic and clinical characteristics of children reported to the US Vaccine Adverse Event Reporting System (VAERS) as having autism or another developmental disorder after vaccination. We completed 124 interviews with parents and reviewed medical records for 31 children whose records contained sufficient information to evaluate the…

  13. 49 CFR 563.9 - Data capture.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 6 2011-10-01 2011-10-01 false Data capture. 563.9 Section 563.9 Transportation..., DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.9 Data capture. Link to an amendment published at 76 FR 47489, Aug. 5, 2011. The EDR must capture and record the data elements for events in accordance...

  14. Readiness of the ATLAS detector: Performance with the first beam and cosmic data

    NASA Astrophysics Data System (ADS)

    Pastore, F.

    2010-05-01

    During 2008 the ATLAS experiment went through an intense period of preparation to have the detector fully commissioned for the first beam period. In the roughly 30 h of beam time available to ATLAS in 2008, the systems went through a rapid setup sequence, from successfully recording the first bunch ever to reach ATLAS, to setting up the timing of the trigger system synchronous with the incoming single beams. So-called splash events were recorded, in which the beam was stopped on a collimator 140 m upstream of ATLAS, showering the experiment with millions of particles per beam shot. These events were found to be extremely useful for timing setup. After the stop of beam operation, the experiment went through an extensive cosmic ray data-taking campaign, recording more than 500 million cosmic ray events. These events have been used to make significant progress on the calibration and alignment of the detector. This paper describes the commissioning programme and the results obtained from both the single-beam data and the cosmic data recorded in 2008.

  15. Cosmic ray event in 994 C.E. recorded in radiocarbon from Danish oak

    NASA Astrophysics Data System (ADS)

    Fogtmann-Schulz, A.; Østbø, S. M.; Nielsen, S. G. B.; Olsen, J.; Karoff, C.; Knudsen, M. F.

    2017-08-01

    We present measurements of radiocarbon in annual tree rings from the time period 980-1006 Common Era (C.E.), hereby covering the cosmic ray event in 994 C.E. The new radiocarbon record from Danish oak is based on both earlywood and latewood fractions of the tree rings, which makes it possible to study seasonal variations in 14C production. The measurements show a rapid increase of ˜10‰ from 993 to 994 C.E. in latewood, followed by a modest decline and relatively high values over the ensuing ˜10 years. This rapid increase occurs from 994 to 995 C.E. in earlywood, suggesting that the cosmic ray event most likely occurred during the period between April and June 994 C.E. Our new record from Danish oak shows strong agreement with existing Δ14C records from Japan, thus supporting the hypothesis that the 994 C.E. cosmic ray event was uniform throughout the Northern Hemisphere and therefore can be used as an astrochronological tie point to anchor floating chronologies of ancient history.

  16. 18 CFR 125.2 - General instructions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., DEPARTMENT OF ENERGY ACCOUNTS, FEDERAL POWER ACT PRESERVATION OF -RECORDS OF PUBLIC UTILITIES AND LICENSEES... books of account and other records prepared by or on behalf of the public utility or licensee. See item... appropriate in the public interest or for the protection of investors or consumers. (b) Designation of...

  17. 15 CFR 784.3 - Scope and conduct of complementary access.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Utilize radiation detection and measurement devices; (iii) Utilize non-destructive measurements and sampling; (iv) Examine relevant records (i.e., records appropriate for the purpose of complementary access...); (v) Perform location-specific environmental sampling; and Note to § 784.3(b)(1)(v): BIS will not seek...

  18. A strong-motion database from the Central American subduction zone

    NASA Astrophysics Data System (ADS)

    Arango, Maria Cristina; Strasser, Fleur O.; Bommer, Julian J.; Hernández, Douglas A.; Cepeda, Jose M.

    2011-04-01

    Subduction earthquakes along the Pacific Coast of Central America generate considerable seismic risk in the region. The quantification of the hazard due to these events requires the development of appropriate ground-motion prediction equations, for which purpose a database of recordings from subduction events in the region is indispensable. This paper describes the compilation of a comprehensive database of strong ground-motion recordings obtained during subduction-zone events in Central America, focusing on the region from 8 to 14° N and 83 to 92° W, including Guatemala, El Salvador, Nicaragua and Costa Rica. More than 400 accelerograms recorded by the networks operating across Central America during the last decades have been added to data collected by NORSAR in two regional projects for the reduction of natural disasters. The final database consists of 554 triaxial ground-motion recordings from events of moment magnitudes between 5.0 and 7.7, including 22 interface and 58 intraslab-type events for the time period 1976-2006. Although the database presented in this study is not sufficiently complete in terms of magnitude-distance distribution to serve as a basis for the derivation of predictive equations for interface and intraslab events in Central America, it considerably expands the Central American subduction data compiled in previous studies and used in early ground-motion modelling studies for subduction events in this region. Additionally, the compiled database will allow the assessment of the existing predictive models for subduction-type events in terms of their applicability for the Central American region, which is essential for an adequate estimation of the hazard due to subduction earthquakes in this region.

  19. Cue Utilization and Cognitive Load in Novel Task Performance

    PubMed Central

    Brouwers, Sue; Wiggins, Mark W.; Helton, William; O’Hare, David; Griffin, Barbara

    2016-01-01

    This study was designed to examine whether differences in cue utilization were associated with differences in performance during a novel, simulated rail control task, and whether these differences reflected a reduction in cognitive load. Two experiments were conducted, the first of which involved the completion of a 20-min rail control simulation that required participants to re-route trains that periodically required a diversion. Participants with a greater level of cue utilization recorded a consistently greater response latency, consistent with a strategy that maintained accuracy, but reduced the demands on cognitive resources. In the second experiment, participants completed the rail task, during which a concurrent, secondary task was introduced. The results revealed an interaction, whereby participants with lesser levels of cue utilization recorded an increase in response latency that exceeded the response latency recorded for participants with greater levels of cue utilization. The relative consistency of response latencies for participants with greater levels of cue utilization, across all blocks, despite the imposition of a secondary task, suggested that those participants with greater levels of cue utilization had adopted a strategy that was effectively minimizing the impact of additional sources of cognitive load on their performance. PMID:27064669

  20. Plant microfossil record of the terminal Cretaceous event in the western United States and Canada

    NASA Technical Reports Server (NTRS)

    Nichols, D. J.; Fleming, R. F.

    1988-01-01

    Plant microfossils, principally pollen grains and spores produced by land plants, provide an excellent record of the terminal Cretaceous event in nonmarine environments. The record indicates regional devastation of the latest Cretaceous vegetation with the extinction of many groups, followed by a recolonization of the earliest Tertiary land surface, and development of a permanently changed land flora. The regional variations in depositional environments, plant communities, and paleoclimates provide insight into the nature and effects of the event, which were short-lived but profound. The plant microfossil data support the hypothesis that an abruptly initiated, major ecological crisis occurred at the end of the Cretaceous. Disruption of the Late Cretaceous flora ultimately contributed to the rise of modern vegetation. The plant microfossils together with geochemical and mineralogical data are consistent with an extraterrestrial impact having been the cause of the terminal Cretaceous event.

  1. Early Detection of Heart Failure Using Electronic Health Records: Practical Implications for Time Before Diagnosis, Data Diversity, Data Quantity, and Data Density.

    PubMed

    Ng, Kenney; Steinhubl, Steven R; deFilippi, Christopher; Dey, Sanjoy; Stewart, Walter F

    2016-11-01

    Using electronic health records data to predict events and onset of diseases is increasingly common. Relatively little is known, however, about the tradeoffs between data requirements and model utility. We examined the performance of machine learning models trained to detect prediagnostic heart failure in primary care patients using longitudinal electronic health records data. Model performance was assessed in relation to data requirements defined by the prediction window length (time before clinical diagnosis), the observation window length (duration of observation before prediction window), the number of different data domains (data diversity), the number of patient records in the training data set (data quantity), and the density of patient encounters (data density). A total of 1684 incident heart failure cases and 13 525 sex-, age-category-, and clinic-matched controls were used for modeling. Model performance improved as (1) the prediction window length decreased, especially below 2 years; (2) the observation window length increased, leveling off after 2 years; (3) the training data set size increased, leveling off after 4000 patients; (4) more diverse data types were used, with the combination of diagnosis, medication order, and hospitalization data being most important; and (5) data were confined to patients who had ≥10 phone or face-to-face encounters in 2 years. These empirical findings suggest possible guidelines for the minimum amount and type of data needed to train effective disease onset predictive models using longitudinal electronic health records data. © 2016 American Heart Association, Inc.
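    The windowing scheme described above (an observation window that ends a fixed prediction window before the clinical diagnosis) can be sketched as follows. This is an illustrative sketch only: the record structure and field names are hypothetical, not taken from the study.

    ```python
    from datetime import date, timedelta

    def observation_records(encounters, diagnosis_date,
                            prediction_window_days, observation_window_days):
        """Select the encounter records that fall inside the observation window,
        which ends `prediction_window_days` before diagnosis and spans
        `observation_window_days`. Field names here are hypothetical."""
        end = diagnosis_date - timedelta(days=prediction_window_days)
        start = end - timedelta(days=observation_window_days)
        return [e for e in encounters if start <= e["date"] < end]

    encounters = [{"date": date(2014, 6, 1), "domain": "diagnosis"},
                  {"date": date(2015, 6, 1), "domain": "medication"},
                  {"date": date(2016, 5, 1), "domain": "hospitalization"}]
    # 1-year prediction window, 2-year observation window, diagnosis on 2016-07-01:
    selected = observation_records(encounters, date(2016, 7, 1), 365, 730)
    ```

    Shortening `prediction_window_days` or lengthening `observation_window_days` reproduces the tradeoffs examined in the study: more usable encounters per patient, at the cost of predicting closer to diagnosis.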

  2. Measuring Holocene Indian Summer Monsoon Precipitation through Lake Sedimentary Proxies, Eastern Tibet

    NASA Astrophysics Data System (ADS)

    Perello, M. M.; Bird, B. W.; Lei, Y.; Polissar, P. J.; Thompson, L. G.; Yao, T.

    2017-12-01

    The Tibetan Plateau is the headwaters of several major river systems in South Asia, which serve as essential water resources for more than 40% of the world's population. The majority of regional precipitation that sustains these water resources comes from the Indian summer monsoon (ISM), which can experience considerable variability in response to local and remote forcings and teleconnections. Despite the ISM's importance, its sensitivity to long-term and abrupt changes in climatic boundary conditions is not well established from the modern instrumental record or the available body of paleoclimate data. Here, we present results from an ongoing study that utilizes lake sediment records to provide a longer record of relative precipitation and lake level during the monsoon season. The sediment cores used in this study were collected from five lakes along an east-west transect in the Eastern Tibetan Plateau (87-95°E). Using these records, we assess temporal and spatial variability in the intensity of the ISM throughout the Holocene at decadal resolution. Multiple proxies, including sedimentology, grain size, geochemistry, terrestrial and aquatic leaf wax isotopes, and diatom community assemblages, are used to assess paleo-precipitation and lake level. Preliminary records from our lakes indicate regional trends in monsoon strength, with higher lake levels in the Early Holocene, but with greater variability in the Late Holocene than in other regional paleoclimate records. We have also observed weak responses in our lakes to the Late Holocene events, the Medieval Climate Anomaly and the Little Ice Age. These paleoclimate reconstructions further our understanding of strong versus weak monsoon intensities and can be incorporated in climate models for predicting future monsoon conditions.

  3. Identifying patient safety problems associated with information technology in general practice: an analysis of incident reports.

    PubMed

    Magrabi, Farah; Liaw, Siaw Teng; Arachi, Diana; Runciman, William; Coiera, Enrico; Kidd, Michael R

    2016-11-01

    To identify the categories of problems with information technology (IT), which affect patient safety in general practice. General practitioners (GPs) reported incidents online or by telephone between May 2012 and November 2013. Incidents were reviewed against an existing classification for problems associated with IT and the clinical process impacted. 87 GPs across Australia. Types of problems, consequences and clinical processes. GPs reported 90 incidents involving IT which had an observable impact on the delivery of care, including actual patient harm as well as near miss events. Practice systems and medications were the most affected clinical processes. Problems with IT disrupted clinical workflow, wasted time and caused frustration. Issues with user interfaces, routine updates to software packages and drug databases, and the migration of records from one package to another generated clinical errors that were unique to IT; some could affect many patients at once. Human factors issues gave rise to some errors that have always existed with paper records but are more likely to occur and cause harm with IT. Such errors were linked to slips in concentration, multitasking, distractions and interruptions. Problems with patient identification and hybrid records generated errors that were in principle no different to paper records. Problems associated with IT include perennial risks with paper records, but additional disruptions in workflow and hazards for patients unique to IT, occasionally affecting multiple patients. Surveillance for such hazards may have general utility, but particularly in the context of migrating historical records to new systems and software updates to existing systems. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  4. Future of electronic health records: implications for decision support.

    PubMed

    Rothman, Brian; Leonard, Joan C; Vigoda, Michael M

    2012-01-01

    The potential benefits of the electronic health record over traditional paper are many, including cost containment, reductions in errors, and improved compliance by utilizing real-time data. The highest functional level of the electronic health record (EHR) is clinical decision support (CDS) and process automation, which are expected to enhance patient health and healthcare. The authors provide an overview of the progress in using patient data more efficiently and effectively through clinical decision support to improve health care delivery, how decision support impacts anesthesia practice, and how some are leading the way using these systems to solve need-specific issues. Clinical decision support uses passive or active decision support to modify clinician behavior through recommendations of specific actions. Recommendations may reduce medication errors, which would result in considerable savings by avoiding adverse drug events. In selected studies, clinical decision support has been shown to decrease the time to follow-up actions, and prediction has proved useful in forecasting patient outcomes, avoiding costs, and correctly prompting treatment plan modifications by clinicians before engaging in decision-making. An electronic health record improves clinical documentation accuracy and completeness and delivers more relevant care data. Clinical decision support may increase clinician adherence to clinical guidelines, but educational workshops may be equally effective. Unintended consequences of clinical decision support, such as alert desensitization, can decrease the effectiveness of a system. Current anesthesia clinical decision support use includes antibiotic administration timing, improved documentation, more timely billing, and postoperative nausea and vomiting prophylaxis. Electronic health record implementation offers data-mining opportunities to improve operational, financial, and clinical processes.
Using electronic health record data in real-time for decision support and process automation has the potential to both reduce costs and improve the quality of patient care. © 2012 Mount Sinai School of Medicine.

  5. Tracing the incorporation of carbon into benthic foraminiferal calcite following the Deepwater Horizon event.

    PubMed

    Schwing, Patrick T; Chanton, Jeffrey P; Romero, Isabel C; Hollander, David J; Goddard, Ethan A; Brooks, Gregg R; Larson, Rebekka A

    2018-06-01

    Following the Deepwater Horizon (DWH) event in 2010, hydrocarbons were deposited on the continental slope in the northeastern Gulf of Mexico through marine oil snow sedimentation and flocculent accumulation (MOSSFA). The objective of this study was to test the hypothesis that benthic foraminiferal δ13C would record this depositional event. From December 2010 to August 2014, a time-series of sediment cores was collected at two impacted sites and one control site in the northeastern Gulf of Mexico. Short-lived radioisotopes (210Pb and 234Th) were employed to establish the pre-DWH, DWH, and post-DWH intervals. Benthic foraminifera (Cibicidoides spp. and Uvigerina spp.) were isolated from these intervals for δ13C measurement. A modest (0.2-0.4‰), but persistent δ13C depletion in the DWH intervals of impacted sites was observed over a two-year period. This difference was significantly beyond the pre-DWH (background) variability and demonstrated that benthic foraminiferal calcite recorded the depositional event. The longevity of the depletion in the δ13C record suggested that benthic foraminifera may have recorded the change in organic matter caused by MOSSFA from 2010 to 2012. These findings have implications for assessing the subsurface spatial distribution of the DWH MOSSFA event. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiogama, Hideo; Imada, Yukiko; Mori, Masato

    Here, we describe two unprecedentedly large (100-member), long-term (61-year) ensembles based on MRI-AGCM3.2, which were driven by historical and non-warming climate forcing. These ensembles comprise the "Database for Policy Decision making for Future climate change (d4PDF)". We compare these ensembles to large ensembles based on another climate model, as well as to observed data, to investigate the influence of anthropogenic activities on historical changes in the numbers of record-breaking events, including: the annual coldest daily minimum temperature (TNn), the annual warmest daily maximum temperature (TXx) and the annual most intense daily precipitation event (Rx1day). These two climate model ensembles indicate that human activity has already had statistically significant impacts on the number of record-breaking extreme events worldwide, mainly over Northern Hemisphere land. Specifically, human activities have altered the likelihood that a wider area globally would suffer record-breaking TNn, TXx and Rx1day events than that observed over the 2001-2010 period by a factor of at least 0.6, 5.4 and 1.3, respectively. However, we also find that the estimated spatial patterns and amplitudes of anthropogenic impacts on the probabilities of record-breaking events are sensitive to the climate model and/or natural-world boundary conditions used in the attribution studies.
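    The notion of a record-breaking event used above (e.g. a new annual TXx or Rx1day extreme) can be illustrated with a minimal running-extreme counter. This sketch assumes the simplest definition of a record (a value exceeding every earlier value in the series, with the first value counted as a record); it is not the d4PDF attribution procedure itself, and the sample values are invented.

    ```python
    def count_records(series, kind="high"):
        """Count record-breaking values in a time series: entries that exceed
        (kind="high") or fall below (kind="low") every earlier entry.
        The first value counts as a record."""
        records, best = 0, None
        for x in series:
            if best is None or (x > best if kind == "high" else x < best):
                records, best = records + 1, x
        return records

    txx = [31.2, 33.0, 32.5, 34.1, 33.9, 35.6]  # hypothetical annual TXx values (°C)
    count_records(txx)           # record-warm years in this series: 4
    count_records(txx, "low")    # record-cold years: only the first value, 1
    ```

    In a warming climate, "high" records in temperature series become more frequent and "low" records rarer, which is the signal the ensemble comparison above is designed to attribute.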

  7. 40 CFR 86.1370-2007 - Not-To-Exceed test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... that include discrete regeneration events and that send a recordable electronic signal indicating the start and end of the regeneration event, determine the minimum averaging period for each NTE event that... averaging period is used to determine whether the individual NTE event is a valid NTE event. For engines...

  8. The 9.2 ka event in Asian summer monsoon area: the strongest millennial scale collapse of the monsoon during the Holocene

    NASA Astrophysics Data System (ADS)

    Zhang, Wenchao; Yan, Hong; Dodson, John; Cheng, Peng; Liu, Chengcheng; Li, Jianyong; Lu, Fengyan; Zhou, Weijian; An, Zhisheng

    2018-04-01

    Numerous Holocene paleo-proxy records exhibit a series of centennial- to millennial-scale rapid climatic events. Unlike the widely acknowledged 8.2 ka climate anomaly, the likelihood of a significant climate excursion at around 9.2 cal ka BP, which has been notably recognized in some studies, remains to be fully clarified in terms of its magnitude and intensity, as well as its characteristics and spatial distribution in a range of paleoclimatic records. In this study, a peat sediment profile from the Dajiuhu Basin in central China was collected, and several geochemical proxies and a pollen analysis were carried out to help improve understanding of the climate changes around 9.2 cal ka BP. The results show that peat development was interrupted abruptly at around 9.2 cal ka BP, when chemical weathering strength decreased and tree pollen declined. This suggests that a strong regional drying event occurred at around 9.2 cal ka BP in central China, which was, in turn, probably connected to the rapid 9.2 ka climate event developing worldwide. In addition, based on a synthesis of our peat records and other Holocene hydrological records from the Asian summer monsoon (ASM) region, we further found that the 9.2 ka event probably constituted the strongest abrupt collapse of the Asian monsoon system during the full Holocene interval. Correlations between the ASM and the atmospheric 14C production rate, the North Atlantic drift ice records and Greenland temperature indicate that the weakened ASM event at around 9.2 cal ka BP can be explained by the combined influence of external and internal factors, related to changes in solar activity and the Atlantic Meridional Overturning Circulation (AMOC).

  9. Extreme Flood Events Over the Past 300 Years Recorded in the Sediments of a Mountain Lake in the Altay Mountains, Northwestern China

    NASA Astrophysics Data System (ADS)

    Wu, J.; Zhou, J.; Shen, B.; Zeng, H.

    2017-12-01

    Global climate change has the potential to accelerate the hydrological cycle, which may further enhance the temporal frequency of regional extreme floods. Climatic models predict that intra-annual rainfall variability will intensify, shifting current rainfall regimes towards more extreme systems with lower precipitation frequencies, longer dry periods, and larger individual precipitation events worldwide. Understanding the temporal variations of extreme floods that occur in response to climate change is essential to anticipate trends in flood magnitude and frequency in the context of global warming. However, currently available instrumental data do not extend far enough back to capture the most extreme events, so the acquisition of long-duration datasets for historical floods that extend beyond available instrumental records is clearly an important step in discerning trends in flood frequency and magnitude with respect to climate change. In this study, a reconstruction of paleofloods over the past 300 years was conducted through an analysis of grain sizes from the sediments of Kanas Lake in the Altay Mountains of northwestern China. Grain parameters and frequency distributions both demonstrate that two abrupt environmental changes exist within the lake sedimentary sequence. Based on canonical discriminant analysis (CDA) and C-M pattern analysis, two flood events corresponding to ca. 1760 AD and ca. 1890 AD were identified, both of which occurred during warmer and wetter climate conditions according to tree-ring records. These two flood events are also evidenced by lake sedimentary records in the Altay and Tianshan areas. Furthermore, through a comparison with other records, the flood event in ca. 1760 AD seems to have occurred in both arid central Asia and the Alps in Europe, and thus may have been associated with changes in the North Atlantic Oscillation (NAO) index.

  10. Cognitive Complexity of the Medical Record Is a Risk Factor for Major Adverse Events

    PubMed Central

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Context: Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood because “patient complexity” has been difficult to quantify. Objective: We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. Design: The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. Main Outcome Measures: The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Results: Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. Conclusions: CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time. PMID:24626065
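    As a rough illustration of quantifying "the amount of data (as bits) in the patient's medical record", one could estimate information content from the compressed size of the record text. The Complexity Ruler's actual bit-counting scheme is not specified in the abstract, so this compression-based proxy, and the sample note, are purely assumptions for illustration.

    ```python
    import zlib

    def ccmr_bits(record_text):
        """Crude proxy for the cognitive complexity of the medical record (CCMR):
        information content estimated as the size in bits of the compressed
        record text. A hypothetical stand-in for the paper's bit count."""
        return len(zlib.compress(record_text.encode("utf-8"))) * 8

    note = "Pt with complex cardiac hx; multiple prior admissions and consults. "
    ccmr_bits(note * 50)  # longer, more varied records yield more bits
    ```

    A compression-based estimate has the convenient property that repetitive boilerplate contributes far fewer bits than genuinely new clinical information, which is in the spirit of measuring cognitive load rather than raw character count.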

  11. Deaths in natural hazards in the Solomon Islands.

    PubMed

    Blong, R J; Radford, D A

    1993-03-01

    Archival and library search techniques have been used to establish extensive databases on deaths and damage resulting from natural hazards in the Solomon Islands. Although the records of fatalities are certainly incomplete, volcanic eruptions, tropical cyclones, landslides, tsunami and earthquakes appear to have been the most important. Only 22 per cent of the recorded deaths have resulted from meteorological hazards but a single event could change this proportion significantly. Five events in the fatality database account for 88 per cent of the recorded deaths. Future death tolls are also likely to be dominated by a small number of events. While the expected number of deaths in a given period is dependent upon the length of record considered, it is clear that a disaster which kills one hundred or more people in the Solomons can be expected more frequently than once in a hundred years.

  12. [Demonstrating patient safety requires acceptance of a broader scientific palette].

    PubMed

    Leistikow, I

    2017-01-01

    It is high time the medical community recognised that patient-safety research can be assessed using other scientific methods than the traditional medical ones. There is often a fundamental mismatch between the methodology of patient-safety research and the methodology used to assess the quality of this research. One example is research into the reliability and validity of record review as a method for detecting adverse events. This type of research is based on logical positivism, while record review itself is based on social constructivism. Record review does not lead to "one truth": adverse events are not measured on the basis of the records themselves, but by weighing the probability of certain situations being classifiable as adverse events. Healthcare should welcome behavioural and social sciences to its scientific palette. Restricting ourselves to the randomised control trial paradigm is short-sighted and dangerous; it deprives patients of much-needed improvements in safety.

  13. Atmospheric CO2 and abrupt climate change on submillennial timescales

    NASA Astrophysics Data System (ADS)

    Ahn, Jinho; Brook, Edward

    2010-05-01

    How atmospheric CO2 varies and is controlled on various time scales and under various boundary conditions is important for understanding how the carbon cycle and climate change are linked. Ancient air preserved in ice cores provides important information on past variations in atmospheric CO2. In particular, concentration records for intervals of abrupt climate change may improve understanding of mechanisms that govern atmospheric CO2. We present new multi-decadal CO2 records that cover Greenland stadial 9 (between Dansgaard-Oeschger (DO) events 8 and 9) and the abrupt cooling event at 8.2 ka. The CO2 records come from Antarctic ice cores but are well synchronized with Greenland ice core records using new high-resolution CH4 records, precisely defining the timing of CO2 change with respect to abrupt climate events in Greenland. Previous work showed that during stadial 9 (40~38 ka), CO2 rose by about 15~20 ppm over around 2,000 years, and at the same time temperatures in Antarctica increased. Dust proxies indicate a decrease in dust flux over the same period. With more detailed data and better age controls we now find that approximately half of the CO2 increase during stadial 9 occurred abruptly, over the course of decades to a century at ~39.6 ka. The step increase of CO2 is synchronous with a similar step increase of Antarctic isotopic temperature and a small abrupt change in CH4, and lags the onset of the decrease in dust flux by ~400 years. New atmospheric CO2 records at the well-known ~8.2 ka cooling event were obtained from the Siple Dome ice core, Antarctica. Our preliminary CO2 data span 900 years and include 19 data points within the 8.2 ka cooling event, which persisted for ~160 years (Thomas et al., Quaternary Sci. Rev., 2007). We find that CO2 increased by 2~4 ppm during that cooling event.
Further analyses will improve the resolution and better constrain the CO2 variability during other times in the early Holocene to determine if the variations observed during the 8.2 ka event are significant.

  14. mSpray: a mobile phone technology to improve malaria control efforts and monitor human exposure to malaria control pesticides in Limpopo, South Africa.

    PubMed

    Eskenazi, Brenda; Quirós-Alcalá, Lesliam; Lipsitt, Jonah M; Wu, Lemuel D; Kruger, Philip; Ntimbane, Tzundzukani; Nawn, John Burns; Bornman, M S Riana; Seto, Edmund

    2014-07-01

    Recent estimates indicate that malaria has led to over half a million deaths worldwide, mostly among African children. Indoor residual spraying (IRS) of insecticides is one of the primary vector control interventions. However, current reporting systems do not obtain the precise location of IRS events in relation to malaria cases, which poses challenges for effective and efficient malaria control. This information is also critical to avoid unnecessary human exposure to IRS insecticides. We developed and piloted a mobile-based application (mSpray) to collect comprehensive information on IRS spray events. We assessed the utility, acceptability and feasibility of using mSpray to gather improved homestead- and chemical-level IRS coverage data. We installed mSpray on 10 cell phones with data bundles, and pilot tested it with 13 users in Limpopo, South Africa. Users completed basic information (number of rooms/shelters sprayed; chemical used, etc.) on spray events. Upon submission, this information as well as geographic positioning system coordinates and a time/date stamp were uploaded to a Google Drive spreadsheet to be viewed in real time. We administered questionnaires, conducted focus groups, and interviewed key informants to evaluate the utility of the app. The low-cost, cell phone-based "mSpray" app was learned quickly by users, well accepted and preferred to the current paper-based method. We recorded 2865 entries (99.1% had a GPS accuracy of 20 m or less) and identified areas for improvement, including increased battery life. We also identified a number of logistic and user problems (e.g., cost of cell phones and cellular bundles, battery life, obtaining accurate GPS measures, user errors, etc.) that would need to be overcome before full deployment.
Use of cell phone technology could increase the efficiency of IRS malaria control efforts by mapping spray events in relation to malaria cases, resulting in more judicious use of chemicals that are potentially harmful to humans and the environment. Copyright © 2014. Published by Elsevier Ltd.

  15. Utility of Sleep Stage Transitions in Assessing Sleep Continuity

    PubMed Central

    Laffan, Alison; Caffo, Brian; Swihart, Bruce J.; Punjabi, Naresh M.

    2010-01-01

    Study Objectives: Sleep continuity is commonly assessed with polysomnographic measures such as sleep efficiency, sleep stage percentages, and the arousal index. The aim of this study was to examine whether the transition rate between different sleep stages could be used as an index of sleep continuity to predict self-reported sleep quality independent of other commonly used metrics. Design and Setting: Analysis of the Sleep Heart Health Study polysomnographic data. Participants: A community cohort. Measurements and Results: Sleep recordings on 5,684 participants were deemed to be of sufficient quality to allow visual scoring of NREM and REM sleep. For each participant, we tabulated the frequency of transitions between wake, NREM sleep, and REM sleep. An overall transition rate was determined as the number of all transitions per hour of sleep. Stage-specific transition rates between wake, NREM sleep, and REM sleep were also determined. A 5-point Likert scale was used to assess the subjective experience of restless and light sleep the morning after the sleep study. Multivariable regression models showed that a high overall sleep stage transition rate was associated with restless and light sleep independent of several covariates including total sleep time, percentages of sleep stages, wake time after sleep onset, and the arousal index. Compared to the lowest quartile of the overall transition rate (< 7.76 events/h), the odds ratios for restless sleep were 1.27, 1.42, and 1.38, for the second (7.77–10.10 events/h), third (10.11–13.34 events/h), and fourth (≥ 13.35 events/h) quartiles, respectively. Analysis of stage-specific transition rates showed that transitions between wake and NREM sleep were also independently associated with restless and light sleep. Conclusions: Assessing overall and stage-specific transition rates provides a complementary approach for assessing sleep continuity.
Incorporating such measures, along with conventional metrics, could yield useful insights into the significance of sleep continuity for clinical outcomes. Citation: Laffan A; Caffo B; Swihart BJ; Punjabi NM. Utility of sleep stage transitions in assessing sleep continuity. SLEEP 2010;33(12):1681-1686. PMID:21120130
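    The overall transition rate defined above (all stage transitions per hour of sleep) is straightforward to compute from an epoch-scored hypnogram. This is a minimal sketch: the three-label staging ('W', 'NREM', 'REM') matches the abstract, but the 30-second epoch length and the choice to count only non-wake epochs as sleep time are simplifying assumptions.

    ```python
    def transition_rate(hypnogram, epoch_sec=30):
        """Overall sleep stage transition rate (events per hour of sleep) from an
        epoch-scored hypnogram such as ['W', 'NREM', 'NREM', 'REM', ...].
        Sleep time is approximated here as the total duration of non-wake epochs."""
        transitions = sum(1 for a, b in zip(hypnogram, hypnogram[1:]) if a != b)
        sleep_sec = sum(1 for s in hypnogram if s != "W") * epoch_sec
        return transitions * 3600.0 / sleep_sec if sleep_sec else 0.0

    night = ["W", "NREM", "NREM", "NREM", "REM", "REM", "W", "NREM"]
    rate = transition_rate(night)  # 4 transitions over 0.05 h of sleep -> 80.0 events/h
    ```

    Stage-specific rates, as used in the study, follow the same pattern by counting only the transitions between a chosen pair of stages (e.g. wake and NREM) before normalizing by sleep time.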

  16. Method and apparatus for detecting and determining event characteristics with reduced data collection

    NASA Technical Reports Server (NTRS)

    Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)

    2007-01-01

    A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.
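The passive/active triggering scheme described in the abstract can be illustrated with a minimal software analogue; the class name, sampling interface, and binary sensor states are hypothetical, and a real PLD implements this latching in hardware rather than in a polling loop.

```python
import time

class EventRecorder:
    """Software sketch of the passive/active scheme: nothing is stored
    until a sensor changes state (the triggering event); then the sensor
    index and detection time are latched into memory."""

    def __init__(self, n_sensors):
        self.state = [0] * n_sensors   # last known state of each sensor
        self.log = []                  # (sensor_index, timestamp) records

    def sample(self, readings, t=None):
        t = time.monotonic() if t is None else t
        for i, r in enumerate(readings):
            if r != self.state[i]:     # change of state -> record and latch
                self.log.append((i, t))
                self.state[i] = r
```

Because only state changes are logged, the memory footprint scales with the number of events rather than with the sampling rate, which is the data-reduction point the patent makes.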

  17. South Australian historical earthquakes in the pre-instrumental period 1837-1963: A comprehensive chronicle and analysis of available intensity data

    NASA Astrophysics Data System (ADS)

    Dix, Katherine; Greenhalgh, Stewart

    2014-05-01

    Macroseismic data in the form of felt reports of earthquake shaking is vital to seismic hazard assessment, especially in view of the relatively short period of instrumental recording in many countries. During the early 1990s, we conducted a very detailed examination of historical earthquake records held in the State Government archives and the Public Library (newspaper accounts) of South Australia. This work resulted in the compilation of a list of just over 460 earthquakes in the period prior to seismic network recording, which commenced in 1963. A single Milne (and later Milne-Shaw) seismograph had been operated in Adelaide from 1908 to 1948 to record worldwide events but it was not suitable for studying local seismic activity. The majority of the historical events uncovered had escaped mention in any previous publications on South Australian seismicity and seismic risk. This historical earthquake research, including the production of a large number of isoseismal maps to enable quantification in terms of magnitude and location, appears to have been the only study of its kind in South Australia performed so comprehensively, and resulted in the most extensive list available. After 20 years, it still stands as the definitive list of historical earthquake events in the State. The incorporation of these additional historical events into the South Australian Earthquake Catalogue maintained by the SA Department of Primary Industries and Resources had the potential to raise the previous listing of just 49 pre-instrumental events to 511 earthquakes, and to extend the record back another 46 years to 1837, the date the colony of South Australia was proclaimed. Some of the major events have been formally included in the South Australian Earthquake Catalogue. However, for many events there was insufficient information and/or time to finalize the source parameters due to the onerous task of manually trawling through historical records and newspapers for felt reports. 
With the advent of the information age, researching historical newspapers and records is now a feasible undertaking, although such accounts are biased by the population distribution and the history of newspaper operations in the emerging colony. To provide an example of what is possible, we recovered reports of an additional 110 previously unrecognized earthquakes during the first 50 years since European settlement of South Australia, from digitized SA newspapers recently made available on the National Library of Australia's website called TROVE. This was done in a relatively short period of time and now the South Australian Historical Earthquake List incorporating these events comprises some 679 entries. This research builds upon and consolidates the work that was commenced 20 years ago. By doing so, it proposes the establishment of flexible and convenient computerized processes to maintain well into the future an increasingly accurate and complete record of historical earthquakes in South Australia. This work may also provide a model for the ongoing development of historical earthquake records in other states and territories of Australia.

  18. The Southern Oscillation recorded in the δ18O of corals from Tarawa Atoll

    NASA Astrophysics Data System (ADS)

    Cole, Julia E.; Fairbanks, Richard G.

    1990-10-01

    In the western equatorial Pacific, the El Niño/Southern Oscillation (ENSO) phenomenon is characterized by precipitation variability associated with the migration of the Indonesian low pressure cell to the region of the date line and the equator. During ENSO events, Tarawa Atoll (1°N, 172°E) experiences heavy rainfall which has an estimated δ18O of about -8 to -10‰ (SMOW). At Tarawa, sufficient precipitation of this composition falls during ENSO events to alter the δ18O and the salinity of the surface waters. Oxygen isotope records from two corals collected off the reef crest of Tarawa reflect rainfall variations associated with both weak and strong ENSO conditions, with approximately monthly resolution. Coral skeletal δ18O variations due to small sea surface temperature (SST) changes are secondary. These records demonstrate the remarkable ability of this technique to reconstruct variations in the position of the Indonesian Low from coral δ18O records in the western equatorial Pacific, a region which has few paleoclimatic records. The coral isotopic data correctly resolve the relative magnitudes of recent variations in the Southern Oscillation Index. Combining the Tarawa record with an oxygen isotopic history from a Galápagos Islands coral demonstrates the ability to distinguish the meteorologic (precipitation) and oceanographic (SST) anomalies that characterize ENSO events across the Pacific Basin over the period of common record (1960-1979). Comparison of the intensity of climatic anomalies at these two sites yields insight into the spatial variability of ENSO events. Isotope records from older corals can provide high-resolution, Pacific-wide reconstructions of ENSO behavior during periods of different climate boundary conditions.

  19. An Epidemiological Network Model for Disease Outbreak Detection

    PubMed Central

    Reis, Ben Y; Kohane, Isaac S; Mandl, Kenneth D

    2007-01-01

    Background Advanced disease-surveillance systems have been deployed worldwide to provide early detection of infectious disease outbreaks and bioterrorist attacks. New methods that improve the overall detection capabilities of these systems can have a broad practical impact. Furthermore, most current generation surveillance systems are vulnerable to dramatic and unpredictable shifts in the health-care data that they monitor. These shifts can occur during major public events, such as the Olympics, as a result of population surges and public closures. Shifts can also occur during epidemics and pandemics as a result of quarantines, the worried-well flooding emergency departments or, conversely, the public staying away from hospitals for fear of nosocomial infection. Most surveillance systems are not robust to such shifts in health-care utilization, either because they do not adjust baselines and alert-thresholds to new utilization levels, or because the utilization shifts themselves may trigger an alarm. As a result, public-health crises and major public events threaten to undermine health-surveillance systems at the very times they are needed most. Methods and Findings To address this challenge, we introduce a class of epidemiological network models that monitor the relationships among different health-care data streams instead of monitoring the data streams themselves. By extracting the extra information present in the relationships between the data streams, these models have the potential to improve the detection capabilities of a system. Furthermore, the models' relational nature has the potential to increase a system's robustness to unpredictable baseline shifts. 
We implemented these models and evaluated their effectiveness using historical emergency department data from five hospitals in a single metropolitan area, recorded over a period of 4.5 y by the Automated Epidemiological Geotemporal Integrated Surveillance real-time public health–surveillance system, developed by the Children's Hospital Informatics Program at the Harvard-MIT Division of Health Sciences and Technology on behalf of the Massachusetts Department of Public Health. We performed experiments with semi-synthetic outbreaks of different magnitudes and simulated baseline shifts of different types and magnitudes. The results show that the network models provide better detection of localized outbreaks, and greater robustness to unpredictable shifts than a reference time-series modeling approach. Conclusions The integrated network models of epidemiological data streams and their interrelationships have the potential to improve current surveillance efforts, providing better localized outbreak detection under normal circumstances, as well as more robust performance in the face of shifts in health-care utilization during epidemics and major public events. PMID:17593895
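The relational idea, monitoring the ratios between health-care data streams rather than the raw counts, can be sketched as follows. This is a minimal illustration under stated assumptions: the alerting rule (k standard deviations of the historical ratio) and all names are hypothetical, not the surveillance system's actual method.

```python
import statistics

def ratio_alerts(history, today, k=3.0):
    """Flag stream pairs whose count ratio today deviates from its
    historical distribution by more than k standard deviations.
    `history` is a list of per-day {stream: count} dicts; `today` is one
    such dict for the day under surveillance."""
    alerts = []
    streams = sorted(today)
    for i, a in enumerate(streams):
        for b in streams[i + 1:]:
            past = [d[a] / d[b] for d in history if d.get(b)]
            if len(past) < 2 or not today.get(b):
                continue
            mu, sd = statistics.mean(past), statistics.stdev(past)
            if sd and abs(today[a] / today[b] - mu) > k * sd:
                alerts.append((a, b))
    return alerts
```

Because a population surge tends to inflate all streams together, their ratios stay near baseline, while a localized outbreak disturbs only the pairs involving the affected stream; that is the robustness property the abstract describes.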

  20. The Linked Systems Project: a network interconnection project between three major bibliographic utilities and LC

    NASA Astrophysics Data System (ADS)

    Kurihara, Shin'ichi

    The Linked Systems Project (LSP) is the first network project in the world based on the Open Systems Interconnection (OSI) model. The purpose of the project is to interconnect three major bibliographic utilities and LC so that they function as a single system. The first application developed for the LSP is the sharing of name authority data based on the Name Authority Cooperative (NACO) Project. In 1985, LC began to send name authority records to RLG/RLIN. Since 1987, RLG/RLIN and OCLC have sent name authority records to LC. Bibliographic records will be exchanged among the three major bibliographic utilities and LC in the near future.

  1. A composite pollen-based stratotype for inter-regional evaluation of climatic events in New Zealand over the past 30,000 years (NZ-INTIMATE project)

    NASA Astrophysics Data System (ADS)

    Barrell, David J. A.; Almond, Peter C.; Vandergoes, Marcus J.; Lowe, David J.; Newnham, Rewi M.

    2013-08-01

    Our review of paleoclimate information for New Zealand pertaining to the past 30,000 years has identified a general sequence of climatic events, spanning the onset of cold conditions marking the final phase of the Last Glaciation, through to the emergence to full interglacial conditions in the early Holocene. In order to facilitate more detailed assessments of climate variability and any leads or lags in the timing of climate changes across the region, a composite stratotype is proposed for New Zealand. The stratotype is based on terrestrial stratigraphic records and is intended to provide a standard reference for the intercomparison and evaluation of climate proxy records. We nominate a specific stratigraphic type record for each climatic event, using either natural exposure or drill core stratigraphic sections. Type records were selected on the basis of having very good numerical age control and a clear proxy record. In all cases the main proxy of the type record is subfossil pollen. The type record for the period from ca 30 to ca 18 calendar kiloyears BP (cal. ka BP) is designated in lake-bed sediments from a small morainic kettle lake (Galway tarn) in western South Island. The Galway tarn type record spans a period of full glacial conditions (Last Glacial Coldest Period, LGCP) within the Otira Glaciation, and includes three cold stadials separated by two cool interstadials. The type record for the emergence from glacial conditions following the termination of the Last Glaciation (post-Termination amelioration) is in a core of lake sediments from a maar (Pukaki volcanic crater) in Auckland, northern North Island, and spans from ca 18 to 15.64 ± 0.41 cal. ka BP. The type record for the Lateglacial period is an exposure of interbedded peat and mud at montane Kaipo bog, eastern North Island. In this high-resolution type record, an initial mild period was succeeded at 13.74 ± 0.13 cal. ka BP by a cooler period, which after 12.55 ± 0.14 cal. 
ka BP gave way to a progressive ascent to full interglacial conditions that were achieved by 11.88 ± 0.18 cal. ka BP. Although a type section is not formally designated for the Holocene Interglacial (11.88 ± 0.18 cal. ka BP to the present day), the sedimentary record of Lake Maratoto on the Waikato lowlands, northwestern North Island, is identified as a prospective type section pending the integration and updating of existing stratigraphic and proxy datasets, and age models. The type records are interconnected by one or more dated tephra layers, the ages of which are derived from Bayesian depositional modelling and OxCal-based calibrations using the IntCal09 dataset. Along with the type sections and the Lake Maratoto record, important, well-dated terrestrial reference records are provided for each climate event. Climate proxies from these reference records include pollen flora, stable isotopes from speleothems, beetle and chironomid fauna, and glacier moraines. The regional composite stratotype provides a benchmark against which to compare other records and proxies. Based on the composite stratotype, we provide an updated climate event stratigraphic classification for the New Zealand region. The stratotype and event classification are not intended to act as definitive statements of paleoclimate history for the New Zealand region, but rather provide a firm baseline against which to compare other records including those from the marine realm.

  2. A Retrospective Analysis of Corticosteroid Utilization Before Initiation of Biologic DMARDs Among Patients with Rheumatoid Arthritis in the United States.

    PubMed

    Spivey, Christina A; Griffith, Jenny; Kaplan, Cameron; Postlethwaite, Arnold; Ganguli, Arijit; Wang, Junling

    2018-06-01

    Understanding the effects of corticosteroid utilization prior to initiation of biologic disease-modifying antirheumatic drugs (DMARDs) can inform decision-makers on the appropriate use of these medications. This study examined treatment patterns and associated burden of corticosteroid utilization before initiation of biologic DMARDs among rheumatoid arthritis (RA) patients. A retrospective analysis was conducted of adult RA patients in the US MarketScan Database (2011-2015). The following patterns of corticosteroid utilization were analyzed: whether corticosteroids were used; duration of use (short/long duration defined as < or ≥ 3 months); and dosage (low as < 2.5, medium as 2.5 to < 7.5 and high as ≥ 7.5 mg/day). Effects of corticosteroid use on time to biologic DMARD initiation were examined using Cox proportional hazards models. Likelihood and number of adverse events were examined using logistic and negative binomial regression models. Generalized linear models were used to examine healthcare costs. Independent variables in all models included patient demographics and health characteristics. A total of 25,542 patients were included (40.84% used corticosteroids). Lower hazard of biologic DMARD initiation was associated with corticosteroid use (hazard ratio = 0.89, 95% confidence interval = 0.83-0.96), long duration and lower dose. Corticosteroid users compared to non-users had higher incidence rates of various adverse events including cardiovascular events (P < 0.05). Higher likelihood of adverse events was associated with corticosteroid use and long duration of use, as was increased number of adverse events. Corticosteroid users had a greater annualized mean number of physician visits, hospitalizations, and emergency department (ED) visits than non-users in adjusted analysis. Corticosteroid users compared to non-users had higher mean costs for total healthcare, physician visits, hospitalizations, and ED visits. 
Among patients with RA, corticosteroid utilization is associated with delayed initiation of biologic DMARDS and higher burden of adverse events and healthcare utilization/costs before the initiation of biologic DMARDs. AbbVie Inc.

  3. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. 
The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high-velocity lithospheric slab. In application, JHD has the practical advantage that it does not require the specification of a theoretical velocity model for the slab. Considering earthquakes within a 260 km long by 60 km wide section of the Aleutian main thrust zone, our results suggest that the theoretical velocity structure of the slab is presently not sufficiently well known that accurate locations can be obtained independently of locally recorded data. Using a locally recorded earthquake as a calibration event, JHD gave excellent results over the entire section of the main thrust zone here studied, without showing a strong effect that might be attributed to spatially varying source-station anomalies. We also calibrated the ray-tracing method using locally recorded data and obtained results generally similar to those obtained by JHD. © 1982.

  4. Characterising Record Flooding in the United Kingdom

    NASA Astrophysics Data System (ADS)

    Cox, A.; Bates, P. D.; Smith, J. A.

    2017-12-01

    Though the most notable floods in history have been carefully explained, there remains a lack of literature that explores the nature of record floods as a whole in the United Kingdom. We characterise the seasonality, statistical and spatial distribution, and meteorological causes of peak river flows for 521 gauging stations spread across the British Isles. We use annual maximum data from the National River Flow Archive, catchment descriptors from the Flood Estimation Handbook, and historical records of large floods. We aim to determine in what ways, if any, the record flood at a station differs from more 'typical' floods. For each station, we calculate two indices: the seasonal anomaly and the flood index. Broadly, the seasonal anomaly is the degree to which a station's record flood happens at a different time of year compared to typical floods at that site, whilst the flood index is a station's record flood discharge divided by the discharge of the 1-in-10-year return period event. We find that while annual maximum peaks are dominated by winter frontal rainfall, record floods are disproportionately caused by summer convective rainfall. This analysis also shows that the larger the seasonal anomaly, the higher the flood index. Additionally, stations across the country have record floods that occur in the summer with no notable spatial pattern, yet the most seasonally anomalous record events are concentrated around the south and west of the British Isles. Catchment descriptors tell us little about the flood index at a particular station, but generally areas with lower mean annual precipitation have a higher flood index. The inclusion of case studies from recent and historical examples of notable floods across the UK supplements our analysis and gives insight into how typical these events are, both statistically and meteorologically.
Ultimately, record floods in general happen at relatively unexpected times and with unpredictable magnitudes, which is a worrying reality both for those who live in flood-prone areas and for those who study the upper tail of flood events.
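The flood index is defined explicitly above (record discharge divided by the 1-in-10-year discharge); the seasonal anomaly is described only qualitatively, so the circular-mean treatment of flood timing below is an assumption, as are the function names.

```python
import math

def flood_index(record_q, q10):
    """Record flood discharge divided by the 1-in-10-year discharge."""
    return record_q / q10

def seasonal_anomaly(record_doy, typical_doys):
    """Days (circular distance) between the record flood's day of year
    and the mean timing of the station's annual maximum floods."""
    # circular mean of the typical flood timings, mapped onto the year
    ang = [2 * math.pi * d / 365.25 for d in typical_doys]
    mean_ang = math.atan2(sum(map(math.sin, ang)), sum(map(math.cos, ang)))
    mean_doy = (mean_ang * 365.25 / (2 * math.pi)) % 365.25
    diff = abs(record_doy - mean_doy) % 365.25
    return min(diff, 365.25 - diff)      # shorter way around the year
```

The circular mean matters because flood timings cluster around the turn of the year; a naive arithmetic mean of day-of-year values would place a December/January flood season in mid-summer.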

  5. Automatic recovery of aftershock sequences at the International Data Centre: from concept to pipeline

    NASA Astrophysics Data System (ADS)

    Kitov, I.; Bobrov, D.; Rozhkov, M.

    2016-12-01

    Aftershocks of larger earthquakes represent an important source of information on the distribution and evolution of stresses and deformations in pre-seismic, co-seismic and post-seismic phases. For the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), the largest aftershock sequences are also a challenge for automatic and interactive processing. The highest rate of events recorded by two or more seismic stations of the International Monitoring System from a relatively small aftershock area may reach hundreds per hour (e.g. Sumatra 2004 and Tohoku 2011). Moreover, there are thousands of reflected/refracted phases per hour with azimuth and slowness within the uncertainty limits of the first P-waves. Misassociation of these later phases, both regular and site specific, as the first P-wave results in the creation of numerous wrong event hypotheses in the automatic IDC pipeline. In turn, interactive review of such wrong hypotheses is a direct waste of analysts' resources. Waveform cross correlation (WCC) is a powerful tool to separate coda phases from actual P-wave arrivals and to fully utilize the repeat character of waveforms generated by events close in space. Array seismic stations of the IMS enhance the performance of the WCC in two important aspects - they reduce the detection threshold and effectively suppress arrivals from all sources except master events. An IDC-specific aftershock tool has been developed and merged with the standard IDC pipeline. The tool includes several procedures: creation of master events consisting of waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on CC-traces into event hypotheses; building events matching IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is the starting point of interactive analysis.
Because global monitoring of underground nuclear tests draws on historical and synthetic data, each aftershock sequence can also be tested against an evasion scenario in which a large earthquake is used to mask a CTBT violation.
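The core WCC operation, sliding a master-event template along a continuous trace and flagging lags with high normalized correlation, can be sketched as follows; the threshold value and function names are illustrative assumptions, not the IDC pipeline's actual code.

```python
import math

def normalized_cc(template, trace):
    """Sliding normalized cross-correlation of a master-event template
    against a continuous trace; one CC value per lag."""
    n = len(template)
    t_mean = sum(template) / n
    t0 = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t0))
    out = []
    for lag in range(len(trace) - n + 1):
        w = trace[lag:lag + n]
        w_mean = sum(w) / n
        w0 = [x - w_mean for x in w]
        w_norm = math.sqrt(sum(x * x for x in w0))
        denom = t_norm * w_norm
        out.append(sum(a * b for a, b in zip(t0, w0)) / denom if denom else 0.0)
    return out

def detect(template, trace, threshold=0.7):
    """Lags at which the trace matches the master template above threshold."""
    return [i for i, c in enumerate(normalized_cc(template, trace)) if c >= threshold]
```

Because the correlation is normalized, a coda phase or site-specific arrival with a different waveform shape scores low even if its amplitude is large, which is how WCC separates repeats of the master event from unrelated detections.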

  6. Adverse events among Ontario home care clients associated with emergency room visit or hospitalization: a retrospective cohort study

    PubMed Central

    2013-01-01

    Background Home care (HC) is a critical component of the ongoing restructuring of healthcare in Canada. It impacts three dimensions of healthcare delivery: primary healthcare, chronic disease management, and aging at home strategies. The purpose of our study is to investigate a significant safety dimension of HC, the occurrence of adverse events and their related outcomes. The study reports on the incidence of HC adverse events, the magnitude of the events, the types of events that occur, and the consequences experienced by HC clients in the province of Ontario. Methods A retrospective cohort design was used, utilizing comprehensive secondary databases available for Ontario HC clients from the years 2008 and 2009. The data were derived from the Canadian Home Care Reporting System, the Hospital Discharge Abstract Database, the National Ambulatory Care Reporting System, the Ontario Mental Health Reporting System, and the Continuing Care Reporting System. Descriptive analysis was used to identify the type and frequency of the adverse events recorded and the consequences of the events. Logistic regression analysis was used to examine the association between the events and their consequences. Results The study found that the incidence rate of adverse events for the HC clients included in the cohort was 13%. The most frequent adverse events identified in the databases were injurious falls, injuries from other than a fall, and medication-related incidents. With respect to outcomes, we determined that an injurious fall was associated with a significant increase in the odds of a client requiring long-term-care facility admission and of client death. We further determined that three types of events, delirium, sepsis, and medication-related incidents, were associated directly with an increase in the odds of client death. Conclusions Our study concludes that 13% of clients in homecare experience an adverse event annually.
We also determined that an injurious fall was the most frequent of the adverse events and was associated with increased admission to long-term care or death. We recommend the use of tools that are presently available in Canada, such as the Resident Assessment Instrument and its Clinical Assessment Protocols, for assessing and mitigating the risk of an adverse event occurring. PMID:23800280

  7. Optical recording of action potentials and other discrete physiological events: a perspective from signal detection theory.

    PubMed

    Sjulson, Lucas; Miesenböck, Gero

    2007-02-01

    Optical imaging of physiological events in real time can yield insights into biological function that would be difficult to obtain by other experimental means. However, the detection of all-or-none events, such as action potentials or vesicle fusion events, in noisy single-trial data often requires a careful balance of tradeoffs. The analysis of such experiments, as well as the design of optical reporters and instrumentation for them, is aided by an understanding of the principles of signal detection. This review illustrates these principles, using as an example action potential recording with optical voltage reporters.
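The tradeoffs the review describes are conventionally quantified with the detectability index d′ and the false-positive/miss rates implied by a detection threshold. The equal-variance Gaussian model below is the textbook idealization of signal detection theory, not a claim about any particular optical reporter; all names are illustrative.

```python
import math

def dprime(mu_signal, mu_noise, sigma):
    """Detectability index for the equal-variance Gaussian model."""
    return (mu_signal - mu_noise) / sigma

def error_rates(threshold, mu_signal, mu_noise, sigma):
    """False-positive and miss probabilities implied by a detection
    threshold under Gaussian signal and noise distributions."""
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    false_pos = 1.0 - Phi((threshold - mu_noise) / sigma)       # noise above threshold
    miss = Phi((threshold - mu_signal) / sigma)                 # signal below threshold
    return false_pos, miss
```

Moving the threshold trades false positives against misses along an ROC curve, while only a larger d′ (a brighter, faster reporter or lower noise) improves both at once, which is the design point the review makes about single-trial event detection.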

  8. Dating the Vostok ice core record by importing the Devils Hole chronology

    USGS Publications Warehouse

    Landwehr, J.M.; Winograd, I.J.

    2001-01-01

    The development of an accurate chronology for the Vostok record continues to be an open research question because these invaluable ice cores cannot be dated directly. Depth-to-age relationships have been developed using many different approaches, but published age estimates are inconsistent, even for major paleoclimatic events. We have developed a chronology for the Vostok deuterium paleotemperature record using a simple and objective algorithm to transfer ages of major paleoclimatic events from the radiometrically dated 500,000-year δ18O-paleotemperature record from Devils Hole, Nevada. The method is based only on a strong inference that major shifts in paleotemperature recorded at both locations occurred synchronously, consistent with an atmospheric teleconnection. The derived depth-to-age relationship conforms with the physics of ice compaction, and internally produces ages for climatic events 5.4 and 11.24 which are consistent with the externally assigned ages that the Vostok team needed to assume in order to derive their most recent chronology, GT4. Indeed, the resulting V-DH chronology is highly correlated with GT4 because of the unexpected correspondence even in the timing of second-order climatic events that were not constrained by the algorithm. Furthermore, the algorithm developed herein is not specific to this problem; rather, the procedure can be used whenever two paleoclimate records are proxies for the same physical phenomenon, and paleoclimatic conditions forcing the two records can be considered to have occurred contemporaneously. The ability of the algorithm to date the East Antarctic Dome Fuji core is also demonstrated.
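At its core, this kind of age transfer reduces to interpolating ages between depths of matched events. The sketch below shows only that interpolation core under stated assumptions (the published algorithm also conforms to ice-compaction physics, which plain linear interpolation does not capture); the function and variable names are illustrative.

```python
def transfer_chronology(tie_depths, tie_ages, depths):
    """Piecewise-linear age-depth model: tie points pair core depths of
    matched climatic events with ages imported from the dated record;
    ages at intermediate depths are linearly interpolated.
    `tie_depths` must be sorted in ascending order."""
    ages = []
    for d in depths:
        for i in range(len(tie_depths) - 1):
            d0, d1 = tie_depths[i], tie_depths[i + 1]
            if d0 <= d <= d1:
                a0, a1 = tie_ages[i], tie_ages[i + 1]
                ages.append(a0 + (d - d0) / (d1 - d0) * (a1 - a0))
                break
    return ages
```

The same routine applies whenever two records proxy the same phenomenon and the tie-point events can be assumed contemporaneous, which is the generality the abstract claims for the procedure.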

  9. An annually resolved marine proxy record for the 8.2K cold event from the northern North Sea based on bivalve shells

    NASA Astrophysics Data System (ADS)

    Butler, Paul; Estrella-Martínez, Juan; Scourse, James

    2017-04-01

    The so-called 8.2K cold event is a rapid cooling of about 6° ± 2° recorded in the Greenland ice core record and thought to be a consequence of a freshwater pulse from the Laurentide ice sheet which reduced deepwater formation in the North Atlantic. In the Greenland ice cores the event is characterized by a maximum extent of 159 years and a central event lasting for 70 years. As discussed by Thomas et al (QSR, 2007), the low resolution and dating uncertainty of much palaeoclimate data makes it difficult to determine the rates of change and causal sequence that characterise the event at different locations. We present here a bivalve shell chronology based on four shells of Arctica islandica from the northern North Sea which (within radiocarbon uncertainty) is coeval with the 8.2K event recorded in the Greenland ice cores. The years of death of each shell based on radiocarbon analysis and crossmatching are 8094, 8134, 8147, and 8208 yrs BP (where "present" = AD 1950), with an associated radiocarbon uncertainty of ±80 yrs, and their longevities are 106, 122, 112 and 79 years respectively. The total length of the chronology is 192 years (8286-8094 BP ± 80 yrs). The most noticeable feature of the chronology is a 60-year period of increasing growth which may correspond to a similar period of decreasing ice accumulation in the GRIP (central Greenland) ice core record. We tentatively suggest that this reflects increasing food supply to the benthos as summer stratification is weakened by colder seawater temperatures. Stable isotope analyses (results expected to be available when this abstract is presented) will show changes at annual and seasonal resolution, potentially giving a very detailed insight into the causal factors associated with the 8.2K event and its impact in the northern North Sea.

  10. Investigation into the risk perceptions of investors in the securities of nuclear-dependent electric utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spudeck, R.E.

    1983-01-01

    Two weeks prior to the Three Mile Island accident, March 15, 1979, the Nuclear Regulatory Commission ordered five operating nuclear plants shut down in order to reexamine safety standards in these plants. Reports in the popular and trade press during this time suggested that these events, particularly the accident at Three Mile Island, caused investors in the securities of electric utilities that had nuclear-generation facilities to revise their risk perceptions. This study was designed to examine the impact of both the Nuclear Regulatory Commission order and the accident at Three Mile Island on investor risk perceptions. Selected categories of electric utilities were chosen to examine any differential risk effects resulting from these events. An asset pricing model devoid of many of the restrictive assumptions of more familiar models was used to model investor behavior. The findings suggest that the events described did cause investors to revise upward their perceptions of systematic risk regarding different categories of electric utilities. More specifically, those electric utilities that were operating nuclear plants in 1979 experienced the largest and most sustained increase in systematic risk. However, electric utilities that in 1979 had no operating nuclear plants, but had planned and committed funds for nuclear plants in the future, also experienced increases in systematic risk.

  11. Assessment of Pharmacy Information System Performance in Three Hospitals in Eastern Province, Saudi Arabia

    PubMed Central

    El.Mahalli, Azza; El-Khafif, Sahar H.; Yamani, Wid

    2016-01-01

    The pharmacy information system is one of the central pillars of a hospital information system. This research evaluated a pharmacy information system according to six aspects of the medication process in three hospitals in Eastern Province, Saudi Arabia. System administrators were interviewed to determine availability of functionalities. Then, system users within the hospital were targeted to evaluate their level of usage of these functionalities. The study was cross-sectional. Two structured surveys were designed. The overall response rate of hospital users was 31.7 percent. In all three hospitals studied, the electronic health record is hybrid, implementation has been completed and the system is running, and the systems have computerized provider order entry and clinical decision support. Also, the pharmacy information systems are integrated with the electronic health record and computerized provider order entry, and almost all prescribing and transcription functionalities are available; however, drug dispensing is a mostly manual process. However, the study hospitals do not use barcode-assisted medication administration systems to verify patient identity and electronically check dose administration, and none of them have computerized adverse drug event monitoring that uses the electronic health record. The number of users who used different functionalities most or all of the time was generally low. The highest frequency of utilization was for patient administration records (56.8 percent), and the lowest was for linkage of the pharmacy information system to pharmacy stock (9.1 percent). Encouraging users to use different functionalities was highly recommended. PMID:26903780

  12. Assessment of Pharmacy Information System Performance in Three Hospitals in Eastern Province, Saudi Arabia.

    PubMed

    El Mahalli, Azza; El-Khafif, Sahar H; Yamani, Wid

    2016-01-01

    The pharmacy information system is one of the central pillars of a hospital information system. This research evaluated a pharmacy information system according to six aspects of the medication process in three hospitals in Eastern Province, Saudi Arabia. System administrators were interviewed to determine availability of functionalities. Then, system users within the hospital were targeted to evaluate their level of usage of these functionalities. The study was cross-sectional. Two structured surveys were designed. The overall response rate of hospital users was 31.7 percent. In all three hospitals studied, the electronic health record is hybrid, implementation has been completed and the system is running, and the systems have computerized provider order entry and clinical decision support. Also, the pharmacy information systems are integrated with the electronic health record and computerized provider order entry, and almost all prescribing and transcription functionalities are available; however, drug dispensing is a mostly manual process. However, the study hospitals do not use barcode-assisted medication administration systems to verify patient identity and electronically check dose administration, and none of them have computerized adverse drug event monitoring that uses the electronic health record. The number of users who used different functionalities most or all of the time was generally low. The highest frequency of utilization was for patient administration records (56.8 percent), and the lowest was for linkage of the pharmacy information system to pharmacy stock (9.1 percent). Encouraging users to use different functionalities was highly recommended.

  13. Source mechanism characterization and integrated interpretation of microseismic data monitoring two hydraulic stimulations in pouce coupe field, Alberta

    NASA Astrophysics Data System (ADS)

    Lindholm, Garrison J.

    The study of the Pouce Coupe Field is a joint effort between the Reservoir Characterization Project (RCP) and Talisman Energy Inc. My study focuses on the hydraulic stimulation of two horizontal wells within the Montney Formation located in north-western Alberta. The Montney is an example of a modern-day tight, engineering-driven play in which recent advances in the drilling of horizontal wells and hydraulic fracturing have made shale gas exploitation economical. The wells were completed in December 2008 and were part of a science-driven project in which a multitude of data were collected, including multicomponent seismic, microseismic, and production logs. Since this time, a number of studies have been performed by students at Colorado School of Mines to better understand the effects the completions have had on the reservoir. This thesis utilizes the microseismic data that were recorded during the stimulation of the two horizontal wells in order to understand the origin of the microseismic events themselves. The data are then used to understand and correlate to the well production. To gain insight into the source of the microseismic events, amplitude ratios of recorded seismic modes (P, Sh and Sv) for the microseismic events are studied. By fitting trends of simple end-member source mechanisms (strike-slip, dip-slip, and tensile) to groups of amplitude ratio data, the events are found to be of strike-slip nature. By comparing the focal mechanisms to other independent natural fracture determination techniques (shear-wave splitting analysis, FMI log), it is shown that the source of recorded microseismic events is likely to be a portion of the shear slip along existing weak planes (fractures) within a reservoir. The technique described in this work is used only occasionally at present, though increasingly, and it offers the opportunity to draw further information from microseismic data using results that are already part of a typical processing workflow.
The microseismic events are then used as a tool to analyze the effectiveness of the hydraulic stimulation and why production varies on a well and stage basis. The study shows that production disparities may be related to communication between horizontal wells, a potential weak zone of sub-seismic scale faults/fractures at the toe of one of the completed horizontal wells, and most importantly the quality of the stimulated rock. The results suggest that the quality of the stimulated reservoir rock is a greater driver of production than the total stimulated volume. By integrating the microseismic with other data (seismic and engineering), this work shows that the key to the understanding of these engineering-driven plays is an integrated solution. The methods shown in this thesis are applicable to many similar plays across North America and the world. The complicated nature of these tight reservoirs underscores the need for effective well planning, placement, and stimulation for economical development of shale resource plays.

  14. 77 FR 67348 - Privacy Act of 1974; System of Records-Alternative Dispute Resolution (ADR) Center Case Tracking...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... J2EE application that is platform independent and captures all information relating to Alternative Dispute Resolution case processing. It tracks, manages, and reports on all data, events, and procedures... records to indicate that it will be used: (1) To track, manage, and report on all data, events, and...

  15. Deployed Analyst Handbook

    DTIC Science & Technology

    2016-06-01

    chapter parallels information from the Institute for Operations Research and the Management Sciences (INFORMS) on communicating the ORSA profession...mathematics courses at the United States Military Academy (USMA), Naval Postgraduate School (NPS), Air Force Institute of Technology or a TRADOC School...event location, event category, etc. One should compare these audited records with the existing record and include this information in the

  16. New records with examples of potential host colonization events for hypopi (Acari: Hypoderatidae) from birds

    USGS Publications Warehouse

    Pence, Danny B.; Spalding, M.G.; Bergan, J.F.; Cole, Rebecca A.

    1997-01-01

    New host, geographic records, or both are established for 14 species of hypoderatid deutonymphs from 14 species of birds in North America. Ten of these records are regarded as examples of a potential host colonization event where these hypopi have become established in hosts other than those with which they are normally associated. Herein, potential host colonization events by hypoderatid deutonymphs are regarded as more of an ecologically determined than physiologically specific phenomenon, often specifically related to sharing of nesting sites in the same rookeries by different host taxa. Neottialges ibisicola Young & Pence is placed as a junior synonym of Neottialges plegadicola Fain. The taxonomic status of Hypodectes propus from columbid versus ardeid hosts needs further study.

  17. Continuous seismic monitoring of Nishinoshima volcano, Izu-Ogasawara, by using long-term ocean bottom seismometers

    NASA Astrophysics Data System (ADS)

    Shinohara, Masanao; Ichihara, Mie; Sakai, Shin'ichi; Yamada, Tomoaki; Takeo, Minoru; Sugioka, Hiroko; Nagaoka, Yutaka; Takagi, Akimichi; Morishita, Taisei; Ono, Tomozo; Nishizawa, Azusa

    2017-11-01

    Nishinoshima in Izu-Ogasawara started erupting in November 2013, and the island size increased. Continuous monitoring is important for study of the formation process. Since it is difficult to make continuous observations on a remote uninhabited island, we started seismic observations near Nishinoshima using ocean bottom seismometers (OBSs) from February 2015. Our OBSs have a recording period of 1 year, and recovery and re-deployment of OBSs were repeated to make continuous observations. The OBSs were deployed with distances of less than 13 km from the crater. Events with particular characteristics were frequently recorded during the eruption period and are estimated to correlate with the release of plumes from the crater by comparison with temporal on-site records using a video camera and microphones. We estimated the number of events using the amplitude average of records to monitor volcanic activity. There were approximately 1800 detected events per day from February to July 2015. The number started to decrease from July 2015, and reached less than 100 per day in November 2015. The surface activity of the volcano was estimated to have ceased in November 2015. Characteristic events began re-occurring in the middle of April 2017. The number of events reached approximately 1400 events per day at the end of May 2017. Seafloor seismic observations using OBSs are a powerful tool for continuous monitoring of island volcanic activity.
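
    The abstract states only that event counts were estimated from "the amplitude average of records"; the sketch below is a minimal, hypothetical version of such an amplitude-threshold counter (the window length and threshold are illustrative parameters, not values from the study):

```python
def count_events(trace, window, threshold):
    """Count events by thresholding a sliding mean absolute amplitude.

    trace:     list of samples from one seismometer channel
    window:    number of samples in the averaging window
    threshold: amplitude-average level above which an event is declared

    Consecutive above-threshold windows are merged into a single event.
    """
    n_events = 0
    in_event = False
    for i in range(len(trace) - window + 1):
        avg = sum(abs(s) for s in trace[i:i + window]) / window
        if avg > threshold and not in_event:
            n_events += 1       # rising edge: a new event begins
            in_event = True
        elif avg <= threshold:
            in_event = False    # falling edge: the event has ended
    return n_events

# quiet background with two high-amplitude bursts
trace = [0.1] * 50 + [5.0] * 20 + [0.1] * 50 + [4.0] * 20 + [0.1] * 50
count_events(trace, window=10, threshold=1.0)  # → 2
```

Dividing such per-day counts by elapsed time would give the events-per-day rates quoted in the abstract.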

  18. Building an Ontology for Identity Resolution in Healthcare and Public Health

    PubMed Central

    Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P.; Clyde, Stephen; Thornton, Sidney; Staes, Catherine

    2015-01-01

    Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Objectives: Our objectives were to: 1) identify components of identity and events subsequent to birth that result in the creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology’s ability to model identity-changing events over time. Methods: We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. Results: We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. Conclusion: The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage. PMID:26392849

  19. 43 CFR 3277.11 - What records must I keep available for inspection?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... operation of your utilization facility, royalty and production meters, and safety training available for BLM inspection for a period of 6 years following the time the records and information are created. (b) This... needs them to determine: (1) Resource production to a utilization facility; or (2) The allocation of...

  20. Optimal filter parameters for low SNR seismograms as a function of station and event location

    NASA Astrophysics Data System (ADS)

    Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.

    1999-06-01

    Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few useable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method to provide optimal filters for low SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δf constant (constant Q). The SNR is calculated on the pre-event noise and signal window. The band-pass signals with high SNR are used to indicate the cutoff filter limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low SNR events. The method provides an optimum filter which can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
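
    The band-selection step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the constant-Q decomposition is assumed to have been done already (each band arrives as a list of samples), the SNR is taken as the ratio of RMS amplitudes in the signal and pre-event noise windows, and the SNR floor of 2.0 is a hypothetical value:

```python
def select_filter_band(subsignals, noise_win, signal_win, snr_min=2.0):
    """Pick band-pass cutoff limits from per-band SNR.

    subsignals: list of (f_low, f_high, samples) for each constant-Q band
    noise_win:  (start, stop) sample indices of the pre-event noise window
    signal_win: (start, stop) sample indices of the signal window
    snr_min:    minimum RMS signal-to-noise ratio to keep a band

    Returns (f_low, f_high) spanning the kept bands, or None.
    """
    def rms(x):
        return (sum(s * s for s in x) / len(x)) ** 0.5

    kept = []
    for f_lo, f_hi, samples in subsignals:
        noise = samples[noise_win[0]:noise_win[1]]
        signal = samples[signal_win[0]:signal_win[1]]
        if rms(signal) / max(rms(noise), 1e-12) >= snr_min:
            kept.append((f_lo, f_hi))
    if not kept:
        return None
    # high-SNR bands indicate the cutoff limits of the optimized filter
    return min(f for f, _ in kept), max(f for _, f in kept)

# three synthetic bands: strong, weak, and moderate signal after sample 10
bands = [
    (1.0, 2.0, [0.1] * 10 + [1.0] * 10),
    (2.0, 4.0, [0.1] * 10 + [0.15] * 10),
    (4.0, 8.0, [0.1] * 10 + [0.5] * 10),
]
select_filter_band(bands, noise_win=(0, 10), signal_win=(10, 20))  # → (1.0, 8.0)
```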

  1. Distributed Data Collection for the ATLAS EventIndex

    NASA Astrophysics Data System (ADS)

    Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.

    2015-12-01

    The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then is unpacked, sorted and reformatted in order to be stored and catalogued into a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture to convey this information from the running jobs through the messaging system to the Hadoop server.
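
    The producer/consumer flow described above can be sketched in miniature. The message format and field names here are illustrative assumptions, not the actual EventIndex schema: each job packs (file GUID, internal event pointer) records into a payload, and the consumer unpacks and sorts the records before cataloguing:

```python
import json

def pack_event_snippet(file_guid, event_pointers):
    """Producer side: pack one output file's event references into a payload.

    Each record pairs the file GUID with the internal pointer to an event
    in that file, mirroring the per-file snippet a job would hand to the
    messaging broker.
    """
    records = [{"guid": file_guid, "pointer": p} for p in event_pointers]
    return json.dumps({"n_events": len(records), "records": records})

def unpack_and_sort(messages):
    """Consumer side: unpack snippets and sort records for cataloguing."""
    all_records = []
    for msg in messages:
        all_records.extend(json.loads(msg)["records"])
    return sorted(all_records, key=lambda r: (r["guid"], r["pointer"]))

msg = pack_event_snippet("A1B2-C3D4", [42, 7, 19])
catalog = unpack_and_sort([msg])  # records ordered by (guid, pointer)
```

In the real system the transport is ActiveMQ and the sorted output is stored in Hadoop; this sketch only illustrates the pack/unpack/sort shape of the pipeline.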

  2. Advanced Imaging Utilization and Cost Savings Among Medicare Shared Savings Program Accountable Care Organizations: An Initial Exploratory Analysis.

    PubMed

    Rosenkrantz, Andrew B; Duszak, Richard

    2018-03-01

    The purpose of this study was to explore associations between CT and MRI utilization and cost savings achieved by Medicare Shared Savings Program (MSSP)-participating accountable care organizations (ACOs). Summary data were obtained for all MSSP-participating ACOs (n = 214 in 2013; n = 333 in 2014). Multivariable regressions were performed to assess associations of CT and MRI utilization with ACOs' total savings and reaching minimum savings rates to share in Medicare savings. In 2014, 54.4% of ACOs achieved savings, meeting minimum rates to share in savings in 27.6%. Independent positive predictors of total savings included beneficiary risk scores (β = +20,265,720, P = .003) and MRI events (β = +19,964, P = .018) but not CT events (β = +2,084, P = .635). Independent positive predictors of meeting minimum savings rates included beneficiary risk scores (odds ratio = 2108, P = .001) and MRI events (odds ratio = 1.008, P = .002), but not CT events (odds ratio = 1.002, P = .289). Measures not independently associated with savings were total beneficiaries; beneficiaries' gender, age, race or ethnicity; and Medicare enrollment type (P > .05). For ACOs with 2013 and 2014 data, neither increases nor decreases in CT and MRI events between years were associated with 2014 total savings or meeting savings thresholds (P ≥ .466). Higher MRI utilization rates were independently associated with small but significant MSSP ACO savings. The value of MRI might relate to the favorable impact of appropriate advanced imaging utilization on downstream outcomes and other resource utilization. Because MSSP ACOs represent a highly select group of sophisticated organizations subject to rigorous quality and care coordination standards, further research will be necessary to determine if these associations are generalizable to other health care settings. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  3. Application of effective drought index for quantification of meteorological drought events: a case study in Australia

    NASA Astrophysics Data System (ADS)

    Deo, Ravinesh C.; Byun, Hi-Ryong; Adamowski, Jan F.; Begum, Khaleda

    2017-04-01

    Drought indices (DIs) that quantify drought events by their onset, termination, and subsequent properties such as the severity, duration, and peak intensity are practical stratagems for monitoring and evaluating the impacts of drought. In this study, the effective drought index (EDI) calculated over daily timescales was utilized to quantify short-term (dry spells) and ongoing drought events using drought monitoring data in Australia. EDI was an intensive DI that considered daily water accumulation with a weighting function applied to daily rainfall data with the passage of time. A statistical analysis of the distribution of water deficit period relative to the base period was performed where a run-sum method was adopted to identify drought onset for any day i with EDI_i < 0 (rainfall below normal). Drought properties were enumerated in terms of (1) severity (AEDI ≡ accumulated sum of EDI_i < 0), (2) duration (DS ≡ cumulative number of days with EDI_i < 0), (3) peak intensity (EDImin ≡ minimum EDI of a drought event), (4) annual drought severity (YAEDI ≡ yearly accumulated negative EDI), and (5) accumulated severity of ongoing drought using event-accumulated EDI (EAEDI). The analysis of EDI signal enabled the detection and quantification of a number of drought events in Australia: Federation Drought (1897-1903), 1911-1916 Drought, 1925-1929 Drought, World War II Drought (1937-1945), and Millennium Drought (2002-2010). In comparison with the other droughts, Millennium Drought was exemplified as an unprecedented dry period especially in Victoria (EAEDI ≈ -4243, DS = 1946 days, EDImin = -4.05, and YAEDI = -4903). For the weather station tested in Northern Territory, the worst drought was recorded during the 1925-1929 period. 
The results justified the suitability of effective drought index as a useful scientific tool for monitoring of drought progression, onset and termination, and ranking of drought based on severity, duration, and peak intensity, which allows an assessment of accumulated stress caused by short- and long-term (protracted) dry events.
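
    The run-sum event definitions above (severity AEDI, duration DS, peak EDImin) can be sketched directly from a daily EDI series. This is a minimal illustration of those definitions, not the authors' implementation; the function and variable names are ours:

```python
def drought_events(edi):
    """Identify drought events in a daily EDI series (run-sum method).

    A drought starts on any day i with edi[i] < 0 and ends when EDI
    returns to >= 0. For each event, report:
      severity (AEDI):   accumulated sum of negative EDI
      duration (DS):     number of consecutive days with EDI < 0
      peak     (EDImin): minimum EDI within the event
    """
    events, start = [], None
    for i, v in enumerate(edi + [0.0]):  # sentinel closes a trailing event
        if v < 0 and start is None:
            start = i
        elif v >= 0 and start is not None:
            run = edi[start:i]
            events.append({
                "severity": sum(run),
                "duration": len(run),
                "peak": min(run),
            })
            start = None
    return events

series = [0.5, -0.2, -1.0, -0.3, 0.1, 0.4, -0.6, 0.2]
drought_events(series)
# two events: severity -1.5 over 3 days (peak -1.0), then -0.6 over 1 day
```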

  4. A new approach to generating research-quality phenology data: The USA National Phenology Monitoring System

    NASA Astrophysics Data System (ADS)

    Denny, Ellen; Miller-Rushing, Abraham; Haggerty, Brian; Wilson, Bruce; Weltzin, Jake

    2010-05-01

    The USA National Phenology Network (www.usanpn.org) has recently initiated a national effort to encourage people at different levels of expertise—from backyard naturalists to professional scientists—to observe phenological events and contribute to a national database that will be used to greatly improve our understanding of spatio-temporal variation in phenology and associated phenological responses to climate change. Traditional phenological observation protocols identify specific single dates at which individual phenological events are observed, but the scientific usefulness of long-term phenological observations can be improved with a more carefully structured protocol. At the USA-NPN we have developed a new approach that directs observers to record each day that they observe an individual plant, and to assess and report the state of specific life stages (or phenophases) as occurring or not occurring on that plant for each observation date. Evaluation is phrased in terms of simple, easy-to-understand questions (e.g. "Do you see open flowers?"), which makes it very appropriate for a broad audience. From this method, a rich dataset of phenological metrics can be extracted, including the duration of a phenophase (e.g. open flowers), the beginning and end points of a phenophase (e.g. traditional phenological events such as first flower and last flower), multiple distinct occurrences of phenophases within a single growing season (e.g. multiple flowering events, common in drought-prone regions), as well as quantification of sampling frequency and observational uncertainties. The system also includes a mechanism for translation of phenophase start and end points into standard traditional phenological events to facilitate comparison of contemporary data collected with this new "phenophase status" monitoring approach to historical datasets collected with the "phenological event" monitoring approach. 
These features greatly enhance the utility of the resulting data for statistical analyses addressing questions such as how phenological events vary in time and space, and in response to global change.
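
    The translation from phenophase-status records to traditional event metrics can be sketched as follows. This is an illustrative reduction under assumed data shapes (day-of-year plus a yes/no status per visit), not the USA-NPN system itself:

```python
def phenophase_metrics(observations):
    """Derive onset, end, and duration from daily phenophase status records.

    observations: list of (day_of_year, status) tuples, where status is
    True if the phenophase (e.g. "open flowers") was observed that visit.
    Returns a dict of metrics, or None if the phenophase was never seen.
    """
    obs = sorted(observations)
    yes_days = [d for d, status in obs if status]
    if not yes_days:
        return None
    return {
        "onset": yes_days[0],                         # e.g. "first flower"
        "end": yes_days[-1],                          # e.g. "last flower"
        "duration": yes_days[-1] - yes_days[0] + 1,   # inclusive span, days
        "n_visits": len(obs),                         # sampling frequency
    }

# weekly visits: flowers seen on days 107-121 only
visits = [(100, False), (107, True), (114, True), (121, True), (128, False)]
phenophase_metrics(visits)
# → onset day 107, end day 121, duration 15 days
```

The surrounding "no" observations (days 100 and 128) are what bound the observational uncertainty on the true onset and end dates, which is the advantage of status monitoring over single-date event reporting.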

  5. Morbidity Assessment in Surgery: Refinement Proposal Based on a Concept of Perioperative Adverse Events

    PubMed Central

    Kazaryan, Airazat M.; Røsok, Bård I.; Edwin, Bjørn

    2013-01-01

    Background. Morbidity is a cornerstone in assessing surgical treatment; nevertheless surgeons have not reached extensive consensus on this problem. Methods and Findings. Clavien, Dindo, and Strasberg with coauthors (1992, 2004, 2009, and 2010) made significant efforts toward the standardization of surgical morbidity (Clavien-Dindo-Strasberg classification, last revision, the Accordion classification). However, this classification includes only postoperative complications and has two principal shortcomings: disregard of intraoperative events and confusing terminology. Postoperative events have a major impact on patient well-being. However, intraoperative events should also be recorded and reported even if they do not evidently affect the patient's postoperative well-being. The term surgical complication applied in the Clavien-Dindo-Strasberg classification may be regarded as an incident resulting in a complication caused by technical failure of surgery, in contrast to the so-called medical complications. Therefore, the term surgical complication contributes to misinterpretation of perioperative morbidity. The term perioperative adverse events, comprising both intraoperative unfavourable incidents and postoperative complications, could be regarded as a better alternative. In 2005, Satava suggested a simple grading to evaluate intraoperative surgical errors. Based on that approach, we have elaborated a 3-grade classification of intraoperative incidents so that it can be used to grade intraoperative events of any type of surgery. Refinements have been made to the Accordion classification of postoperative complications. Interpretation. 
The proposed systematization of perioperative adverse events utilizing the combined application of two appraisal tools, that is, the elaborated classification of intraoperative incidents on the basis of the Satava approach to surgical error evaluation together with the modified Accordion classification of postoperative complication, appears to be an effective tool for comprehensive assessment of surgical outcomes. This concept was validated in regard to various surgical procedures. Broad implementation of this approach will promote the development of surgical science and practice. PMID:23762627

  6. Big Data Tools as Applied to ATLAS Event Data

    NASA Astrophysics Data System (ADS)

    Vukotic, I.; Gardner, R. W.; Bryant, L. A.

    2017-10-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and the associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge and the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still-to-be-developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs, having containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have designed an appropriate document mapping and imported a full set of standard model W/Z datasets. 
We compare the disk space efficiency of this approach to that of standard ROOT files, the performance in simple cut flow type of data analysis, and will present preliminary results on its scaling characteristics with different numbers of clients, query complexity, and size of the data retrieved.
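
    The "simple cut flow type of data analysis" used as the performance benchmark can be illustrated generically: events are records of per-event variables, and an ordered list of selection cuts is applied while tallying survivors. This sketch is not the ATLAS implementation, and the variable names and cut values are invented for illustration:

```python
def cut_flow(events, cuts):
    """Apply sequential selection cuts to event records and tally survivors.

    events: list of dicts of per-event variables
    cuts:   ordered list of (name, predicate) pairs
    Returns a list of (cut name, events remaining after that cut).
    """
    remaining = list(events)
    flow = []
    for name, keep in cuts:
        remaining = [e for e in remaining if keep(e)]
        flow.append((name, len(remaining)))
    return flow

# three toy events with hypothetical variables
events = [
    {"n_leptons": 2, "met": 45.0},
    {"n_leptons": 1, "met": 80.0},
    {"n_leptons": 2, "met": 10.0},
]
cuts = [
    ("two leptons", lambda e: e["n_leptons"] == 2),
    ("MET > 30",    lambda e: e["met"] > 30.0),
]
cut_flow(events, cuts)  # → [("two leptons", 2), ("MET > 30", 1)]
```

In an Elasticsearch-backed store, each cut would become a query filter over the indexed event documents rather than an in-memory list comprehension.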

  7. Second Quarter Hanford Seismic Report for Fiscal Year 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.

    2010-06-30

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 90 local earthquakes during the second quarter of FY 2010. Eighty-one of these earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this quarter were a continuation of the swarm events observed during the 2009 and 2010 fiscal years and reported in previous quarterly and annual reports (Rohay et al., 2009a, 2009b, 2009c, and 2010). Most of the events were considered minor (coda-length magnitude [Mc] less than 1.0) with only 1 event in the 2.0-3.0 range; the maximum magnitude event (3.0 Mc) occurred on February 4, 2010, at a depth of 2.4 km. The average depth of the Wooded Island events during the quarter was 1.6 km with a maximum depth estimated at 3.5 km. This placed the Wooded Island events within the Columbia River Basalt Group (CRBG). The low magnitude of the Wooded Island events has made them undetectable to all but local area residents. 
The Hanford Strong Motion Accelerometer (SMA) network was triggered several times by these events and the SMA recordings are discussed in section 6.0. During the last year some Hanford employees working within a few miles of the swarm area and individuals living directly across the Columbia River from the swarm center have reported feeling many of the larger magnitude events. Similar earthquake swarms have been recorded near this same location in 1970, 1975 and 1988 but not with SMA readings or satellite imagery. Prior to the 1970s, earthquake swarms may have occurred at this location or elsewhere in the Columbia Basin, but equipment was not in place to record those events. The Wooded Island swarm, due to its location and the limited magnitude of the events, does not appear to pose any significant risk to Hanford waste storage facilities. Since past swarms did not intensify in magnitude, seismologists do not expect that these events will persist or increase in intensity. However, Pacific Northwest National Laboratory (PNNL) will continue to monitor the activity. Outside of the Wooded Island swarm, nine earthquakes were recorded, seven minor events plus two events with magnitude less than 2.0 Mc. Two earthquakes were located at shallow depths (less than 4 km), three earthquakes at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and four earthquakes were located at depths greater than 9 km, within the basement. Geographically, six earthquakes were located in known swarm areas and three earthquakes were classified as random events.

  8. Properties of Repetitive Long-Period Seismicity at Villarrica Volcano, Chile

    NASA Astrophysics Data System (ADS)

    Richardson, J.; Waite, G. P.; Palma, J.; Johnson, J. B.

    2011-12-01

    Villarrica Volcano, Chile hosts a persistent lava lake and is characterized by degassing and long-period seismicity. In order to better understand the relationship between outgassing and seismicity, we recorded broadband seismic and acoustic data along with high-rate SO2 emission data. We used both a densely-spaced linear array deployed on the northern flank of Villarrica, during the austral summer of 2011, and a wider aperture array of stations distributed around the volcano that was active in the austral summer of 2010. Both deployments consisted of three-component broadband stations and were augmented with broadband infrasound sensors. Of particular interest are repetitive, ~1 Hz seismic and coincident infrasound signals that occurred approximately every 2 minutes. Because these events are typically very low amplitude, we used a matched filter approach to identify them. We windowed several high-amplitude records of these events from broadband seismic stations near the vent. The record section of each event served as a template to compare with the entire dataset by cross-correlation. This approach identified ~20,000 nearly identical events during the ~7 day deployment of the linear array, which were otherwise difficult to identify in the raw records. Assuming that all of the events that we identified have identical source mechanisms and depths, we stack the large suite of events to produce low-noise records and particle motions at receivers farther than 5 km from the vent. We find that the records from stations near the edifice are dominated by tangential particle motion, suggesting the influence of near-field components. Correlation of these data with broadband acoustic data collected at the summit suggests that these repeatable seismic processes are linked to acoustic emissions, probably due to gas bubbles bursting at the magma free surface, as no eruptive products besides gas were being emitted by the volcano during the instrument deployment. 
The acoustic signals affiliated with the repetitive seismic signals do not appear directly related to the continuous, well-correlated acoustic tremor observed both at the vent and roughly 6 km away by small-aperture acoustic arrays (also reported by other groups in 2009 and 2010). We also correlate the acoustic and repetitive seismic signals with high-time-resolution (~1 Hz sampling rate) sulfur dioxide emissions measured with an ultraviolet camera. Because a subset of stations operated during both 2010 and 2011, we could tie events from both deployments to generate a single stacked event at all 17 stations. We will present results of finite-difference modeling of this event stack using a simple homogeneous velocity structure.
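The matched-filter step described above, sliding a template event along the continuous record and keeping windows that correlate strongly, can be sketched as a normalized cross-correlation scan. This is a minimal illustration (the function name and the 0.8 threshold are our choices, not the authors'), not the deployed detector:

```python
import numpy as np

def matched_filter_detect(data, template, threshold=0.8):
    """Scan continuous data with a waveform template and flag windows
    whose normalized cross-correlation exceeds the threshold.
    (Names and the 0.8 threshold are illustrative choices.)"""
    n = len(template)
    # Pre-normalize the template so each dot product below yields a
    # correlation coefficient in [-1, 1].
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        sd = w.std()
        cc[i] = 0.0 if sd == 0 else np.dot(t, (w - w.mean()) / sd)
    return np.flatnonzero(cc >= threshold), cc
```

A near-identical buried event correlates close to 1 even when its amplitude is at the noise level, which is why the approach recovers events invisible in the raw records.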

  9. Meteorite Falls Observed in U.S. Weather Radar Data in 2015 and 2016 (To Date)

    NASA Technical Reports Server (NTRS)

    Fries, Marc; Fries, Jeffrey; Hankey, Mike; Matson, Robert

    2016-01-01

    To date, over twenty meteorite falls have been located in the weather radar imagery of the National Oceanic and Atmospheric Administration (NOAA)'s NEXRAD radar network. We present here the most prominent events recorded since the last Meteoritical Society meeting, covering most of 2015 and early 2016. Meteorite Falls: The following events produced evidence of falling meteorites in radar imagery and resulted in meteorites recovered at the fall site. Creston, CA (24 Oct 2015 0531 UTC): This event generated 218 eyewitness reports submitted to the American Meteor Society (AMS) and is recorded as event #2635 for 2015 on the AMS website. Witnesses reported a bright fireball with fragmentation terminating near the city of Creston, CA, north of Los Angeles. Sonic booms and electrophonic noise were reported in the vicinity of the event. Weather radar imagery records signatures consistent with falling meteorites in data from the KMUX, KVTX, KHNX, and KVBX radars. The Meteoritical Society records the Creston fall as an L6 meteorite with a total recovered mass of 688 g. Osceola, FL (24 Jan 2016 1527 UTC): This daytime fireball generated 134 eyewitness reports, recorded as AMS event #266 for 2016, with one credible sonic boom report. The fireball traveled roughly NE to SW with a terminus north of Lake City, FL in sparsely populated, forested countryside. Radar imagery shows distinct and prominent evidence of a significant meteorite fall, with signatures seen in data from the KJAX and KVAX radars. Searchers at the fall site found that the difficult terrain restricted recoveries to roadsides, yet several meteorites were recovered. The evidence indicates that this was a relatively large meteorite fall in which most of the meteorites are unrecoverable due to terrain. Osceola is an L6 meteorite with 991 g total mass recovered to date. Mount Blanco, TX (18 Feb 2016 0343 UTC): This event produced only 39 eyewitness reports and is recorded as AMS event #635 for 2016.
No reports of sonic booms or electrophonic noise appear in the AMS eyewitness reports, but videos of the event show a relatively long-lasting fireball with fragmentation. Evidence of falling meteorites is seen in radar imagery from the KAMA and KLBB radars, defining a roughly WNW to ESE trend aligned with the dominant wind direction. This event featured favorable search ground, composed mostly of farmland and ranchland, and was extensively searched. Rather surprisingly, only a single L5 chondrite of 36.2 g has been recovered to date.

  10. Can I go out for a smoke? A nursing challenge in the epilepsy monitoring unit.

    PubMed

    Hamilton, M; McLachlan, R S; Burneo, J G

    2009-05-01

    Cigarette smoking is common in patients with intractable epilepsy. As a preliminary assessment of epilepsy and smoking, we evaluated the impact of smoking breaks on the investigation of epilepsy patients admitted to our epilepsy monitoring unit. Absences from the epilepsy unit at the London Health Sciences Center were monitored for 6 months by nursing personnel. Events that occurred during these absences were also registered; this is possible using portable EEG recorders (XLTEK) that patients carry with them at all times. A disadvantage is that video recording is not available if the patient has a seizure outside the unit. Information was entered consecutively in a datasheet. Diagnosis, duration of hospital stay, frequency of breaks, and time outside the unit were recorded. Descriptive and statistical analyses were performed. Two thousand two hundred and ninety trips were recorded. Mean duration of stay was 10 days for smokers and 8.5 days for non-smokers. Non-smokers had a total of 439 seizures, of which 6 (1.4%) were not recorded, while smokers had 213, of which 11 (5.2%) were not recorded. Five events had no electroencephalographic correlate, raising suspicion of non-epileptic events (pseudoseizures). Despite the low number of missed events, precious information may be lost during smoking trips by patients admitted to the epilepsy unit. Ways to avoid such trips should be implemented in epilepsy monitoring units that allow smoking breaks for patients.

  11. 12 CFR 219.5 - Conditions for payment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... financial records prior to the time the financial institution is notified of such event. (c) Itemized bill... FOR PROVIDING FINANCIAL RECORDS; RECORDKEEPING REQUIREMENTS FOR CERTAIN FINANCIAL RECORDS (REGULATION S) Reimbursement to Financial Institutions for Providing Financial Records § 219.5 Conditions for...

  12. Elaborative Talk during and after an Event: Conversational Style Influences Children's Memory Reports

    ERIC Educational Resources Information Center

    Hedrick, Amy M.; Haden, Catherine A.; Ornstein, Peter A.

    2009-01-01

    An experimental design was utilized to examine the effects of elaborative talk during and/or after an event on children's event memory reports. Sixty preschoolers were assigned randomly to one of four conditions that varied according to a researcher's use of high- or low-elaborative during- and/or post-event talk about a camping event. In a…

  13. Multi-Station Broad Regional Event Detection Using Waveform Correlation

    NASA Astrophysics Data System (ADS)

    Slinkard, M.; Stephen, H.; Young, C. J.; Eckert, R.; Schaff, D. P.; Richards, P. G.

    2013-12-01

    Previous waveform correlation studies have established the occurrence of repeating seismic events in various regions, and the utility of waveform correlation event detection on broad regional or even global scales for finding events not currently included in traditionally prepared bulletins. The computational burden, however, is high, limiting previous experiments to relatively modest template libraries and/or processing time periods. We have developed a distributed computing waveform correlation event detection utility that allows us to process years of continuous waveform data with template libraries numbering in the thousands. We have used this system to process several years of waveform data from IRIS stations in East Asia, using libraries of template events taken from global and regional bulletins. Detections at a given station are confirmed by 1) comparison with independent bulletins of seismicity, and 2) consistent detections at other stations. We find that many of the detected events are not in traditional catalogs, hence the multi-station comparison is essential. In addition to detecting similar events, we also estimate magnitudes very precisely by comparison with the template events (when magnitudes are available). We have investigated magnitude variation within detected families of similar events, false alarm rates, and the temporal and spatial reach of templates.
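The precise magnitude estimates mentioned above are commonly derived from the amplitude ratio between a detection and its matching template. A minimal sketch of that standard relative-magnitude relation (assuming co-located, similar events and the usual tenfold-amplitude-per-magnitude-unit scaling; the function name is ours, and this may differ in detail from the authors' procedure):

```python
import math

def relative_magnitude(template_mag, template_amp, detection_amp):
    """Magnitude of a detected event relative to its matching template,
    assuming one magnitude unit corresponds to a tenfold change in
    amplitude for co-located events with similar paths (an assumption,
    not necessarily the authors' exact calibration)."""
    return template_mag + math.log10(detection_amp / template_amp)
```

For example, a detection with one tenth the template's peak amplitude would be estimated one magnitude unit smaller than the template event.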

  14. Precise dating of Dansgaard-Oeschger climate oscillations in western Europe from stalagmite data.

    PubMed

    Genty, D; Blamart, D; Ouahdi, R; Gilmour, M; Baker, A; Jouzel, J; Van-Exter, Sandra

    2003-02-20

    The signature of Dansgaard-Oeschger events--millennial-scale abrupt climate oscillations during the last glacial period--is well established in ice cores and marine records. But the effects of such events in continental settings are not as clear, and their absolute chronology is uncertain beyond the limit of (14)C dating and annual layer counting for marine records and ice cores, respectively. Here we present carbon and oxygen isotope records from a stalagmite collected in southwest France which have been precisely dated using 234U/230Th ratios. We find rapid climate oscillations coincident with the established Dansgaard-Oeschger events between 83,000 and 32,000 years ago in both isotope records. The oxygen isotope signature is similar to a record from Soreq cave, Israel, and deep-sea records, indicating the large spatial scale of the climate oscillations. The signal in the carbon isotopes gives evidence of drastic and rapid vegetation changes in western Europe, an important site in human cultural evolution. We also find evidence for a long phase of extremely cold climate in southwest France between 61.2 +/- 0.6 and 67.4 +/- 0.9 kyr ago.

  15. Tephrochronology and the extended intimate (integration of ice-core, marine and terrestrial records) event stratigraphy 8-128 ka b2k

    NASA Astrophysics Data System (ADS)

    Blockley, Simon P. E.; Bourne, Anna J.; Brauer, Achim; Davies, Siwan M.; Hardiman, Mark; Harding, Poppy R.; Lane, Christine S.; MacLeod, Alison; Matthews, Ian P.; Pyne-O'Donnell, Sean D. F.; Rasmussen, Sune O.; Wulf, Sabine; Zanchetta, Giovanni

    2014-12-01

    The comparison of palaeoclimate records on their own independent timescales is central to the work of the INTIMATE (INTegrating Ice core, MArine and TErrestrial records) network. For the North Atlantic region, an event stratigraphy has been established from the high-precision Greenland ice-core records and the integrated GICC05 chronology. This stratotype provides a palaeoclimate signal to which the timing and nature of palaeoenvironmental change recorded in marine and terrestrial archives can be compared. To facilitate this wider comparison, without assuming synchroneity of climatic change/proxy response, INTIMATE has also focussed on the development of tools to achieve this, in particular the use of time-parallel marker horizons such as tephra layers (volcanic ash). Coupled with the recent temporal extension of the Greenland stratotype, and as part of this special issue, we present an updated INTIMATE event stratigraphy highlighting key tephra horizons used for correlation across Europe and the North Atlantic. We discuss the advantages of such an approach, and the key challenges for the further integration of terrestrial palaeoenvironmental records with those from ice cores and the marine realm.

  16. Identification of Geomorphic Conditions Favoring Preservation of Multiple Individual Displacements Across Transform Faults

    NASA Astrophysics Data System (ADS)

    Williams, P. L.; Phillips, D. A.; Bowles-Martinez, E.; Masana, E.; Stepancikova, P.

    2010-12-01

    Terrestrial and airborne LiDAR data, and low altitude aerial photography have been utilized in conjunction with field work to identify and map single and multiple-event stream-offsets along all strands of the San Andreas fault in the Coachella Valley. Goals of the work are characterizing the range of displacements associated with the fault’s prehistoric surface ruptures, evaluating patterns of along-fault displacement, and disclosing processes associated with the prominent Banning-Mission Creek fault junction. Preservation of offsets is associated with landscape conditions including: (1) well-confined and widely spaced source streams up-slope of the fault; (2) persistent geomorphic surfaces below the fault; (3) slope directions oriented approximately perpendicular to the fault. Notably, a pair of multiple-event offset sites has been recognized in coarse fan deposits below the Mission Creek fault near 1000 Palms oasis. Each of these sites is associated with a single source drainage oriented approximately perpendicular to the fault, and preserves a record of individual fault displacements affecting the southern portion of the Mission Creek branch of the San Andreas fault. The two sites individually record long (>10 event) slip-per-event histories. Documentation of the sites indicates a prevalence of moderate displacements and a small number of large offsets. This is consistent with evidence developed in systematic mapping of individual and multiple-event stream offsets in the area extending 70 km south to Durmid Hill. Challenges to site interpretation include the presence of closely spaced en echelon fault branches and indications of stream avulsion in the area of the modern fault crossing. Conversely, strong bar-and-swale topography produces high-quality offset indicators that can be identified across en echelon branches in most cases.
To accomplish the detailed mapping needed to fully recover the complex yet well-preserved geomorphic features under investigation, a program of terrestrial laser scanning (TLS) was conducted at the 1000 Palms oasis stream offset sites. Data products and map interpretations will be presented along with initial applications of the study to characterizing San Andreas fault rupture hazard. Continuing work will seek to more fully populate the dataset of larger offsets, evaluate means to objectively date the larger offsets, and, as completely as possible, to characterize magnitudes of past surface ruptures of the San Andreas fault in the Coachella Valley.

  17. A mobile phone food record app to digitally capture dietary intake for adolescents in a free-living environment: usability study.

    PubMed

    Casperson, Shanon L; Sieling, Jared; Moon, Jon; Johnson, LuAnn; Roemmich, James N; Whigham, Leah

    2015-03-13

    Mobile technologies are emerging as valuable tools to collect and assess dietary intake. Adolescents readily accept and adopt new technologies; thus, a food record app (FRapp) may be a useful tool to better understand adolescents' dietary intake and eating patterns. We sought to determine the amenability of adolescents, in a free-living environment with minimal parental input, to use the FRapp to record their dietary intake. Eighteen community-dwelling adolescents (11-14 years) received detailed instructions to record their dietary intake for 3-7 days using the FRapp. Participants were instructed to capture before and after images of all foods and beverages consumed and to include a fiducial marker in the image. Participants were also asked to provide text descriptors including the amount and type of all foods and beverages consumed. Eight of 18 participants were able to follow all instructions: they included pre- and post-meal images, a fiducial marker, and a text descriptor, and collected diet records on 2 weekdays and 1 weekend day. Dietary intake was recorded for an average of 3.2 days (SD 1.3; 68% weekdays and 32% weekend days), with an average of 2.2 (SD 1.1) eating events per day per participant. A total of 143 eating events were recorded, of which 109 had at least one associated image and 34 were recorded with text only. Of the 109 eating events with images, 66 included all foods, beverages, and a fiducial marker, and 44 included both a pre- and post-meal image. Text was included with 78 of the captured images. Of the meals recorded, 36, 33, 35, and 39 were breakfasts, lunches, dinners, and snacks, respectively. These data suggest that mobile devices equipped with an app to record dietary intake will be used by adolescents in a free-living environment; however, a minority of participants followed all directions. User-friendly mobile food record apps may increase participant amenability, improving our understanding of adolescent dietary intake and eating patterns.
To improve data collection, the FRapp should deliver prompts for tasks, such as capturing images before and after each eating event, including the fiducial marker in the image, providing complete and accurate text information, and ensuring all eating events are recorded and should be customizable to individuals and to different situations. Clinicaltrials.gov NCT01803997. http://clinicaltrials.gov/ct2/show/NCT01803997 (Archived at: http://www.webcitation.org/6WiV1vxoR).

  18. The 2015/16 El Niño Event as Recorded in Central Tropical Pacific Corals: Temperature, Hydrology, and Ocean Circulation Influences

    NASA Astrophysics Data System (ADS)

    O'Connor, G.; Cobb, K. M.; Sayani, H. R.; Grothe, P. R.; Atwood, A. R.; Stevenson, S.; Hitt, N. T.; Lynch-Stieglitz, J.

    2016-12-01

    The El Niño/Southern Oscillation (ENSO) event of 2015/2016 was a record-breaking event in the central Pacific, driving profound changes in the properties of the ocean and atmosphere. Prolonged ocean warming of up to 3°C translated into a large-scale coral bleaching and mortality event on Christmas Island (2°N, 157°W) from which very few individuals escaped unscathed. As part of a long-term, interdisciplinary monitoring effort underway since August 2014, we present results documenting the timing and magnitude of environmental changes on the Christmas Island reefs. In particular, we present the first coral geochemical time series spanning the last several years, using cores that were drilled from rare living coral colonies during a field expedition in April 2016, at the tail end of the event. These geochemical indicators are sensitive to ocean temperature, salinity, and water mass properties, and have been used to quantitatively reconstruct ENSO extremes of the recent [Nurhati et al., 2011] and distant [Cobb et al., 2013] past. By analyzing multiple cores from both open ocean and lagoonal settings, we are able to undertake a quantitative comparison of this event with past very strong El Niño events contained in the coral archive - including the 1940/41, 1972/73, and 1997/98 events. For the most recent event, we compare our coral geochemistry records with a rich suite of in situ environmental data, including physical and geochemical parameters collected as part of the NOAA rapid response campaign in the central tropical Pacific. This unique dataset not only provides physical context for interpreting coral geochemical records from the central tropical Pacific, but also allows us to assess why the 2015/2016 El Niño event was so devastating to coral reef ecosystems in this region.

  19. Estimation of red-light running frequency using high-resolution traffic and signal data.

    PubMed

    Chen, Peng; Yu, Guizhen; Wu, Xinkai; Ren, Yilong; Li, Yueguang

    2017-05-01

    Red-light running (RLR) emerges as a major cause of intersection-related crashes and a threat to intersection safety. To reduce RLR violations, it is critical to identify the influential factors associated with RLR and to estimate RLR frequency. Without resorting to video camera recordings, this study investigates this important issue by utilizing high-resolution traffic and signal event data collected from loop detectors at five intersections on Trunk Highway 55, Minneapolis, MN. First, a simple method is proposed to identify RLR by fully utilizing the information obtained from stop bar detectors, downstream entrance detectors, and advance detectors. Using 12 months of event data, a total of 6550 RLR cases were identified. Defining RLR frequency as the conditional probability of RLR under a certain traffic or signal condition (veh/1000veh), the relationships between RLR frequency and several influential factors, including arrival time at the advance detector, approaching speed, headway, gap to the preceding vehicle in the adjacent lane, cycle length, geometric characteristics, and even snowy weather, were empirically investigated. Statistical analysis shows good agreement with traffic engineering practice, e.g., RLR is most likely to occur on weekdays during peak periods under large traffic demands and longer signal cycles, and a total of 95.24% of RLR events occurred within the first 1.5 s after the onset of the red phase. The findings confirmed that vehicles tend to run the red light when they are close to the intersection during phase transition, and that vehicles following a leading vehicle with short headways are also likely to run the red light. Last, a simplified nonlinear regression model is proposed to estimate RLR frequency based on the data from the advance detector. The study is expected to help better understand RLR occurrence and further contribute to the future improvement of intersection safety. Copyright © 2017 Elsevier Ltd. All rights reserved.
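The paper's definition of RLR frequency, the conditional probability of RLR under a given traffic or signal condition expressed per 1000 vehicles (veh/1000veh), reduces to a simple per-condition ratio. A minimal sketch with hypothetical condition labels (the function and the example counts are ours, not the study's data):

```python
def rlr_frequency(rlr_counts, vehicle_counts):
    """RLR frequency per condition, in violations per 1000 vehicles
    observed under that condition (veh/1000veh).
    rlr_counts: condition -> number of RLR events observed
    vehicle_counts: condition -> total vehicles observed"""
    return {cond: 1000.0 * rlr_counts.get(cond, 0) / n
            for cond, n in vehicle_counts.items()}
```

For instance, 20 violations among 4000 vehicles observed under a "peak" condition would give 5.0 veh/1000veh for that condition.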

  20. Colesevelam, Ezetimibe, and Patients With Type 2 Diabetes Mellitus: Characteristics and Clinical Outcomes From a Health Care Database.

    PubMed

    Swindle, Jason P; Ye, Xin; Mallick, Rajiv; Song, Rui; Horstman, Thomas; Bays, Harold E

    2014-07-01

    Despite the prevalence of therapies available to patients at highest coronary heart disease risk, only a minority of type 2 diabetes mellitus (T2DM) patients reach desired cholesterol treatment levels, with limited data regarding their outcomes. To examine "real-world" effectiveness of initiating treatment with either colesevelam or ezetimibe among individuals with evidence of T2DM and hypercholesterolemia (HCh). Key outcomes included treatment patterns and cardiovascular (CV) events. This retrospective administrative claims-based study utilized medical, pharmacy, and enrollment data linked to laboratory results information from a large United States health plan (January 1, 2006, to March 31, 2011) and included individuals with recorded evidence of T2DM and HCh. The index date was the date of the first pharmacy claim for colesevelam or ezetimibe, with cohort assignment based on index medication. Assessments included baseline characteristics, follow-up treatment patterns, and a composite CV event, with propensity score matching to correct for sample selection bias. In total, 4231 individuals were identified with evidence of HCh and T2DM (ezetimibe n = 3384; colesevelam n = 847). After matching, baseline characteristics were similar between the cohorts. The mean number of days of persistent medication use was lower with colesevelam than with ezetimibe (P < 0.001). Compared with ezetimibe, a smaller percentage of individuals in the colesevelam cohort experienced a follow-up composite CV event, and adjusted Cox model results suggested decreased risk (hazard ratio = 0.58; P = 0.004) of a follow-up composite CV event. In this health care database analysis among patients with HCh and T2DM, colesevelam was associated with decreased risk of a composite CV event compared with ezetimibe, despite lower persistence. © The Author(s) 2014.

  1. Post-rift Tectonic History of the Songliao Basin, NE China: Cooling Events and Post-rift Unconformities Driven by Orogenic Pulses From Plate Boundaries

    NASA Astrophysics Data System (ADS)

    Song, Ying; Stepashko, Andrei; Liu, Keyu; He, Qingkun; Shen, Chuanbo; Shi, Bingjie; Ren, Jianye

    2018-03-01

    The classic lithosphere-stretching model predicts that the post-rift evolution of an extensional basin should be exclusively controlled by decaying thermal subsidence. However, the stratigraphy of the Songliao Basin in northeastern China shows that the post-rift evolution was punctuated by multiple episodes of uplift and exhumation, commonly attributed to the response to regional tectonic events, including far-field compression from the plate margins. Three prominent tectonostratigraphic post-rift unconformities are recognized in the Late Cretaceous strata of the basin: T11, T03, and T02. The subsequent Cenozoic history is less well constrained due to the incomplete record of younger deposits. In this paper, we utilize detrital apatite fission track (AFT) thermochronology to unravel the enigmatic timing and origin of the post-rift unconformities. Relating the AFT results to the unconformities and other geological data, we conclude that in the post-rift stage the basin experienced a multiepisodic tectonic evolution with four distinct cooling and exhumation events. The thermal history and age pattern document the timing of the unconformities in the Cretaceous succession: the T11 unconformity at 88-86 Ma, the T03 unconformity at 79-75 Ma, and the T02 unconformity at 65-50 Ma. A previously unrecognized Oligocene unconformity is also defined, by a 32-24 Ma cooling event. Tectonically, all the cooling episodes were regional, controlled by plate boundary stresses. We propose that Pacific dynamics influenced a wide part of eastern Asia from the Late Cretaceous into the Cenozoic, whereas the far-field effects of Neo-Tethys subduction and collision became another tectonic driver later in the Cenozoic.

  2. Investigating the Functional Utility of the Left Parietal ERP Old/New Effect: Brain Activity Predicts within But Not between Participant Variance in Episodic Recollection.

    PubMed

    MacLeod, Catherine A; Donaldson, David I

    2017-01-01

    A success story within neuroimaging has been the discovery of distinct neural correlates of episodic retrieval, providing insight into the processes that support memory for past life events. Here we focus on one commonly reported neural correlate, the left parietal old/new effect, a positive-going modulation seen in event-related potential (ERP) data that is widely considered to index episodic recollection. Substantial evidence links changes in the size of the left parietal effect to changes in remembering, but the precise functional utility of the effect remains unclear. Here, using forced-choice recognition of verbal stimuli, we present a novel population-level test of the hypothesis that the magnitude of the left parietal effect correlates with memory performance. We recorded ERPs during old/new recognition, source accuracy, and Remember/Know/Guess tasks in two large samples of healthy young adults, and successfully replicated existing within-participant modulations of the magnitude of the left parietal effect with recollection. Critically, however, both datasets also show that across participants the magnitude of the left parietal effect does not correlate with behavioral measures of memory, including both subjective and objective estimates of recollection. We conclude that in these tasks, and across this healthy young adult population, the generators of the left parietal ERP effect do not index performance as expected. Taken together, these novel findings provide important constraints on the functional interpretation of the left parietal effect, suggesting that between-group differences in the magnitude of old/new effects cannot always safely be used to infer differences in recollection.

  3. Neural correlates of multimodal metaphor comprehension: Evidence from event-related potentials and time-frequency decompositions.

    PubMed

    Ma, Qingguo; Hu, Linfeng; Xiao, Can; Bian, Jun; Jin, Jia; Wang, Qiuzhen

    2016-11-01

    The present study examined the event-related potential (ERP) and time-frequency correlates of the comprehension of multimodal metaphors represented by the combination of "a vehicle picture + a written word of an animal". Electroencephalogram data were recorded while participants decided whether the metaphor using an animal word for the vehicle rendered in a picture was appropriate or not. There were two conditions: appropriate (e.g., sport utility vehicle + tiger) vs. inappropriate (e.g., sport utility vehicle + cat). The ERP results showed that inappropriate metaphors elicited larger N300 (280-360 ms) and N400 (380-460 ms) amplitudes than appropriate ones, which differs from previous exclusively verbal metaphor studies that rarely observed an N300 effect. A P600 (550-750 ms) was also observed and was larger in the appropriate metaphor condition. In addition, time-frequency principal component analysis revealed that two independent theta activities indexed the separable processes (retrieval of semantic features and semantic integration) underlying the N300 and N400. Delta-band activity was also induced within a later time window and best characterized the integration process underlying the P600. These results indicate a cognitive mechanism specific to multimodal metaphor comprehension, distinct from verbal metaphor processing and mirrored by several separable processes indexed by ERP components and time-frequency components. The present study extends metaphor research by uncovering the functional roles of delta and theta as well as their unique contributions to the ERP components during multimodal metaphor comprehension. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. 7 CFR 1.115 - Special procedures: Medical records.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 1 2013-01-01 2013-01-01 false Special procedures: Medical records. 1.115 Section 1... Regulations § 1.115 Special procedures: Medical records. In the event an agency receives a request pursuant to § 1.112 for access to medical records (including psychological records) whose disclosure it determines...

  5. 7 CFR 1.115 - Special procedures: Medical records.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 1 2014-01-01 2014-01-01 false Special procedures: Medical records. 1.115 Section 1... Regulations § 1.115 Special procedures: Medical records. In the event an agency receives a request pursuant to § 1.112 for access to medical records (including psychological records) whose disclosure it determines...

  6. 29 CFR 1611.6 - Special procedures: Medical records.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Special procedures: Medical records. 1611.6 Section 1611.6... REGULATIONS § 1611.6 Special procedures: Medical records. In the event the Commission receives a request pursuant to § 1611.3 for access to medical records (including psychological records) whose disclosure of...

  7. 29 CFR 1611.6 - Special procedures: Medical records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Special procedures: Medical records. 1611.6 Section 1611.6... REGULATIONS § 1611.6 Special procedures: Medical records. In the event the Commission receives a request pursuant to § 1611.3 for access to medical records (including psychological records) whose disclosure of...

  8. 29 CFR 1611.6 - Special procedures: Medical records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Special procedures: Medical records. 1611.6 Section 1611.6... REGULATIONS § 1611.6 Special procedures: Medical records. In the event the Commission receives a request pursuant to § 1611.3 for access to medical records (including psychological records) whose disclosure of...

  9. Overcoming structural constraints to patient utilization of electronic medical records: a critical review and proposal for an evaluation framework.

    PubMed

    Winkelman, Warren J; Leonard, Kevin J

    2004-01-01

    There are constraints embedded in medical record structure that limit use by patients in self-directed disease management. Through systematic review of the literature from a critical perspective, four characteristics that either enhance or mitigate the influence of medical record structure on patient utilization of an electronic patient record (EPR) system have been identified: environmental pressures, physician centeredness, collaborative organizational culture, and patient centeredness. An evaluation framework is proposed for use when considering adaptation of existing EPR systems for online patient access. Exemplars of patient-accessible EPR systems from the literature are evaluated utilizing the framework. From this study, it appears that traditional information system research and development methods may not wholly capture many pertinent social issues that arise when expanding access of EPR systems to patients. Critically rooted methods such as action research can directly inform development strategies so that these systems may positively influence health outcomes.

  10. Climate change and carbon-cycling during the latest Cretaceous-Early Paleogene; a new 13.5 million year-long, orbital-resolution, stable isotope record from the South Atlantic

    NASA Astrophysics Data System (ADS)

    Barnet, J.; Littler, K.; Kroon, D.; Leng, M. J.; Westerhold, T.; Roehl, U.; Zachos, J. C.

    2017-12-01

The "greenhouse" world of the latest Cretaceous-Early Paleogene (~70-34 Ma) was characterised by multi-million-year variability in climate and the carbon cycle. Throughout this interval the pervasive imprint of orbital cyclicity, particularly eccentricity and precession, is visible in elemental and stable isotope data obtained from multiple deep-sea sites. Periodic "hyperthermal" events, occurring largely in step with these orbital cycles, have proved particularly enigmatic, and may be the closest, albeit imperfect, analogues for anthropogenic climate change. This project utilises CaCO3-rich marine sediments recovered from ODP Site 1262 at a paleo-depth of 3600 m on the Walvis Ridge, South Atlantic, of late Maastrichtian-mid Paleocene age (~67-60 Ma). We have derived high-resolution (2.5-4 kyr) carbon and oxygen isotope data from the epifaunal benthic foraminifera species Nuttallides truempyi. Combining the new record with the existing Late Paleocene-Early Eocene record generated from the same site by Littler et al. (2014) yields a single-site reference curve detailing 13.5 million years of orbital cyclicity in paleoclimate and the carbon cycle from the latest Cretaceous to near the peak warmth of the Early Paleogene greenhouse. Spectral analysis of this new combined dataset allows us to identify long (405-kyr) eccentricity, short (100-kyr) eccentricity, and precession (19-23-kyr) as the principal forcing mechanisms governing the pacing of the background climate and carbon cycle during this time period, with a comparatively weak obliquity (41-kyr) signal. Cross-spectral analysis suggests that changes in climate lead the carbon cycle throughout most of the record, emphasising the role of the release of temperature-sensitive carbon stores as a positive feedback to an initial warming induced by changes in orbital configuration. 
The expression of comparatively understudied Early Paleocene events, including the Dan-C2 Event, Latest Danian Event, and Danian/Selandian Transition Event, is also identified within this new record, confirming the global nature and orbital pacing of the Latest Danian Event and Danian/Selandian Transition Event, but calling into question the status of the Dan-C2 event as a global hyperthermal.
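
    The spectral-analysis step described above can be sketched with a plain FFT periodogram. This is a toy illustration, not the authors' method: the series below is synthetic, over an 8100-kyr segment chosen so each orbital period divides it exactly (avoiding spectral leakage), with 20 kyr standing in for the 19-23-kyr precession band.

```python
import numpy as np

# Toy periodogram sketch on a synthetic isotope-like series sampled every
# 2.5 kyr (the abstract's resolution). Amplitudes are invented; the goal
# is only to show how the dominant orbital periods are read off the FFT.
dt = 2.5                                        # sample spacing, kyr
t = np.arange(0, 8100, dt)
series = (1.0 * np.sin(2 * np.pi * t / 405.0)   # long eccentricity
          + 0.6 * np.sin(2 * np.pi * t / 100.0) # short eccentricity
          + 0.3 * np.sin(2 * np.pi * t / 20.0)) # "precession" stand-in

freqs = np.fft.rfftfreq(t.size, d=dt)           # cycles per kyr
power = np.abs(np.fft.rfft(series)) ** 2

# Periods of the three strongest peaks, skipping the zero-frequency bin.
top = np.argsort(power[1:])[::-1][:3] + 1
periods = sorted(1.0 / freqs[top], reverse=True)
```

    Real records are unevenly sampled and noisy, so published analyses use tuned age models and more robust spectral estimators than a raw periodogram.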

  11. Spectrum Savings from High Performance Recording and Playback Onboard the Test Article

    DTIC Science & Technology

    2013-02-20

execute within a Windows 7 environment, and data is recorded on SSDs. The underlying database is implemented using MySQL. Figure 1 illustrates the... MySQL database. This is effectively the time at which the recorded data are available for retransmission. CPU and Memory utilization were collected... 17.7% MySQL avg. 3.9% EQDR Total avg. 21.6% Table 1 CPU Utilization with 260 Mbits/sec Load The difference between the total System CPU (27.8

  12. The effects of high-frequency oscillations in hippocampal electrical activities on the classification of epileptiform events using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Chiu, Alan W. L.; Jahromi, Shokrollah S.; Khosravani, Houman; Carlen, Peter L.; Bardakjian, Berj L.

    2006-03-01

    The existence of hippocampal high-frequency electrical activities (greater than 100 Hz) during the progression of seizure episodes in both human and animal experimental models of epilepsy has been well documented (Bragin A, Engel J, Wilson C L, Fried I and Buzsáki G 1999 Hippocampus 9 137-42; Khosravani H, Pinnegar C R, Mitchell J R, Bardakjian B L, Federico P and Carlen P L 2005 Epilepsia 46 1-10). However, this information has not been studied between successive seizure episodes or utilized in the application of seizure classification. In this study, we examine the dynamical changes of an in vitro low Mg2+ rat hippocampal slice model of epilepsy at different frequency bands using wavelet transforms and artificial neural networks. By dividing the time-frequency spectrum of each seizure-like event (SLE) into frequency bins, we can analyze their burst-to-burst variations within individual SLEs as well as between successive SLE episodes. Wavelet energy and wavelet entropy are estimated for intracellular and extracellular electrical recordings using sufficiently high sampling rates (10 kHz). We demonstrate that the activities of high-frequency oscillations in the 100-400 Hz range increase as the slice approaches SLE onsets and in later episodes of SLEs. Utilizing the time-dependent relationship between different frequency bands, we can achieve frequency-dependent state classification. We demonstrate that activities in the frequency range 100-400 Hz are critical for the accurate classification of the different states of electrographic seizure-like episodes (containing interictal, preictal and ictal states) in brain slices undergoing recurrent spontaneous SLEs. While preictal activities can be classified with an average accuracy of 77.4 ± 6.7% utilizing the frequency spectrum in the range 0-400 Hz, we can also achieve a similar level of accuracy by using a nonlinear relationship between 100-400 Hz and <4 Hz frequency bands only.
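
    The band-energy and entropy features described above can be approximated with an FFT band decomposition standing in for the paper's wavelet transform. A hedged sketch on a synthetic trace: the 10 kHz sampling rate matches the recordings, but the signal, bands, and amplitudes are invented for illustration.

```python
import numpy as np

# Band energy and a Shannon-entropy analogue of "wavelet entropy",
# computed from an FFT rather than a wavelet transform (a simplification).
fs = 10_000                       # Hz, as in the recordings
t = np.arange(0, 1.0, 1 / fs)
# Synthetic "ictal-like" trace: a 250 Hz oscillation over a slow 3 Hz wave.
x = np.sin(2 * np.pi * 3 * t) + 0.8 * np.sin(2 * np.pi * 250 * t)

freqs = np.fft.rfftfreq(x.size, d=1 / fs)
power = np.abs(np.fft.rfft(x)) ** 2

def band_energy(lo, hi):
    """Total spectral power in the [lo, hi) Hz band."""
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].sum()

bands = {"<4 Hz": band_energy(0.5, 4), "100-400 Hz": band_energy(100, 400)}
total = sum(bands.values())
rel = {k: v / total for k, v in bands.items()}   # relative band energies

# Entropy of the relative energies: low when one band dominates.
entropy = -sum(p * np.log(p) for p in rel.values() if p > 0)
```

    Features like `rel` and `entropy`, tracked burst-to-burst, are the kind of inputs one could feed to a classifier of interictal/preictal/ictal states.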

  13. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control of single-input and single-output uncertain nonlinear discrete-time systems under event sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned in an aperiodic manner at the event sampled instants. After reviewing the NN approximation property with event sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event sampled context. The SE is viewed as a model and its approximated dynamics and the state vector, during any two events, are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
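
    The dead-zone event-trigger idea can be illustrated with a scalar toy system: the state is transmitted to the controller only when its gap from the last transmitted value exceeds a threshold. The plant, gain, and threshold below are illustrative assumptions, not the paper's adaptive NN design.

```python
# Hypothetical sketch of event-triggered feedback with a dead-zone
# condition on the transmission gap; all numbers are illustrative.
def simulate(steps=200, threshold=0.05):
    x = 1.0                 # true system state
    x_hat = x               # last value transmitted to the controller
    events = 0
    for _ in range(steps):
        # Event-trigger condition: transmit only when the gap is large.
        if abs(x - x_hat) > threshold:
            x_hat = x       # event: controller's copy is refreshed
            events += 1
        u = -0.5 * x_hat    # controller acts on the held estimate
        x = 0.9 * x + u     # stable linear plant standing in for the
                            # paper's uncertain nonlinear dynamics
    return x, events

x_final, n_events = simulate()
```

    As in the paper's analysis, the state stays ultimately bounded while far fewer transmissions occur than time steps, since no event fires while the gap sits inside the dead zone.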

  14. A correlative and quantitative imaging approach enabling characterization of primary cell-cell communication: Case of human CD4+ T cell-macrophage immunological synapses.

    PubMed

    Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie

    2018-05-22

    Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell-conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4+ T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitments at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4+ T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.

  15. Quantification of Interbasin Transfers into the Addicks Reservoir during Hurricane Harvey

    NASA Astrophysics Data System (ADS)

    Sebastian, A.; Juan, A.; Gori, A.; Maulsby, F.; Bedient, P. B.

    2017-12-01

    Between August 25 and 30, Hurricane Harvey dropped unprecedented rainfall over southeast Texas, causing widespread flooding in the City of Houston. Water levels in the Addicks and Barker reservoirs, built in the 1940s to protect downtown Houston, exceeded previous records by approximately 2 meters. Concerns regarding the structural integrity of the dams and damage to neighbourhoods within the reservoir pool resulted in controlled releases into Buffalo Bayou, flooding an estimated 4,000 additional structures downstream of the dams. In 2016, during the Tax Day flood, it became apparent that overflows from Cypress Creek in northern Harris County substantially contribute to water levels in Addicks. Prior to this event, little was known about the hydrodynamics of this overflow area or about the additional stress placed on Addicks and Barker reservoirs due to the volume of overflow. However, this information is critical for determining flood risk in the Addicks Watershed, and ultimately Buffalo Bayou. In this study, we utilize the recently developed HEC-RAS 2D model of the interbasin transfer that occurs between Cypress Creek Watershed and Addicks Reservoir to quantify the volume and rate at which water from Cypress Creek enters the reservoir during extreme events. Ultimately, the results of this study will help inform the official hydrologic models used by HCFCD to determine reservoir operation during future storm events and better inform residents living in or above the reservoir pool about their potential flood risk.

  16. Suicidal ideation among adolescents following natural disaster: The role of prior interpersonal violence.

    PubMed

    Zuromski, Kelly L; Resnick, Heidi; Price, Matthew; Galea, Sandro; Kilpatrick, Dean G; Ruggiero, Kenneth

    2018-05-07

    The current study examined variables, including prior traumatic events, disaster exposure, and current mental health symptomatology, associated with suicidal ideation following experience of a natural disaster. Utilizing a sample of 2,000 adolescents exposed to the spring 2011 tornadoes in the areas surrounding Tuscaloosa, Alabama, and Joplin, Missouri, we hypothesized that prior interpersonal violence (IPV), more so than other prior traumatic events or other symptoms, would be associated with suicidal ideation after the disaster. Suicidal ideation was reported by approximately 5% of the sample. Results of binary logistic regression were consistent with hypotheses in that prior IPV exposure emerged as the variable most robustly related to presence of postdisaster suicidal ideation, even accounting for current symptoms (i.e., posttraumatic stress disorder and depression). Moreover, neither prior accident nor prior natural disaster exposure was significantly associated with postdisaster suicidal ideation, suggesting that something specific to IPV may be conferring risk for suicidality. No other variables, including disaster exposure variables or demographic characteristics, emerged as significantly related. Our results suggest that individuals who have a history of IPV may be particularly vulnerable following experience of additional traumatic events and that for suicide risk, the experience of prior IPV may be more relevant to consider in the aftermath of natural disasters beyond variables related to the index trauma or current symptomatology. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  17. Multiple-Array Detection, Association and Location of Infrasound and Seismo-Acoustic Events - Utilization of Ground Truth Information

    DTIC Science & Technology

    2010-09-01

    MULTIPLE-ARRAY DETECTION, ASSOCIATION AND LOCATION OF INFRASOUND AND SEISMO-ACOUSTIC EVENTS – UTILIZATION OF GROUND TRUTH INFORMATION Stephen J...and infrasound data from seismo-acoustic arrays and apply the methodology to regional networks for validation with ground truth information. In the...initial year of the project automated techniques for detecting, associating and locating infrasound signals were developed. Recently, the location

  18. 25 CFR 542.33 - What are the minimum internal control standards for surveillance for Tier B gaming operations?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... capability to display the date and time of recorded events on video and/or digital recordings. The displayed... digital record retention. (1) All video recordings and/or digital records of coverage provided by the.... (3) Duly authenticated copies of video recordings and/or digital records shall be provided to the...

  19. 25 CFR 542.33 - What are the minimum internal control standards for surveillance for Tier B gaming operations?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... capability to display the date and time of recorded events on video and/or digital recordings. The displayed... digital record retention. (1) All video recordings and/or digital records of coverage provided by the.... (3) Duly authenticated copies of video recordings and/or digital records shall be provided to the...

  20. 25 CFR 542.33 - What are the minimum internal control standards for surveillance for Tier B gaming operations?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... capability to display the date and time of recorded events on video and/or digital recordings. The displayed... digital record retention. (1) All video recordings and/or digital records of coverage provided by the.... (3) Duly authenticated copies of video recordings and/or digital records shall be provided to the...

  1. 25 CFR 542.33 - What are the minimum internal control standards for surveillance for Tier B gaming operations?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... capability to display the date and time of recorded events on video and/or digital recordings. The displayed... digital record retention. (1) All video recordings and/or digital records of coverage provided by the.... (3) Duly authenticated copies of video recordings and/or digital records shall be provided to the...

  2. 25 CFR 542.33 - What are the minimum internal control standards for surveillance for Tier B gaming operations?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... capability to display the date and time of recorded events on video and/or digital recordings. The displayed... digital record retention. (1) All video recordings and/or digital records of coverage provided by the.... (3) Duly authenticated copies of video recordings and/or digital records shall be provided to the...

  3. Development and Validation of Perioperative Risk-Adjustment Models for Hip Fracture Repair, Total Hip Arthroplasty, and Total Knee Arthroplasty.

    PubMed

    Schilling, Peter L; Bozic, Kevin J

    2016-01-06

    Comparing outcomes across providers requires risk-adjustment models that account for differences in case mix. The burden of data collection from the clinical record can make risk-adjusted outcomes difficult to measure. The purpose of this study was to develop risk-adjustment models for hip fracture repair (HFR), total hip arthroplasty (THA), and total knee arthroplasty (TKA) that weigh adequacy of risk adjustment against data-collection burden. We used data from the American College of Surgeons National Surgical Quality Improvement Program to create derivation cohorts for HFR (n = 7000), THA (n = 17,336), and TKA (n = 28,661). We developed logistic regression models for each procedure using age, sex, American Society of Anesthesiologists (ASA) physical status classification, comorbidities, laboratory values, and vital signs-based comorbidities as covariates, and validated the models with use of data from 2012. The derivation models' C-statistics for mortality were 80%, 81%, 75%, and 92% and for adverse events were 68%, 68%, 60%, and 70% for HFR, THA, TKA, and combined procedure cohorts. Age, sex, and ASA classification accounted for a large share of the explained variation in mortality (50%, 58%, 70%, and 67%) and adverse events (43%, 45%, 46%, and 68%). For THA and TKA, these three variables were nearly as predictive as models utilizing all covariates. HFR model discrimination improved with the addition of comorbidities and laboratory values; among the important covariates were functional status, low albumin, high creatinine, disseminated cancer, dyspnea, and body mass index. Model performance was similar in validation cohorts. Risk-adjustment models using data from health records demonstrated good discrimination and calibration for HFR, THA, and TKA. It is possible to provide adequate risk adjustment using only the most predictive variables commonly available within the clinical record. 
This finding helps to inform the trade-off between model performance and data-collection burden as well as the need to define priorities for data capture from electronic health records. These models can be used to make fair comparisons of outcome measures intended to characterize provider quality of care for value-based-purchasing and registry initiatives. Copyright © 2016 by The Journal of Bone and Joint Surgery, Incorporated.
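
    The C-statistic used above to grade the models is the probability that a randomly chosen patient with the outcome receives a higher predicted risk than one without (equivalently, the AUROC). A minimal sketch on made-up labels and risks, not NSQIP data:

```python
import numpy as np

# C-statistic (AUROC) as the concordance probability over all
# outcome/no-outcome pairs; ties count as half-concordant.
def c_statistic(labels, scores):
    labels = np.asarray(labels, dtype=bool)
    pos = np.asarray(scores)[labels]      # predicted risks, outcome present
    neg = np.asarray(scores)[~labels]     # predicted risks, outcome absent
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Invented predicted risks for seven patients, three with the outcome.
y = [1, 0, 1, 0, 0, 1, 0]
risk = [0.9, 0.2, 0.7, 0.4, 0.1, 0.4, 0.3]
auc = c_statistic(y, risk)
```

    A value of 0.5 means the model ranks pairs no better than chance; the 80-92% mortality C-statistics reported above indicate strong discrimination.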

  4. Reconstruction of Flooding Events for the Central Valley, California from Instrumental and Documentary Weather Records

    NASA Astrophysics Data System (ADS)

    Dodds, S. F.; Mock, C. J.

    2009-12-01

    All available instrumental winter precipitation data for the Central Valley of California back to 1850 were digitized and analyzed to construct continuous time series. Many of these data, in paper or microfilm format, extend prior to modern National Weather Service Cooperative Data Program and Historical Climate Network data, and were recorded by volunteer observers from networks such as the US Army Surgeon General, Smithsonian Institution, and US Army Signal Service. Given temporally incomplete individual records, detailed documentary data from newspapers, personal diaries and journals, ship logbooks, and weather enthusiasts' instrumental data were used in conjunction with instrumental data to reconstruct precipitation frequency per month and season and continuous days of precipitation, and to identify anomalous precipitation events. Multilinear regression techniques, using surrounding stations and the relationships between modern and historical records, bridged timeframes lacking data and ensured the homogeneity of the time series. The metadata for each station were carefully screened, and notes were made about possible changes in instrumentation, instrument location, or observer training, to verify that anomalous events had not been recorded incorrectly. Precipitation in the Central Valley varies throughout the entire region, but waterways link the differing elevations and latitudes. This study integrates the individual station data with additional accounts of flood descriptions through unique newspaper and journal data. River heights and the extent of floods inundating cities, agricultural lands, and individual homes are often recorded within unique documentary sources, which add to the understanding of flood occurrence within this area. Comparisons were also made between dam and levee construction through time and how waters are diverted through cities in natural and anthropogenically changed environments. 
Some precipitation events that led to flooding in the Central Valley from the mid-19th century through the early 20th century are more extreme at particular stations than anything in the modern record. Flood years included in the study are 1850, 1862, 1868, 1878, 1881, 1890, and 1907. These flood years were compared to the modern record and reconstructed through time series and maps. Incorporating the extent and effects of these anomalous events into future climate studies could improve models and preparedness for future floods.

  5. 32 CFR 705.34 - Other special events.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 5 2014-07-01 2014-07-01 false Other special events. 705.34 Section 705.34... AND OFFICIAL RECORDS PUBLIC AFFAIRS REGULATIONS § 705.34 Other special events. (a) Ship visits. Requests for visits generally originate with civic groups desiring Navy participation in local events...

  6. 32 CFR 705.34 - Other special events.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 5 2013-07-01 2013-07-01 false Other special events. 705.34 Section 705.34... AND OFFICIAL RECORDS PUBLIC AFFAIRS REGULATIONS § 705.34 Other special events. (a) Ship visits. Requests for visits generally originate with civic groups desiring Navy participation in local events...

  7. 32 CFR 705.34 - Other special events.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 5 2012-07-01 2012-07-01 false Other special events. 705.34 Section 705.34... AND OFFICIAL RECORDS PUBLIC AFFAIRS REGULATIONS § 705.34 Other special events. (a) Ship visits. Requests for visits generally originate with civic groups desiring Navy participation in local events...

  8. Clinical profile and outcomes in octogenarians with atrial fibrillation: A community-based study in a specific European health care area.

    PubMed

    Rodríguez-Mañero, Moisés; López-Pardo, Estrella; Cordero, Alberto; Kredieh, Omar; Pereira-Vazquez, María; Martínez-Sande, Jose-Luis; Martínez-Gomez, Alvaro; Peña-Gil, Carlos; Novo-Platas, José; García-Seara, Javier; Mazón, Pilar; Laje, Ricardo; Moscoso, Isabel; Varela-Román, Alfonso; García-Acuña, Jose María; González-Juanatey, José Ramón

    2017-09-15

    Age increases the risk of stroke and bleeding. Clinical trials have included relatively low proportions of elderly subjects. We sought to study a Spanish population of octogenarians with atrial fibrillation (AF) by combining different sources of electronic clinical records from an area where all medical centres utilized electronic health record systems. Data were derived from the Galician Healthcare Service information system. Of 383,000 subjects, AF was coded in 7990 (2.08%), 3640 (45.6%) of whom were ≥80 and 4350 (54.4%) <80. All CHA2DS2-VASc components were more prevalent in the elderly except for diabetes. Of those ≥80, 2178 (59.8%) were women. Mean CHA2DS2-VASc was 4.2±1.1. The distribution of CHA2DS2-VASc components varied between genders. 2600 (71.4%) were on oral anticoagulants (OA). During a median follow-up of 696 days (124.23), all-cause mortality was higher in those ≥80: 1011/3640 (27.8%) vs 350/4350 (8.05%), p<0.001. There were differences in the rates of thromboembolic (TE) and haemorrhagic events (2.3% vs 0.9%, p<0.01, and 2.5% vs 1.7%, p=0.01, respectively). In octogenarians, differences between genders were observed with regard to TE, but not in haemorrhagic or all-cause mortality rates. Age, heart failure, non-valvular AF, dementia, and OA were independent predictors of all-cause mortality. With regard to TE, female gender, hypertension, previous TE, and OA were independent predictive factors. Octogenarians with AF had very different characteristics and outcomes from their younger counterparts. These results also provide reassurance about the effectiveness of OA in preventing TE events and maintaining a reasonable haemorrhagic event rate in the extremely elderly. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. A multicenter, retrospective chart review study comparing index therapy change rates in open-angle glaucoma or ocular hypertension patients newly treated with latanoprost or travoprost-Z monotherapy.

    PubMed

    Fain, Joel M; Kotak, Sameer; Mardekian, Jack; Bacharach, Jason; Edward, Deepak P; Rauchman, Steven; Brevetti, Teresa; Fox, Janet L; Lovelace, Cherie

    2011-06-13

    Because latanoprost and the original formulation of travoprost that included benzalkonium chloride (BAK) have been shown to be similar with regard to tolerability, we compared initial topical intraocular pressure (IOP)-lowering medication change rates in patients newly treated with latanoprost or travoprost-Z monotherapy. At 14 clinical practice sites, medical records were abstracted for patients with a diagnosis of open-angle glaucoma or ocular hypertension and who were ≥40 years of age, had a baseline and at least one follow-up visit, and had no prior history of ocular prostaglandin use. Data regarding demographics, ocular/systemic medical histories, clinical variables, therapy initiations and reasons for changes, adverse events, and resource utilization were recorded from randomly chosen eligible charts. Primary outcomes were rates of and reasons for changing from the initial therapy within six months and across the full study period (1000 days). Data from 900 medical charts (latanoprost, 632; travoprost-Z, 268) were included. For both cohorts, average follow-up was >1 year. Cohorts were similar with regard to age (median ~67 years), gender distribution (>50% female), and diagnosis (~80% with open-angle glaucoma). Within six months, rates of index therapy change for latanoprost versus travoprost-Z were 21.2% (134/632) and 28.7% (77/268), respectively (p = 0.0148); across the full study period, rates were 34.5% (218/632) and 45.2% (121/268), respectively (p = 0.0026). Among those who changed their index therapy, insufficient IOP control was the most commonly reported reason followed by adverse events; hyperemia was the most commonly reported adverse event at index therapy change. In this "real world" study of changes in therapy in patients prescribed initial monotherapy with latanoprost with BAK or travoprost-Z with SofZia, medication changes were common in both treatment groups but statistically significantly more frequent with travoprost-Z.

  10. Millennial-scale precipitation variability over Easter Island (South Pacific) during MIS 3: inter-hemispheric teleconnections with North Atlantic abrupt cold events

    NASA Astrophysics Data System (ADS)

    Margalef, O.; Cacho, I.; Pla-Rabes, S.; Cañellas-Boltà, N.; Pueyo, J. J.; Sáez, A.; Pena, L. D.; Valero-Garcés, B. L.; Rull, V.; Giralt, S.

    2015-04-01

    Marine Isotope Stage 3 (MIS 3, 59.4-27.8 kyr BP) is characterized by the occurrence of rapid millennial-scale climate oscillations known as Dansgaard-Oeschger cycles (DO) and by abrupt cooling events in the North Atlantic known as Heinrich events. Although both the timing and dynamics of these events have been broadly explored in North Atlantic records, the response of the tropical and subtropical latitudes to these rapid climatic excursions, particularly in the Southern Hemisphere, still remains unclear. The Rano Aroi peat record (Easter Island, 27° S) provides a unique opportunity to understand atmospheric and oceanic changes in the South Pacific during these DO cycles because of its singular location, which is influenced by the South Pacific Anticyclone (SPA), the Southern Westerlies (SW), and the Intertropical Convergence Zone (ITCZ) linked to the South Pacific Convergence Zone (SPCZ). The Rano Aroi sequence records 6 major events of enhanced precipitation between 38 and 65 kyr BP. These events are compared with other hydrological records from the tropical and subtropical band, supporting a coherent regional picture: humid conditions dominated the Southern Hemisphere tropical band during Heinrich Stadials (HS) 5, 5a, and 6 and other stadials, while dry conditions prevailed in the northern tropics. This antiphased hydrological pattern between hemispheres has been attributed to ITCZ migration, which in turn might be associated with an eastward expansion of the SPCZ storm track, leading to an increased intensity of cyclogenic storms reaching Easter Island. Low Pacific Sea Surface Temperature (SST) gradients across the Equator were coincident with the Rano Aroi humid events defined here and consistent with a reorganization of South Pacific atmospheric and oceanic circulation at higher latitudes as well during Heinrich and Dansgaard-Oeschger stadials.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewing, J.I.; Meyer, R.P.

    The Rivera Ocean Seismic Experiment (ROSE) was designed as a combined sea and land seismic program to utilize both explosives and earthquakes to study a number of features of the structure and evolution of a mid-ocean ridge, a major oceanic fracture zone, and the transition region between ocean and continent. The primary region selected for the experiment included the Rivera Fracture Zone, the crest and eastern flank of the East Pacific Rise north of the Rivera, and adjacent areas of Baja California and mainland Mexico. The experiment included: (1) study of the East Pacific Rise south of the Orozco Fracture Zone, primarily using ocean bottom recording and explosive sources; (2) a seismicity program at the Orozco; and (3) a land-based program of recording natural events along the coastal region of Mexico. A considerable amount of useful data was obtained in each of the three subprograms. In the marine parts of the experiment we were able to address a variety of problems including the structure and evolution of young oceanic crust and mantle, the structure and dynamics of the East Pacific Rise, the seismicity of the Orozco Fracture Zone, and the partitioning of energy transmission between the ocean volume and the crust/lithosphere. On land, the fortuitous occurrence of the Petatlan M7.6 earthquake of March 14, 1979, permitted the acquisition of an excellent data set of foreshocks and aftershocks of this large event, which provides new insight into the filling of a major seismic gap in the region. This overview describes the scientific rationale and the design of the experiments, along with some general results.

  12. Neural network pattern recognition of lingual-palatal pressure for automated detection of swallow.

    PubMed

    Hadley, Aaron J; Krival, Kate R; Ridgel, Angela L; Hahn, Elizabeth C; Tyler, Dustin J

    2015-04-01

    We describe a novel device and method for real-time measurement of lingual-palatal pressure and automatic identification of the oral transfer phase of deglutition. Clinical measurement of the oral transport phase of swallowing is a complicated process requiring either placement of obstructive sensors or sitting within a fluoroscope or articulograph for recording. Existing detection algorithms distinguish oral events with EMG, sound, and pressure signals from the head and neck, but are imprecise and frequently result in false detections. We placed seven pressure sensors on a molded mouthpiece fitting over the upper teeth and hard palate and recorded pressure during a variety of swallow and non-swallow activities. Pressure measures and swallow times from 12 healthy and 7 Parkinson's subjects provided training data for a time-delay artificial neural network to categorize the recordings as swallow or non-swallow events. User-specific neural networks properly categorized 96% of swallow and non-swallow events, while a generalized population-trained network was able to properly categorize 93% of swallow and non-swallow events across all recordings. Lingual-palatal pressure signals are sufficient to selectively and specifically recognize the initiation of swallowing in healthy and dysphagic patients.

  13. Retention of autobiographical memories: an Internet-based diary study.

    PubMed

    Kristo, Gert; Janssen, Steve M J; Murre, Jaap M J

    2009-11-01

    In this online study, we examined the retention of recent personal events using an Internet-based diary technique. Each participant (N = 878) recorded one recent personal event on a website and was contacted after a retention interval that ranged between 2 and 46 days. We investigated how well the participants could recall the content, time, and details of their recorded event. We found a classic retention function. Details of the events were forgotten more rapidly than the content and the time of the events. There were no differences between the forgetting rates of the "who", "what", and "where" elements of the content component. Reminiscing, social sharing, pleasantness, and frequency of occurrence aided recall but, surprisingly, importance and emotionality did not. The latter were, however, strongly associated with reminiscing and social sharing.

  14. Mayo Registry for Telemetry Efficacy in Arrest Study: An Assessment of the Utility of Telemetry in Predicting Clinical Decompensation.

    PubMed

    Snipelisky, David; Ray, Jordan; Matcha, Gautam; Roy, Archana; Harris, Dana; Bosworth, Veronica; Dumitrascu, Adrian; Clark, Brooke; Vadeboncoeur, Tyler; Kusumoto, Fred; Bowman, Cammi; Burton, M Caroline

    2018-03-01

    Our study assesses the utility of telemetry in identifying decompensation in patients with documented cardiopulmonary arrest. A retrospective review of inpatients who experienced a cardiopulmonary arrest from May 1, 2008, until June 30, 2014, was performed. Telemetry records from 24 hours prior to and immediately preceding cardiopulmonary arrest were reviewed. Subanalyses based on clinical demographics were performed, as well as survival analyses comparing patients with identifiable rhythm changes on telemetry to those without. Of the 242 patients included in the study, 75 (31.0%) and 110 (45.5%) experienced telemetry changes at the 24-hour and immediately preceding time periods, respectively. The majority of telemetry changes were classified as nonmalignant (n = 50, 66.7% and n = 66, 55.5% at 24 hours prior and immediately preceding, respectively). There was no difference in telemetry changes between intensive care unit (ICU) and non-ICU patients, or among patients stratified according to the American Heart Association telemetry indications. There was no difference in survival when comparing patients with telemetry changes immediately preceding and at 24 hours prior to an event (n = 30, 27.3% and n = 15, 20.0%) to those without telemetry changes during the same periods (n = 27, 20.5% and n = 42, 25.2%; P = .22 and .39). Telemetry has limited utility in predicting clinical decompensation in the inpatient setting.

  15. An Active RBSE Framework to Generate Optimal Stimulus Sequences in a BCI for Spelling

    NASA Astrophysics Data System (ADS)

    Moghadamfalahi, Mohammad; Akcakaya, Murat; Nezamfar, Hooman; Sourati, Jamshid; Erdogmus, Deniz

    2017-10-01

    A class of brain computer interfaces (BCIs) employs noninvasive recordings of electroencephalography (EEG) signals to enable users with severe speech and motor impairments to interact with their environment and social network. For example, EEG-based BCIs for typing popularly utilize event related potentials (ERPs) for inference. Presentation paradigm design in current ERP-based letter-by-letter typing BCIs typically queries the user with an arbitrary subset of characters. However, both typing accuracy and typing speed can potentially be enhanced with more informed subset selection and flash assignment. In this manuscript, we introduce the active recursive Bayesian state estimation (active-RBSE) framework for inference and sequence optimization. Prior to presentation in each iteration, rather than showing a subset of randomly selected characters, the developed framework optimally selects a subset based on a query function. Selected queries are adaptively specialized for users during each intent detection. Through a simulation-based study, we assess the effect of active-RBSE on the performance of a language-model-assisted typing BCI in terms of typing speed and accuracy. To provide a baseline for comparison, we also utilize standard presentation paradigms, namely the row-and-column matrix presentation paradigm and the random rapid serial visual presentation paradigm. The results show that utilization of active-RBSE can enhance the online performance of the system, in terms of both typing accuracy and speed.
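
    The recursive Bayesian update at the heart of such a framework can be sketched as below. The toy query function (flash the k most probable characters) stands in for the paper's optimized query selection, and the alphabet, likelihood values, and tie-breaking noise are all assumptions chosen for illustration.

```python
import numpy as np

ALPHABET = [chr(ord('a') + i) for i in range(26)]    # toy alphabet

def select_query(posterior, k=4, rng=None):
    """Toy query function: flash the k most probable characters.
    (The paper optimizes an expected-cost criterion instead.)"""
    noise = rng.random(len(posterior)) * 1e-12       # break exact ties
    return np.argsort(posterior + noise)[::-1][:k]

def bayes_update(posterior, query, p_flashed, p_rest):
    """One recursive Bayesian state estimation step: reweight the
    posterior by the ERP classifier's likelihood for each character."""
    like = np.full(len(posterior), p_rest)
    like[query] = p_flashed
    post = posterior * like
    return post / post.sum()

rng = np.random.default_rng(0)
posterior = np.full(26, 1 / 26)                      # uniform prior
target = ALPHABET.index('e')                         # user's intended letter
for _ in range(10):
    q = select_query(posterior, rng=rng)
    # simulated evidence: an ERP is elicited only when the target flashes
    if target in q:
        posterior = bayes_update(posterior, q, p_flashed=0.9, p_rest=0.1)
    else:
        posterior = bayes_update(posterior, q, p_flashed=0.1, p_rest=0.9)
print(posterior[target], posterior.max())
```

    The key property of this update is that, under the simulated evidence model, the intended character always receives the higher likelihood, so its posterior mass can only grow relative to the alternatives as iterations accumulate.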

  16. Creating a High-Touch Recruitment Event: Utilizing Faculty to Recruit and Yield Students

    ERIC Educational Resources Information Center

    Freed, Lindsey R.; Howell, Leanne L.

    2018-01-01

    The following article describes the planning and implementation of a university student recruitment event that produced a high yield of new students. Detailed descriptions of how staff and faculty worked together to plan and implement the event are provided.

  17. Third Quarter Hanford Seismic Report for Fiscal Year 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.

    2009-09-30

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 771 local earthquakes during the third quarter of FY 2009. Nearly all of these earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this quarter are a continuation of the swarm events observed during the January-March 2009 time period and reported in the previous quarterly report (Rohay et al., 2009). The frequency of Wooded Island events has subsided, with 16 events recorded during June 2009. Most of the events were considered minor (magnitude [Mc] less than 1.0), with 25 events in the 2.0-3.0 range. The estimated depths of the Wooded Island events are shallow (averaging less than 1.0 km deep), with a maximum depth estimated at 2.2 km. This places the Wooded Island events within the Columbia River Basalt Group (CRBG). The low magnitude of the Wooded Island events has made them undetectable to all but local area residents.
However, some Hanford employees working within a few miles of the area of highest activity, and individuals living in homes directly across the Columbia River from the swarm center, have reported feeling many of the larger magnitude events. The Hanford Strong Motion Accelerometer (SMA) network was triggered numerous times by the Wooded Island swarm events. The maximum acceleration value recorded by the SMA network was approximately 3 times lower than the reportable action level for Hanford facilities (2% g), and no action was required. The swarming is likely due to pressure that has built up, cracking the brittle basalt layers within the Columbia River Basalt Group (CRBG). Similar earthquake "swarms" were recorded near this same location in 1970, 1975, and 1988. Prior to the 1970s, swarming may have occurred, but equipment was not in place to record those events. Quakes of this limited magnitude do not pose a risk to Hanford cleanup efforts or waste storage facilities. Since swarms of the past did not intensify in magnitude, seismologists do not expect these events to increase in intensity. However, Pacific Northwest National Laboratory (PNNL) will continue to monitor the activity.

  18. Use of gastroprotection in patients discharged from hospital on nonsteroidal anti-inflammatory drugs.

    PubMed

    Coté, Gregory A; Norvell, John P; Rice, John P; Bulsiewicz, William J; Howden, Colin W

    2008-01-01

    Gastrointestinal (GI) hemorrhage is responsible for 200,000-400,000 hospitalizations in the United States annually. Nonsteroidal anti-inflammatory drugs (NSAIDs) are responsible for at least 30% of admissions due to GI hemorrhage. Misoprostol reduces the number of NSAID-related upper GI events, while proton pump inhibitors (PPIs) reduce the incidence of endoscopic ulcers. Our aim was to measure the utilization of GI prophylaxis in patients discharged from hospital on ulcerogenic medicines. We performed a medical record review of all 480 patients discharged from the medical service over a 3-month period on aspirin or nonaspirin NSAIDs. Use of gastroprotection was recorded, particularly among those patients not previously prescribed a PPI or misoprostol. Patients with a different indication for PPI therapy were excluded. In all, 480 patients were identified, and 142 were excluded. Of the 338 remaining patients, 154 (46%) were prescribed GI prophylaxis. In particular, 240 patients had not been receiving a PPI or misoprostol at the time of admission (gastroprotection naive); of these, 23.3% received a new prescription for GI prophylaxis at discharge. Use of gastroprotection increased among patients older than 60 years compared with those 60 years and younger (P = 0.008), but there was no difference among patients with higher baseline comorbidity or those receiving multiple agents of interest. Although hospitalization offers an opportunity to recognize patients at high risk of developing upper GI complications from NSAIDs, utilization of appropriate gastroprotection seemed suboptimal. Educational efforts directed at physicians may help them recognize risk factors for GI hemorrhage and current indications for prophylaxis.

  19. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    NASA Astrophysics Data System (ADS)

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.

    2016-04-01

    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities, and extensive land drainage practices significantly changes hydromorphological behaviour and alters sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics, but typically only for a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data, together with quantitative hysteresis metrics, enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km²) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable, and high-permeability supporting arable. Suspended sediment concentration (derived from calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet, and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), a numeric hysteresis index, and potential hydro-meteorological controls on sediment transport, including precipitation amount, duration and intensity, stream flow, and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index.
Results showed the numeric hysteresis index varied over time in all three catchments. The exact response was catchment-specific, reflecting changing sediment availability and connectivity through time as indicated by the dominant controls. In the low-permeability grassland catchment, proximal sources dominated, which was consistent with observations of active channel bank erosion. Seasonal increases in rainfall increased the erosion potential, but continuous grassland cover mitigated against hillslope sediment contributions despite high hydrological connectivity and surface pathways. The moderate-permeability arable catchment was dominated by events with a distal source component, but events with both proximal and distal sediment sources yielded the highest sediment quantities. These events were driven by rainfall parameters, suggesting sediments were surface-derived and the hillslope was hydrologically connected during most events. Through time, a sustained period of rainfall increased the magnitude of negative hysteresis, likely demonstrating increasing surface hydrological connectivity due to increased groundwater saturation. Where increased hydrological connectivity coincided with low groundcover, the largest sediment exports were recorded. Events in the high-permeability catchment indicated predominantly proximal sediments despite abundant distal sources from tilled fields. The infiltration-dominated, high-permeability soils hydrologically disconnected these field sources and limited sediment supply. However, the greatest sediment export occurred in this catchment, suggesting thresholds existed which, when exceeded during higher-magnitude events, resulted in efficient conveyance of sediments. Hysteresis analysis offers wider utility as a tool to understand sediment pathways and connectivity issues, with applications to catchment management strategies.
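
    A numeric hysteresis index of the kind used for such event databases can be computed by comparing suspended sediment concentration on the rising and falling limbs of the hydrograph at matched normalized discharges. The formulation below is one common variant (positive = clockwise, suggesting proximal sources; negative = anti-clockwise, suggesting distal sources); the exact metric used in the study may differ.

```python
import numpy as np

def hysteresis_index(q, c, n_points=20):
    """Mean rising-limb minus falling-limb normalized sediment
    concentration, sampled at matched normalized discharge levels."""
    peak = int(np.argmax(q))
    # normalize discharge and concentration to 0-1 over the event
    qn = (q - q.min()) / (q.max() - q.min())
    cn = (c - c.min()) / (c.max() - c.min()) if c.max() > c.min() else c * 0
    grid = np.linspace(0.05, 0.95, n_points)
    # interpolate concentration on the rising and falling limbs
    c_rise = np.interp(grid, qn[:peak + 1], cn[:peak + 1])
    c_fall = np.interp(grid, qn[peak:][::-1], cn[peak:][::-1])
    return float(np.mean(c_rise - c_fall))

# synthetic clockwise event: sediment concentration peaks before discharge
t = np.linspace(0, 1, 200)
q = np.exp(-((t - 0.5) ** 2) / 0.02)    # discharge peaks at t = 0.5
c = np.exp(-((t - 0.4) ** 2) / 0.02)    # SSC peaks earlier, at t = 0.4
print(hysteresis_index(q, c))           # positive, i.e. clockwise
```

    Sediment peaking before discharge (supply near the outlet, exhausted early) gives a positive index; sediment peaking after discharge (distant or delayed supply) gives a negative one.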

  20. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
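
    The conventional threshold detector the introduction refers to — turning a continuous recording into a point process of event times — can be sketched as follows. The sampling rate, threshold, and refractory (dead-time) window are illustrative values, not parameters from the paper.

```python
import numpy as np

def detect_events(signal, threshold, dead_time, fs):
    """Upward-threshold-crossing detector with a refractory window,
    returning event times in seconds for point-process analysis."""
    # indices where the signal crosses the threshold from below
    crossings = np.flatnonzero(
        (signal[1:] >= threshold) & (signal[:-1] < threshold)) + 1
    events, last = [], -np.inf
    for i in crossings:
        # enforce a minimum separation (dead time) between events
        if (i - last) / fs >= dead_time:
            events.append(i / fs)
            last = i
    return np.array(events)

fs = 1000.0                               # 1 kHz sampling (illustrative)
t = np.arange(0, 1, 1 / fs)
sig = np.zeros_like(t)
for spike_t in (0.1, 0.4, 0.7):           # three synthetic events
    sig[int(spike_t * fs)] = 1.0
events = detect_events(sig, threshold=0.5, dead_time=0.01, fs=fs)
print(events)                              # → [0.1 0.4 0.7]
```

    The resulting event-time sequence is the stationary point process on which interval statistics and the other analyses described above would then be performed.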
