The Future of the Perfusion Record: Automated Data Collection vs. Manual Recording
Ottens, Jane; Baker, Robert A.; Newland, Richard F.; Mazzone, Annette
2005-01-01
Abstract: The perfusion record, whether manually recorded or computer generated, is a legal representation of the procedure. The handwritten perfusion record has been the most common method of recording events that occur during cardiopulmonary bypass. This record contrasts sharply with the integrated data management systems available that provide continuous collection of data automatically or by means of a few keystrokes. Additionally, an increasing number of monitoring devices are available to assist in the management of patients on bypass. These devices are becoming more complex and provide more data for the perfusionist to monitor and record. Most of the data from these devices can be downloaded automatically into online data management systems, allowing more time for the perfusionist to concentrate on the patient while simultaneously producing a more accurate record. In this prospective report, we compared 17 cases that were recorded using both manual and electronic data collection techniques. The perfusionist in charge of the case recorded the perfusion using the manual technique while a second perfusionist entered relevant events on the electronic record generated by the Stockert S3 Data Management System/Data Bahn (Munich, Germany). Analysis of the two types of perfusion records showed significant variations in the recorded information. Areas that showed the most inconsistency included measurement of the perfusion pressures, flow, blood temperatures, cardioplegia delivery details, and the recording of events, with the electronic record superior in the integrity of the data. In addition, the limitations of the electronic system were shown by the lack of electronic gas flow data in our hardware. Our results confirm the importance of accurate methods of recording perfusion events. The use of an automated system provides the opportunity to minimize transcription error and bias. This study highlights the limitations of spot recording of perfusion events in the overall record keeping for perfusion management. PMID:16524151
Code of Federal Regulations, 2013 CFR
2013-10-01
... OF TRANSPORTATION EVENT DATA RECORDERS § 563.1 Scope. This part specifies uniform, national requirements for vehicles equipped with event data recorders (EDRs) concerning the collection, storage, and retrievability of onboard motor vehicle crash event data. It also specifies requirements for vehicle...
Using weather data to determine dry and wet periods relative to ethnographic records
NASA Astrophysics Data System (ADS)
Felzer, B. S.; Jiang, M.; Cheng, R.; Ember, C. R.
2017-12-01
Ethnographers record flood or drought events that affect a society's food supply and can be interpreted in terms of a society's ability to adapt to extreme events. Using daily weather station data from the Global Historical Climatology Network for wet events, and monthly gridded climatic data from the Climatic Research Unit for drought events, we determine if it is possible to relate these measured data to the ethnographic records. We explore several drought and wetness indices based on temperature and precipitation, as well as the Colwell method to determine the predictability, seasonality, and variability of these extreme indices. Initial results indicate that while it is possible to capture the events recorded in the ethnographic records, there are many more "false" captures of events that are not recorded in these records. Although extreme precipitation is a poor indicator of floods due to antecedent moisture conditions, even using streamflow for selected sites produces false captures. Relating drought indices to actual food supply, as measured by crop yield, succeeded in only half the cases, and then only for minimum crop yield. Further mismatches between extreme precipitation and drought indices and ethnographic records may reflect the fact that only extreme events that affect food supply are recorded in the ethnographic records, or that not all events are recorded by the ethnographers. We will present new results on how predictability measures relate to the ethnographic disasters. Despite the highlighted technical challenges, our results provide a historic perspective linking environmental stressors with socio-economic impacts, which, in turn, will underpin current efforts at risk assessment in a changing environment.
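The Colwell method mentioned above reduces a seasonal frequency table to predictability, constancy, and contingency indices. Below is a minimal sketch of the predictability component only, assuming natural logarithms and Colwell's (1974) formulation; the function name and the tiny state-by-month matrix are illustrative, not the paper's data:

```python
import math

def colwell_predictability(matrix):
    """Colwell's predictability P for a state-by-time frequency matrix.

    Rows are states (e.g. wetness classes), columns are time steps
    (e.g. months); each entry counts the years observed in that
    state during that time step. P = 1 - (H(XY) - H(X)) / log(s).
    """
    s = len(matrix)                              # number of states
    t = len(matrix[0])                           # number of time steps
    z = sum(sum(row) for row in matrix)          # grand total
    col = [sum(matrix[j][i] for j in range(s)) for i in range(t)]
    hx = -sum(c / z * math.log(c / z) for c in col if c)
    hxy = -sum(n / z * math.log(n / z) for row in matrix for n in row if n)
    return 1 - (hxy - hx) / math.log(s)

# A perfectly predictable regime: the state always follows the season
perfect = [[5, 0], [0, 5]]
print(colwell_predictability(perfect))  # → 1.0
```

A fully seasonal series scores 1; a series whose state is unrelated to the time of year scores near 0.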
Lee, Matthew J; Doody, Kevin; Mohamed, Khalid M S; Butler, Audrey; Street, John; Lenehan, Brian
2018-02-15
A 2011 study (Doody et al., Ir Med J 106(10):300-302, 2013) compared inpatient adverse events recorded prospectively at the point of care with adverse events recorded by the national Hospital In-Patient Enquiry (HIPE) system. In that study, a single-centre university hospital in Ireland treating acute hip fractures in an orthopaedic unit recorded 39 patients over a 2-month period (August-September 2011), with 55 adverse events recorded prospectively in contrast to the 13 (23.6%) adverse events in the HIPE record. With the recent change in the Irish hospital funding model from a block grant to 'activity-based funding' on the basis of case load and case complexity, the hospital financial allocation is dependent on accurate case-complexity coding. A retrospective assessment of the financial implications of the two methods of adverse incident recording was carried out. A total of €39,899 in 'missed funding' over 2 months was calculated when the ward-based, prospectively collected data were compared to the national HIPE data. Accurate data collection is paramount in facilitating activity-based funding, to improve patient care and ensure the appropriate allocation of resources.
Analysis of event data recorder data for vehicle safety improvement
DOT National Transportation Integrated Search
2008-04-01
The Volpe Center performed a comprehensive engineering analysis of Event Data Recorder (EDR) data supplied by the National Highway Traffic Safety Administration (NHTSA) to assess its accuracy and usefulness in crash reconstruction and improvement of ...
77 FR 59566 - Event Data Recorders
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-28
... DEPARTMENT OF TRANSPORTATION National Highway Traffic Safety Administration 49 CFR Part 563 [Docket No. NHTSA-2012-0099] RIN 2127-AL14 Event Data Recorders Correction In rule document 2012-19580.... 563.8 Data format [Corrected] On page 47557 in the table titled ``Table III--Reported Data Element...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 6 2010-10-01 2010-10-01 false Data capture. 563.9 Section 563.9 Transportation..., DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.9 Data capture. The EDR must capture and record the data elements for events in accordance with the following conditions and circumstances: (a) In a...
49 CFR 563.11 - Information in owner's manual.
Code of Federal Regulations, 2011 CFR
2011-10-01
... law enforcement, could combine the EDR data with the type of personally identifying data routinely... SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.11 Information in owner's... statement in English: This vehicle is equipped with an event data recorder (EDR). The main purpose of an EDR...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 6 2011-10-01 2011-10-01 false Data capture. 563.9 Section 563.9 Transportation..., DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.9 Data capture. Link to an amendment published at 76 FR 47489, Aug. 5, 2011. The EDR must capture and record the data elements for events in accordance...
SCADA data and the quantification of hazardous events for QMRA.
Nilsson, P; Roser, D; Thorwaldsdotter, R; Petterson, S; Davies, C; Signor, R; Bergstedt, O; Ashbolt, N
2007-01-01
The objective of this study was to assess the use of on-line monitoring to support the QMRA at water treatment plants studied in the EU MicroRisk project. SCADA data were obtained from three Catchment-to-Tap Systems (CTS) along with system descriptions, diary records, grab sample data and deviation reports. Particular attention was paid to estimating hazardous event frequency, duration and magnitude. Using Shewhart and CUSUM control charts, we identified 'change-points' corresponding to events of between 10 min and >1 month duration in time-series data. Our analysis confirmed it is possible to quantify hazardous event durations from turbidity, chlorine residual and pH records and distinguish them from non-hazardous variability in the time-series dataset. The durations of most 'events' were short-term (0.5-2.3 h). These data were combined with QMRA to estimate pathogen infection risk arising from such events as chlorination failure. While analysis of SCADA data alone could identify events provisionally, its interpretation was severely constrained in the absence of diary records and other system information. SCADA data analysis should only complement traditional water sampling, rather than replace it. More work on on-line data management, quality control and interpretation is needed before it can be used routinely for event characterization.
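A tabular CUSUM of the kind used for this change-point detection flags an event when the cumulative deviation from a target level exceeds a decision threshold. A minimal sketch, with hypothetical turbidity values and hand-picked `target`, `slack`, and `threshold` parameters (the study does not report its tuning):

```python
def cusum_events(values, target, slack, threshold):
    """Tabular CUSUM: return indices where the upper or lower
    cumulative sum exceeds the decision threshold."""
    hi = lo = 0.0
    events = []
    for i, x in enumerate(values):
        hi = max(0.0, hi + (x - target - slack))  # drift above target
        lo = max(0.0, lo + (target - slack - x))  # drift below target
        if hi > threshold or lo > threshold:
            events.append(i)
            hi = lo = 0.0  # restart after flagging a change-point
    return events

# Hypothetical turbidity trace (NTU): stable baseline, then a spike
trace = [0.2, 0.3, 0.2, 0.25, 0.3, 1.5, 1.8, 2.0, 0.3, 0.2]
print(cusum_events(trace, target=0.25, slack=0.1, threshold=1.0))
```

Event duration would then be read off as the span of consecutive flagged samples; in practice the target and slack are set from in-control baseline data.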
Real-world perceptions of emerging event data recorder (EDR) technologies
DOT National Transportation Integrated Search
2002-01-01
This research focuses on what college-age motorists perceive to be the positive and negative aspects of implementing on-board Event Data Recorders (EDRs) in the highway mode of transport. The achievements and findings offer safety researchers insi...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
... video event recorders by May 18, 2011. The Agency will evaluate any data submitted and, if adverse... the placement of video event recorders at the top of the windshields on commercial motor vehicles (CMVs). CMVs may continue to use the video event recorders to increase safety through (1) identification...
DOT National Transportation Integrated Search
2004-12-01
The U.S. DOT has conducted research on the requirements for a Crash Event Data Recorder to facilitate the reconstruction of commercial motor vehicle crashes. This report documents the work performed on the Development of Requirements and Functiona...
Lee, Matthew J; Mohamed, Khalid M S; Kelly, John C; Galbraith, John G; Street, John; Lenehan, Brian J
2017-09-01
In Ireland, funding of joint arthroplasty procedures has moved to a pay-by-results national tariff system. Typically, adverse clinical events are recorded via retrospective chart-abstraction methods by administrative staff. Missed or undocumented events not only affect the quality of patient care but also may unrealistically skew budgetary decisions that impact the fiscal viability of the service. Accurate recording confers clinical benefits and financial transparency. The aim of this study was to compare a prospectively implemented adverse events form with the current national retrospective chart-abstraction method in terms of pay-by-results financial implications. An adverse events form adapted from a similar validated model was used to prospectively record complications in 51 patients undergoing total hip or knee arthroplasties. Results were compared with the same cohort using an existing data-abstraction method. Both data sets were coded in accordance with current standards for case funding. Overall, 114 events were recorded during the study through prospective charting of adverse events, compared with 15 events documented by the customary method, a significant discrepancy. Wound drainage (15.8%) was the most common complication, followed by anemia (7.9%), lower respiratory tract infections (7.9%), and cardiac events (7%). A total of €61,956 ($67,778) in missed funding was calculated as a result. This pilot study demonstrates the ability to improve capture of adverse events through use of a well-designed assessment form. Proper perioperative data handling is a critical aspect of financial subsidies, enabling optimal allocation of funds. Copyright © 2017 Elsevier Inc. All rights reserved.
Simpao, Allan F; Pruitt, Eric Y; Cook-Sather, Scott D; Gurnaney, Harshad G; Rehman, Mohamed A
2012-12-01
Manual incident reports significantly under-report adverse clinical events when compared with automated recordings of intraoperative data. Our goal was to determine the reliability of AIMS and CQI reports of adverse clinical events that had been witnessed and recorded by research assistants. The AIMS and CQI records of 995 patients aged 2-12 years were analyzed to determine if anesthesia providers had properly documented the emesis events that were observed and recorded by research assistants who were present in the operating room at the time of induction. Research assistants recorded eight cases of emesis during induction that were confirmed with the attending anesthesiologist at the time of induction. AIMS yielded a sensitivity of 38 % (95 % confidence interval [CI] 8.5-75.5 %), while the sensitivity of CQI reporting was 13 % (95 % CI 0.3-52.7 %). The low sensitivities of the AIMS and CQI reports suggest that user-reported AIMS and CQI data do not reliably include significant clinical events.
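The reported sensitivities can be reproduced from the counts in the abstract. A small sketch, assuming the 38% AIMS figure corresponds to 3 of the 8 witnessed emesis events, and using a Wilson score interval for illustration (the paper's reported 8.5-75.5% interval is consistent with an exact binomial method, so the bounds below differ slightly):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = p + z ** 2 / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return (centre - margin) / denom, (centre + margin) / denom

# AIMS documented 3 of the 8 witnessed emesis events (3/8 = 37.5%)
low, high = wilson_ci(3, 8)
print(f"sensitivity {3 / 8:.0%}, 95% CI {low:.1%}-{high:.1%}")
```

With only 8 reference events, any binomial interval is very wide, which is why the abstract's confidence intervals span tens of percentage points.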
Video techniques and data compared with observation in emergency trauma care
Mackenzie, C; Xiao, Y
2003-01-01
Video recording is underused in improving patient safety and understanding performance shaping factors in patient care. We report our experience of using video recording techniques in a trauma centre, including how to gain cooperation of clinicians for video recording of their workplace performance, identify strengths of video compared with observation, and suggest processes for consent and maintenance of confidentiality of video records. Video records are a rich source of data for documenting clinician performance which reveal safety and systems issues not identified by observation. Emergency procedures and video records of critical events identified patient safety, clinical, quality assurance, systems failures, and ergonomic issues. Video recording is a powerful feedback and training tool and provides a reusable record of events that can be repeatedly reviewed and used as research data. It allows expanded analyses of time critical events, trauma resuscitation, anaesthesia, and surgical tasks. To overcome some of the key obstacles in deploying video recording techniques, researchers should (1) develop trust with video recorded subjects, (2) obtain clinician participation for introduction of a new protocol or line of investigation, (3) report aggregated video recorded data and use clinician reviews for feedback on covert processes and cognitive analyses, and (4) involve multidisciplinary experts in medicine and nursing. PMID:14645896
A substitution method to improve completeness of events documentation in anesthesia records.
Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis
2015-12-01
AIMS (anesthesia information management systems) are optimized to find and display data and curves for one specific intervention, but not for retrospective analysis across a large volume of interventions. Such systems present two main limitations: (1) a transactional database architecture and (2) incomplete documentation. Data warehouses were developed to address the architectural problem by providing a structure suitable for analysis; incompleteness of documentation, however, remains unsolved. In this paper, we describe a method for deriving substitution rules to detect missing anesthesia events in an anesthesia record. Our method is based on the principle that a missing event can be detected using a substitute, defined as the nearest documented event. As an example, we focused on automatic detection of the start and end of the anesthesia procedure when these events were not documented by clinicians. We applied our method to a set of records to evaluate (1) event detection accuracy and (2) the improvement in valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia. We also increased data completeness for the start and end of anesthesia events by 21.1% (from 80.3% to 97.2% of the total database). This method appears effective for replacing missing "start and end of anesthesia" events, and could also be used to replace other missing time events in this data warehouse as well as in other kinds of data warehouses.
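The substitution idea can be sketched as a table of rules, each mapping a missing event to its nearest documented surrogate plus an offset. Everything below (the event names, the offsets, the record layout) is hypothetical illustration of the principle, not the paper's actual rules:

```python
# Hypothetical event log for one anesthesia record; timestamps in minutes.
record = {
    "induction": 5,
    "incision": 20,
    "end_of_surgery": 95,
    # "start_of_anesthesia" and "end_of_anesthesia" are undocumented
}

# Substitution rules: (missing event, surrogate event, offset in minutes)
RULES = [
    ("start_of_anesthesia", "induction", -2),
    ("end_of_anesthesia", "end_of_surgery", +10),
]

def fill_missing(record, rules):
    """Impute each undocumented event from its nearest documented
    surrogate, leaving documented events untouched."""
    filled = dict(record)
    for missing, surrogate, offset in rules:
        if missing not in filled and surrogate in filled:
            filled[missing] = filled[surrogate] + offset
    return filled

print(fill_missing(record, RULES))
```

In a real data warehouse the offsets would be estimated from records where both events were documented, which is how the paper can report a precision for the imputed times.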
NASA Astrophysics Data System (ADS)
Dodds, S. F.; Mock, C. J.
2009-12-01
All available instrumental winter precipitation data for the Central Valley of California back to 1850 were digitized and analyzed to construct continuous time series. Many of these data, in paper or microfilm format, extend prior to modern National Weather Service Cooperative Data Program and Historical Climate Network data, and were recorded by volunteer observers from networks such as the US Army Surgeon General, Smithsonian Institution, and US Army Signal Service. Given the temporal incompleteness of individual records, detailed documentary data from newspapers, personal diaries and journals, ship logbooks, and weather enthusiasts’ instrumental data were used in conjunction with instrumental data to reconstruct precipitation frequency per month and season and continuous days of precipitation, and to identify anomalous precipitation events. Multilinear regression techniques, using surrounding stations and the relationships between modern and historical records, were used to bridge timeframes lacking data and to ensure the homogeneity of the time series. The metadata for each station were carefully screened, and notes were made about any possible changes to the instrumentation, relocation of instruments, or untrained observers, to verify that anomalous events were not recorded incorrectly. Precipitation in the Central Valley varies throughout the entire region, but waterways link the differing elevations and latitudes. This study integrates the individual station data with additional accounts of flood descriptions from unique newspaper and journal data. River heights and the extent of floods inundating cities, agricultural lands, and individual homes are often recorded within unique documentary sources, which add to the understanding of flood occurrence within this area. Comparisons were also made between dam and levee construction through time and how waters are diverted through cities in natural and anthropogenically changed environments.
Some precipitation events that led to flooding in the Central Valley from the mid-19th century through the early 20th century stand out more prominently at particular stations than anything in the modern record. Flood years included in the study are 1850, 1862, 1868, 1878, 1881, 1890, and 1907. These flood years were compared with the modern record and reconstructed through time series and maps. Incorporating the extent and effects of these anomalous events into future climate studies could improve models and preparedness for future floods.
Baker, Ruth; Tata, Laila J; Kendrick, Denise; Orton, Elizabeth
2016-02-01
English national injury data collection systems are restricted to hospitalisations and deaths. With recent linkage of a large primary care database, the Clinical Practice Research Datalink (CPRD), with secondary care and mortality data, we aimed to assess the utility of linked data for injury research and surveillance by examining recording patterns and comparing incidence of common injuries across data sources. The incidence of poisonings, fractures and burns was estimated for a cohort of 2 147 853 0-24 year olds using CPRD linked to Hospital Episode Statistics (HES) and Office for National Statistics (ONS) mortality data between 1997 and 2012. Time-based algorithms were developed to identify incident events, distinguishing between repeat follow-up records for the same injury and those for a new event. We identified 42 985 poisoning, 185 517 fracture and 36 719 burn events in linked CPRD-HES-ONS data; incidence rates were 41.9 per 10 000 person-years (95% CI 41.4 to 42.4), 180.8 (179.8-181.7) and 35.8 (35.4-36.1), respectively. Of the injuries, 22 628 (53%) poisonings, 139 662 (75%) fractures and 33 462 (91%) burns were only recorded within CPRD. Only 16% of deaths from poisoning (n=106) or fracture (n=58) recorded in ONS were recorded within CPRD and/or HES records. None of the 10 deaths from burns were recorded in CPRD or HES records. It is essential to use linked primary care, hospitalisation and deaths data to estimate injury burden, as many injury events are only captured within a single data source. Linked routinely collected data offer an immediate and affordable mechanism for injury surveillance and analyses of population-based injury epidemiology in England. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
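A time-based algorithm of the kind described distinguishes repeat follow-up records from genuinely new injury events by their spacing in time. A minimal sketch, assuming a hypothetical 90-day window (the study's actual windows are injury-specific and not given in the abstract):

```python
from datetime import date

def incident_events(dates, window_days=90):
    """Collapse records of the same injury type falling within
    `window_days` of the previous record into one incident event;
    anything further apart starts a new event."""
    events = []
    last = None
    for d in sorted(dates):
        if last is None or (d - last).days > window_days:
            events.append(d)  # a new incident event
        last = d              # rolling window from the latest record
    return events

# Hypothetical fracture records for one child: an event, two
# follow-up consultations, then a genuinely new fracture a year later.
records = [date(2010, 1, 5), date(2010, 1, 20), date(2010, 3, 1),
           date(2011, 2, 10)]
print(len(incident_events(records)))  # counts distinct incidents
```

The rolling window means a long run of closely spaced follow-ups is still counted as one event, which matches the goal of estimating incidence rather than consultation volume.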
NASA Astrophysics Data System (ADS)
Moran, S. C.; Malone, S. D.
2013-12-01
The May 18, 1980, eruption of Mount St. Helens (MSH) was an historic event, both for society and for the field of volcanology. However, our knowledge of the eruption and the precursory period leading up to it is limited by the fact that most of the data, particularly seismic recordings, were not kept due to severe limitations in the amount of digital data that could be handled and stored using 1980 computer technology. Because of these limitations, only about 900 digital event files have been available for seismic studies of the March-May seismic sequence out of a total of more than 4,000 events that were counted using paper records. Fortunately, data from a subset of stations were also recorded continuously on a series of 24 analog 14-track IRIG magnetic tapes. We have recently digitized these tapes and time-corrected and cataloged the resultant digital data streams, enabling more in-depth studies of the (almost) complete pre-eruption seismic sequence using modern digital processing techniques. Of the fifteen seismic stations operating near MSH for at least a part of the two months between March 20 and May 18, six stations have relatively complete analog recordings. These recordings have gaps of minutes to days because of radio noise, poor tape quality, or missing tapes. In addition, several other stations have partial records. All stations had short-period vertical-component sensors with very limited dynamic range and unknown response details. Nevertheless, because the stations were at a range of distances and were operated at a range of gains, a variety of earthquake sizes were recorded on scale by at least one station, and therefore a much more complete understanding of the evolution of event types, sizes and character should be achievable.
In our preliminary analysis of this dataset we have found over 10,000 individual events as recorded on stations 35-40 km from MSH, spanning a recalculated coda-duration magnitude range of ~1.5 to 4.1, including many M < 3.0 events that are not part of the PNSN catalog. The closest stations (2-7 km from the summit) recorded several times as many events as the more remote stations during the times they were operational, although many signals are clipped. We see a range of event types including long-period events, tremor, and occasional volcano-tectonic earthquakes. The latter group includes small volcano-tectonic events that occurred at depths of > 7 km during the crypto-dome intrusion phase, which were recognized in 1980 but not fully described. In our analysis of the hours to days prior to the May 18 eruption, we find no obvious changes in seismicity that could have been interpreted as a short-term precursor to the May 18 eruption initiation. This new dataset is currently being formatted for permanent archiving in the IRIS Data Management Center, where it will be available for anyone to use.
NASA Astrophysics Data System (ADS)
Ibsen, Maïa-Laura; Brunsden, Denys
1996-04-01
The purpose of this paper is to describe and evaluate the nature of the European historical archives which are suitable for the assessment of temporal occurrence and forecasting within landslide studies, using the British south coast as an example. The paper is based upon the British contribution to the Environment programme EPOCH, 1991-1993. A primary requirement of a research programme on process occurrence is to determine the event frequencies on as many time and space scales as possible. Thus, the analysis of archives is, potentially, an essential preliminary to the study of the temporal occurrence of landslide events. The range of such data sources extends from isolated, fortuitously dated sites from the Quaternary assemblage, through inferred event impacts using dendrochronology or lichenometric time series, to historical records of causal factors such as rainfall data and, more recently, deliberately recorded packages of cumulative or continuous data. Most countries have extensive historical sources which may be of considerable value in establishing the characteristics of geomorphological processes. These include narrative in literature, prints and other artwork, terrestrial and aerial photographs, remote sensing series, newspapers, incidental statements and scientific journals and reports. There are numerous difficulties in accessing, extracting, organising, databasing and analysing such data because they are not usually collated for scientific use. Problems involve such incalculable errors as: the experience, training and conscientiousness of the observer; the editing and recording process; judging the validity of the data used; and the haphazard nature of recorded events in time and space. Despite these difficulties, such data do yield a record which adds to the representative temporal sample as a level above some threshold reporting position. It therefore has potential for specific statistical analysis.
An example of a reasonable temporal landslide record is the database of the Ventnor complex on the Isle of Wight, initially established in 1991 by Geomorphological Services Limited (GSL), now of Rendel Geotechnics, and supplemented by the collections of the first author. The record displays an increase in landslide events over the present century, probably due to increasing technology, awareness of hazard, and the development of process geomorphology. However, the landslide record was subsequently correlated with the Ventnor precipitation series. This indicated that wet-year sequences usually gave rise to significant landslide events. The increasing variability and number of rainfall events predicted by various climate research units, e.g. the Hadley Centre, may therefore indicate a fundamental increase in landslide events in the future.
NASA Astrophysics Data System (ADS)
Railsback, L. Bruce; Liang, Fuyuan; Brook, G. A.; Voarintsoa, Ny Riavo G.; Sletten, Hillary R.; Marais, Eugene; Hardt, Ben; Cheng, Hai; Edwards, R. Lawrence
2018-04-01
The climatic event between 4.2 and 3.9 ka BP known as the "4.2 ka event" is commonly considered to be a synchronous global drought that happened as one pulse. However, careful comparison of records from around the world shows that synchrony is possible only if the published chronologies of the various records are shifted to the extent allowed by the uncertainties of their age data, that several records suggest a two-pulsed event, and that some records suggest a wet rather than dry event. The radiometric ages constraining those records have uncertainties of several decades if not hundreds of years, and in some records the event is represented by only one or two analyses. This paper reports a new record from Stalagmite DP1 from northeastern Namibia in which high 230Th/232Th activity ratios allow small age uncertainties of only 10 to 28 years, and the event is documented by more than 35 isotopic analyses and by petrographic observation of a surface of dissolution. The ages from Stalagmite DP1 combine with results from 11 other records from around the world to suggest an event centered at about 4.07 ka BP with bracketing ages of 4.15 to 3.93 ka BP. The isotopic and petrographic results suggest a two-pulsed wet event in northeastern Namibia, which is in the Southern Hemisphere's summer rainfall zone where more rain presumably fell with southward migration of the Inter-Tropical Convergence Zone as the result of cooling in the Northern Hemisphere. Comparison with other records from outside the region of dryness from the Mediterranean to eastern Asia suggests that multiple climatic zones similarly moved southward during the event, in some cases bringing wetter conditions that contradict the notion of global drought.
Readiness of the ATLAS detector: Performance with the first beam and cosmic data
NASA Astrophysics Data System (ADS)
Pastore, F.
2010-05-01
During 2008 the ATLAS experiment went through an intense period of preparation to have the detector fully commissioned for the first beam period. In about 30 h of beam time available to ATLAS in 2008 the systems went through a rapid setup sequence, from successfully recording the first bunch ever reaching ATLAS, to setting up the timing of the trigger system synchronous to the incoming single beams. The so-called splash events were recorded, where the beam was stopped on a collimator 140 m upstream of ATLAS, showering the experiment with millions of particles per beam shot. These events were found to be extremely useful for timing setup. After the stop of the beam operation, the experiment went through an extensive cosmic ray data taking campaign, recording more than 500 million cosmic ray events. These events have been used to make significant progress on the calibration and alignment of the detector. This paper describes the commissioning programme and the results obtained from both the single beam data and the cosmic data recorded in 2008.
EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.
Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan
2018-01-01
Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.
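The tensor encoding at the heart of this approach can be sketched in a few lines. The code below is an illustrative stand-in, not the paper's implementation: it encodes each event sequence as a time-window-by-event-type count matrix (the flattened slice of the sequence tensor EventThread builds); the function name, windowing scheme, and input format are all assumptions.

```python
from collections import Counter

def sequence_tensor(sequences, n_windows, event_types):
    """Encode each event sequence as a flattened time-window x
    event-type count matrix. Each sequence is a list of
    (time, event_type) pairs with time normalized to [0, 1)."""
    tensors = []
    for seq in sequences:
        counts = Counter()
        for t, e in seq:
            # clamp the last instant into the final window
            counts[(min(int(t * n_windows), n_windows - 1), e)] += 1
        tensors.append([counts[(w, e)] for w in range(n_windows)
                        for e in event_types])
    return tensors
```

Clustering these vectors (the paper uses tensor analysis) then groups similar but not identical sequence evolutions into threads.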
Stewart, C M; Newlands, S D; Perachio, A A
2004-12-01
Rapid and accurate discrimination of single units from extracellular recordings is a fundamental process for the analysis and interpretation of electrophysiological recordings. We present an algorithm that performs detection, characterization, discrimination, and analysis of action potentials from extracellular recording sessions. The program was entirely written in LabVIEW (National Instruments), and requires no external hardware devices or a priori information about action potential shapes. Waveform events are detected by scanning the digital record for voltages that exceed a user-adjustable trigger. Detected events are characterized to determine nine different time and voltage levels for each event. Various algebraic combinations of these waveform features are used as axis choices for 2-D Cartesian plots of events. The user selects axis choices that generate distinct clusters. Multiple clusters may be defined as action potentials by manually generating boundaries of arbitrary shape. Events defined as action potentials are validated by visual inspection of overlain waveforms. Stimulus-response relationships may be identified by selecting any recorded channel for comparison to continuous and average cycle histograms of binned unit data. The algorithm includes novel aspects of feature analysis and acquisition, including higher acquisition rates for electrophysiological data compared to other channels. The program confirms that electrophysiological data may be discriminated with high speed and efficiency using algebraic combinations of waveform features derived from high-speed digital records.
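A minimal sketch of the detection-and-characterization stage described above, written in Python rather than the paper's LabVIEW: events are triggered where the digital record exceeds a user-adjustable voltage, and each event is reduced to a few time/voltage features (a cut-down stand-in for the paper's nine features). All names and the refractory-window rule are illustrative assumptions.

```python
def detect_events(trace, trigger, window):
    """Return start indices where the voltage first exceeds the
    trigger level; crossings closer than `window` samples to a
    previous event are treated as part of the same waveform."""
    events = []
    last = -window
    for i, v in enumerate(trace):
        if v > trigger and i - last >= window:
            events.append(i)
            last = i
    return events

def waveform_features(trace, start, window):
    """Characterize one detected event by simple time/voltage
    features (a reduced stand-in for the nine used in the paper)."""
    w = trace[start:start + window]
    peak = max(w)
    trough = min(w)
    return {
        "peak": peak,
        "trough": trough,
        "peak_to_trough": peak - trough,
        "peak_index": start + w.index(peak),
    }
```

Algebraic combinations of these features would then serve as axes for the 2-D cluster plots the abstract describes.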
Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles
2007-01-01
Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203
Etgen, Thorleif; Hochreiter, Manfred; Mundel, Markus; Freudenberger, Thomas
2013-07-01
Atrial fibrillation (AF) is the most frequent risk factor in ischemic stroke but often remains undetected. We analyzed the value of an insertable cardiac event recorder in the detection of AF in a 1-year cohort of patients with cryptogenic ischemic stroke. All patients with cryptogenic stroke and eligibility for oral anticoagulation were offered the insertion of a cardiac event recorder. Regular follow-up for 1 year recorded the incidence of AF. Of the 393 patients with ischemic stroke, 65 (16.5%) had a cryptogenic stroke, and in 22 eligible patients, an event recorder was inserted. After 1 year, AF was detected in 6 of 22 patients (27.3%). These preliminary data show that insertion of a cardiac event recorder was feasible in approximately one third of patients with cryptogenic stroke and detected new AF in approximately one quarter of these patients.
Natawidjaja, D.H.; Sieh, K.; Ward, S.N.; Cheng, H.; Edwards, R. Lawrence; Galetzka, J.; Suwargadi, B.W.
2004-01-01
We utilize coral microatolls in western Sumatra to document vertical deformation associated with subduction. Microatolls are very sensitive to fluctuations in sea level and thus act as natural tide gauges. They record not only the magnitude of vertical deformation associated with earthquakes (paleoseismic data), but also continuously track the long-term aseismic deformation that occurs during the intervals between earthquakes (paleogeodetic data). This paper focuses on the twentieth century paleogeodetic history of the equatorial region. Our coral paleogeodetic record of the 1935 event reveals a classical example of deformations produced by seismic rupture of a shallow subduction interface. The site closest to the trench rose 90 cm, whereas sites further east sank by as much as 35 cm. Our model reproduces these paleogeodetic data with a 2.3 m slip event on the interface 88 to 125 km from the trench axis. Our coral paleogeodetic data reveal slow submergence during the decades before and after the event in the areas of coseismic emergence. Likewise, interseismic emergence occurred before and after the 1935 event in areas of coseismic submergence. Among the interesting phenomena we have discovered in the coral record is evidence of a large aseismic slip or "silent event" in 1962, 27 years after the 1935 event. Paleogeodetic deformation rates in the decades before, after, and between the 1935 and 1962 events have varied both temporally and spatially. During the 25 years following the 1935 event, submergence rates were dramatically greater than in prior decades. During the past four decades, however, rates have been lower than in the preceding decades, but are still higher than they were prior to 1935. These paleogeodetic records enable us to model the kinematics of the subduction interface throughout the twentieth century. Copyright 2004 by the American Geophysical Union.
Gorrell, Lindsay M; Engel, Roger M; Lystad, Reidar P; Brown, Benjamin T
2017-03-14
Reporting of adverse events in randomized clinical trials (RCTs) is encouraged by the authors of The Consolidated Standards of Reporting Trials (CONSORT) statement. With robust methodological design and adequate reporting, RCTs have the potential to provide useful evidence on the incidence of adverse events associated with spinal manipulative therapy (SMT). During a previous investigation, it became apparent that comprehensive search strategies combining text words with indexing terms were not sufficiently sensitive for retrieving records that were known to contain reports on adverse events. The aim of this analysis was to compare the proportion of articles containing data on adverse events associated with SMT that were indexed in MEDLINE and/or EMBASE and the proportion of those that included adverse event-related words in their title or abstract. A sample of 140 RCT articles previously identified as containing data on adverse events associated with SMT was used. Articles were checked to determine if: (1) they had been indexed with relevant terms describing adverse events in the MEDLINE and EMBASE databases; and (2) they mentioned adverse events (or any related terms) in the title or abstract. Of the 140 papers, 91% were MEDLINE records, 85% were EMBASE records, 81% were found in both MEDLINE and EMBASE records, and 4% were not in either database. Only 19% mentioned adverse event-related text words in the title or abstract. There was no significant difference between MEDLINE and EMBASE records in the proportion of available papers (p = 0.078). Of the 113 papers that were found in both MEDLINE and EMBASE records, only 3% had adverse event-related indexing terms assigned to them in both databases, while 81% were not assigned an adverse event-related indexing term in either database. While there was effective indexing of RCTs involving SMT in the MEDLINE and EMBASE databases, there was a failure of allocation of adverse event indexing terms in both databases. 
We recommend the development of standardized definitions and reporting tools for adverse events associated with SMT. Adequate reporting of adverse events associated with SMT will facilitate accurate indexing of these types of manuscripts in the databases.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... J2EE application that is platform independent and captures all information relating to Alternative Dispute Resolution case processing. It tracks, manages, and reports on all data, events, and procedures... records to indicate that it will be used: (1) To track, manage, and report on all data, events, and...
Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike
2013-01-01
In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to the fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days to months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
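The conventional trigger and the proposed extension can be sketched as follows. This is an illustrative reimplementation, not the authors' code: `sta_lta` computes the running short-term/long-term energy ratio per trace, and `similarity` measures the zero-lag normalized cross-correlation between the energy ratios of two traces, so that an event common to several stations yields values near 1. Window lengths and names are assumptions.

```python
def sta_lta(trace, ns, nl):
    """Running short-term-average / long-term-average energy ratio,
    the conventional trigger for passive seismic event detection.
    ns, nl: short and long window lengths in samples (ns < nl)."""
    ratios = []
    for i in range(nl, len(trace)):
        sta = sum(x * x for x in trace[i - ns:i]) / ns
        lta = sum(x * x for x in trace[i - nl:i]) / nl
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def similarity(ratios_a, ratios_b):
    """Zero-lag normalized cross-correlation between two energy-ratio
    traces; values near 1 suggest an event common to both stations."""
    n = min(len(ratios_a), len(ratios_b))
    a, b = ratios_a[:n], ratios_b[:n]
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da > 0 and db > 0 else 0.0
```

A regional event would produce high pairwise similarity across all stations; a local event, only within a subset.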
NASA Technical Reports Server (NTRS)
Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)
2007-01-01
A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.
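The trigger-then-record behaviour can be illustrated with a small software simulation. The patent describes hardware (a PLD); this Python class is only an analogy with invented names: nothing is stored while the device is passive, and on a triggering change of state only the sensor element index and timestamp are logged, which is how the data volume stays small.

```python
class TriggeredRecorder:
    """Software analogy of the patent's PLD behaviour: remain passive
    until a sensor element changes state, then record only which
    element fired and when (illustrative, not the patented logic)."""
    def __init__(self, n_sensors):
        self.previous = [0] * n_sensors
        self.log = []              # (sensor_index, timestamp) pairs

    def scan(self, readings, timestamp):
        for idx, (old, new) in enumerate(zip(self.previous, readings)):
            if new != old:         # triggering event: change of state
                self.log.append((idx, timestamp))
        self.previous = list(readings)
```

The logged pairs correspond to the "which sensor, at what time" information the PLD commits to memory for later download.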
Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias
2017-07-15
Recent technological advances have allowed the development of portable functional Near-Infrared Spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real-world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on the general linear model (GLM) least square fit analysis for the Automatic IDentification of functional Events (or AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof-of-principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the experimentally measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two different conditions respectively (condition 1: social-interact with a person; condition 2: non-social-interact with an object). AIDE managed to recover 3/4 events and 3/6 events for conditions 1 and 2 respectively. The identified functional events were then matched to behavioural data from the video recordings of the movements and actions of the participant. 
Our results suggest that "brain-first" rather than "behaviour-first" analysis is possible and that the present method can provide a novel solution to analyse real-world fNIRS data, filling the gap between real-life testing and functional neuroimaging. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
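The core idea of recovering event onsets directly from the measured signal can be sketched with simple template matching. Note this is a simplified stand-in for AIDE's GLM least-square fit: the template shape, function names, and the correlation criterion used below are assumptions, not the published method.

```python
def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (da * db) if da > 0 and db > 0 else 0.0

def best_onset(signal, template):
    """Slide a hemodynamic-response-like template over the measured
    signal and return the onset with the best fit (highest
    correlation) -- a toy analogue of fitting a GLM regressor at
    candidate onsets and keeping the best least-squares fit."""
    best, best_r = 0, -2.0
    for onset in range(len(signal) - len(template) + 1):
        r = pearson(signal[onset:onset + len(template)], template)
        if r > best_r:
            best, best_r = onset, r
    return best
```

In a real pipeline the recovered onsets would then be compared with behavioural annotations, as the abstract describes.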
The Monitoring Erosion of Agricultural Land and spatial database of erosion events
NASA Astrophysics Data System (ADS)
Kapicka, Jiri; Zizala, Daniel
2013-04-01
In 2011, the Monitoring Erosion of Agricultural Land project was launched in the Czech Republic as a joint project of the State Land Office (SLO) and the Research Institute for Soil and Water Conservation (RISWC). The aim of the project is to collect and keep records of erosion events on agricultural land and to evaluate them. The main idea is the creation of a spatial database that will serve as a source of data and information for evaluating and modeling erosion processes, for proposing preventive measures, and for measures to reduce the negative impacts of erosion events. The subjects of monitoring are manifestations of water erosion, wind erosion, and slope deformation that damage agricultural land. A website, available at http://me.vumop.cz, is used as a tool for keeping and browsing information about monitored events. SLO employees carry out the record keeping. RISWC is the specialist institute in the project: it maintains the spatial database, runs the website, manages the record keeping of events, analyzes the causes of events, and performs statistical evaluations of recorded events and proposed measures. Records are inserted into the database through the user interface of the website, which includes a map server. The website is built on the PostgreSQL database with the PostGIS extension and UMN MapServer. Each record in the database is spatially localized by a drawing and contains descriptive information about the character of the event (date, description of the situation, etc.), as well as information about land cover and the crops grown. Part of the database is photodocumentation taken during field reconnaissance, which is performed within two days of notification of an event. Another part comprises precipitation data from accessible precipitation gauges. The website allows simple spatial analyses such as area calculation, slope calculation, and the percentage representation of GAEC. 
The database structure was designed on the basis of a needs analysis of inputs to mathematical models, which are used for detailed analysis of selected erosion events, including soil analysis. By the end of 2012 the database contained 135 events. Its content continues to grow, giving rise to an extensive source of data usable for testing mathematical models.
NASA Astrophysics Data System (ADS)
Kolmasova, I.; Santolik, O.; Spurny, P.; Borovicka, J.; Mlynarczyk, J.; Popek, M.; Lan, R.; Uhlir, L.; Diendorfer, G.; Slosiar, R.
2017-12-01
We present observations of transient luminous events (TLEs) produced by a small-scale winter thunderstorm which occurred on 2 April 2017 in the southwest of Czechia. Elves, sprites and associated positive lightning strokes were simultaneously recorded by different observational techniques. Optical data include video recordings of TLEs from Nydek (Czechia) and data recorded by high time-resolution photometers at several stations of the Czech fireball network which measured the all-sky brightness originating from lightning return strokes. Electromagnetic data sets include 3-component VLF measurements conducted in Rustrel (France), 2-component ELF measurements recorded at the Hylaty station (Poland) and signal intensity variations of a VLF transmitter (DHO38, Rhauderfehn, Germany) recorded in Bojnice (Slovakia). Optical and electromagnetic data are complemented by positions and peak currents of all strokes recorded during the observed thunderstorm by the EUCLID lightning detection network. We focus our analysis on positive lightning discharges with high peak currents and we compare properties of those which produced a TLE with properties of discharges for which no TLE was detected. The current moment waveforms and charge moment changes associated with the TLE events are reconstructed from the ELF electromagnetic signals. Obtained current moment waveforms show excellent agreement with high time-resolution optical data.
USDA-ARS?s Scientific Manuscript database
The objectives of this research were to estimate variance components for 6 common health events recorded by producers on U.S. dairy farms, as well as investigate correlations with fitness traits currently used for selection. Producer-recorded health event data were available from Dairy Records Manag...
A new approach to data management and its impact on frequency control requirements
NASA Technical Reports Server (NTRS)
Blanchard, D. L.; Fuchs, A. J.; Chi, A. R.
1979-01-01
A new approach to data management consisting of spacecraft and data/information autonomy and its impact on frequency control requirements is presented. An autonomous spacecraft is capable of functioning without external intervention for up to 72 hr by enabling the sensors to make observations, by maintaining its health and safety, and by using logical safety modes when anomalies occur. Data/information are made autonomous by associating all relevant ancillary data such as time, position, attitude, and sensor identification with the data/information record of an event onboard the spacecraft. The record is constructed so that the event can be physically identified as a complete and self-contained record that is independent of all other data. All data within a packet will be time tagged to the needed accuracy, and the time markings from packet to packet will be coherent to a UTC time scale.
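The packet concept described above, each event record carrying its own ancillary data so it is interpretable in isolation, can be sketched as a small data structure. Field names below are illustrative, not from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class EventPacket:
    """Self-contained record of one onboard event: the observation
    plus all ancillary data needed to interpret it without reference
    to any other packet (illustrative field names)."""
    utc_time: float              # time tag, coherent to a UTC time scale
    position: tuple              # spacecraft position at the event
    attitude: tuple              # spacecraft attitude at the event
    sensor_id: str               # which sensor made the observation
    samples: list = field(default_factory=list)

    def is_self_contained(self):
        # Usable on its own only if every ancillary field is present.
        return all(v is not None for v in
                   (self.utc_time, self.position, self.attitude,
                    self.sensor_id))
```

Because each packet carries its own UTC time tag, downstream users never need to consult another packet to place the event in time.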
Predictive modeling of structured electronic health records for adverse drug event detection.
Zhao, Jing; Henriksson, Aron; Asker, Lars; Boström, Henrik
2015-01-01
The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. 
Feature selection leads to increased predictive performance for both data types, in isolation and combined. We have demonstrated how machine learning can be applied to electronic health records for the purpose of detecting adverse drug events and proposed solutions to some of the challenges this presents, including how to represent the various data types. Overall, clinical codes are more useful than measurements and, in specific cases, it is beneficial to combine the two.
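One simple way to combine the two data types discussed above is a bag-of-codes vector concatenated with raw measurements, with frequency-based pruning to reduce sparsity. The sketch below is illustrative only: the paper evaluates richer representations and selection methods, and the input format and names here are assumptions.

```python
from collections import Counter

def build_features(patients, min_count=2):
    """Represent each patient record as a binary bag of clinical
    codes plus raw measurements. Codes seen fewer than `min_count`
    times across the dataset are dropped -- a crude frequency-based
    feature selection to reduce dimensionality and sparsity."""
    counts = Counter(c for p in patients for c in p["codes"])
    vocab = sorted(c for c, n in counts.items() if n >= min_count)
    rows = []
    for p in patients:
        row = [1 if c in p["codes"] else 0 for c in vocab]
        row += p["measurements"]          # append measurement values
        rows.append(row)
    return vocab, rows
```

The resulting rows could feed any standard classifier for adverse-drug-event detection; the key design point is that codes and measurements live in one feature space.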
Lightning spectra at 100,000 fps
NASA Astrophysics Data System (ADS)
McHarg, M. G.; Harley, J.; Haaland, R. K.; Edens, H. E.; Stenbaek-Nielsen, H.
2016-12-01
A fundamental understanding of lightning can be inferred from the spectral emissions resulting from the leader and return stroke channel. We examine an event recorded at 00:58:07 on 19 July 2015 at Langmuir Laboratory. We recorded lightning spectra using a 100 line per mm grating in front of a Phantom V2010 camera with an 85mm Nikon lens recording at 100,000 frames per second. Coarse resolution spectra (approximately 5nm resolution) are produced from approximately 400 nm to 800 nm for each frame. Electric field data from the Langmuir Electric Field Array for the 03:19:19 event show 10 V/m changes in the electric field associated with multiple return strokes visible in the spectral data. We used the spectral data to compare temperatures at the top, middle and bottom of the lightning channel. Lightning Mapping Array data at Langmuir for the 00:58:07 event show a complex flash extending 10 km in the East-West plane and 6 km in the North-South plane. The imagery data imply that this is a bolt-from-the-blue event.
From IHE Audit Trails to XES Event Logs Facilitating Process Mining.
Paster, Ferdinand; Helm, Emmanuel
2015-01-01
Recently Business Intelligence approaches like process mining are applied to the healthcare domain. The goal of process mining is to gain process knowledge, compliance and room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE based information systems.
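The transformation step can be sketched with the standard library: flat audit-trail records are grouped into per-case XES traces. The record fields (`case`, `activity`, `timestamp`) are simplified stand-ins for what an ATNA audit message actually carries, and this is not the authors' implementation.

```python
import xml.etree.ElementTree as ET

def audit_to_xes(audit_events):
    """Group flat audit-trail records into per-case XES traces.
    Each record is a dict with 'case', 'activity' and 'timestamp'
    keys (simplified stand-ins for ATNA audit message fields)."""
    log = ET.Element("log", {"xes.version": "1.0"})
    traces = {}
    for ev in sorted(audit_events, key=lambda e: e["timestamp"]):
        trace = traces.get(ev["case"])
        if trace is None:
            trace = ET.SubElement(log, "trace")
            ET.SubElement(trace, "string",
                          {"key": "concept:name", "value": ev["case"]})
            traces[ev["case"]] = trace
        event = ET.SubElement(trace, "event")
        ET.SubElement(event, "string",
                      {"key": "concept:name", "value": ev["activity"]})
        ET.SubElement(event, "date",
                      {"key": "time:timestamp", "value": ev["timestamp"]})
    return log
```

The resulting element tree serializes to an XES log that process-mining tools can discover models from.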
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safe management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic information (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, continuously recorded time series. The time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms make automatic seismic detection difficult; in this work, an improved probabilistic autoregressive (AR) method for automatic seismic event detection is therefore applied. This algorithm, called sequentially discounting AR learning (SDAR), identifies effective seismic events in the time series through change point detection (CPD) on the seismic record. In this method, an anomalous signal (a seismic event) is modeled as a change point in the time series (the seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the event's occurrence. SDAR thus aims to find statistical irregularities in the record through CPD. SDAR has three advantages. (1) Noise robustness: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for signal detection, so it is an appropriate technique for low-SNR data. (2) Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for online processing. (3) Discounting: SDAR introduces a discounting parameter to decrease the influence of past statistics on future data, which makes it a robust algorithm for non-stationary signal processing. 
With these three advantages, the SDAR method can handle non-stationary, time-varying, long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
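A much-reduced sketch of the discounting idea: an online AR(1) learner whose statistics are exponentially down-weighted by a discounting parameter, so that a change point shows up as a burst of large prediction errors. The real SDAR uses higher-order AR models and probabilistic (log-loss) scores; everything below, including the clamping of the AR coefficient, is a toy assumption.

```python
def sdar_scores(series, r=0.05):
    """Online anomaly scores from a discounted AR(1) learner.
    r is the discounting parameter: larger r forgets the past faster.
    Returns one score per sample (absolute one-step prediction error)."""
    mean = series[0]
    c0 = c1 = 1e-6               # tiny priors for discounted moments
    a = 0.0                      # AR(1) coefficient estimate
    prev = series[0]
    scores = [0.0]
    for x in series[1:]:
        pred = mean + a * (prev - mean)   # predict with current model
        scores.append(abs(x - pred))
        # discount old statistics and fold in the new sample
        mean = (1 - r) * mean + r * x
        c0 = (1 - r) * c0 + r * (prev - mean) ** 2
        c1 = (1 - r) * c1 + r * (prev - mean) * (x - mean)
        a = max(-1.0, min(1.0, c1 / c0)) if c0 > 0 else 0.0
        prev = x
    return scores
```

Thresholding these scores gives a simple online change-point detector; because statistics are discounted, the learner re-adapts after each regime change, which is the property that suits non-stationary records.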
Ringler, A.T.; Gee, L.S.; Marshall, B.; Hutt, C.R.; Storm, T.
2012-01-01
Great earthquakes recorded across modern digital seismographic networks, such as the recent Tohoku, Japan, earthquake on 11 March 2011 (Mw = 9.0), provide unique datasets that ultimately lead to a better understanding of the Earth's structure (e.g., Pesicek et al. 2008) and earthquake sources (e.g., Ammon et al. 2011). For network operators, such events provide the opportunity to look at the performance across their entire network using a single event, as the ground motion records from the event will be well above every station's noise floor.
Code of Federal Regulations, 2013 CFR
2013-10-01
... frontal air bag deployment crash, capture and record the current deployment data. In a side or side curtain/tube air bag deployment crash, where lateral delta-V is recorded by the EDR, capture and record the current deployment data. The memory for the air bag deployment event must be locked to prevent any...
Code of Federal Regulations, 2014 CFR
2014-10-01
... frontal air bag deployment crash, capture and record the current deployment data. In a side or side curtain/tube air bag deployment crash, where lateral delta-V is recorded by the EDR, capture and record the current deployment data. The memory for the air bag deployment event must be locked to prevent any...
Code of Federal Regulations, 2012 CFR
2012-10-01
... frontal air bag deployment crash, capture and record the current deployment data. In a side or side curtain/tube air bag deployment crash, where lateral delta-V is recorded by the EDR, capture and record the current deployment data. The memory for the air bag deployment event must be locked to prevent any...
Strong-motion data from the two Pingtung, Taiwan, earthquakes of 26 December 2006
Wu, C.-F.; Lee, W.H.K.; Boore, D.M.
2008-01-01
1016 strong-motion records at 527 free-field stations and 131 records at 42 strong-motion arrays at buildings and bridges were obtained for the Pingtung earthquake doublet from the Taiwan Central Weather Bureau's dense, digital strong-motion network. We carried out standard processing of these strong-motion records at free-field stations. A data set, including the originally recorded files, processed data files, and supporting software and information, is archived online at http://tecdc.earth.sinica.edu.tw/data/EQ2006Pingtung/. We have not yet completed the processing of the strong-motion array data at buildings and bridges. However, some preliminary results and the strong-motion array data recorded at the second-nearest instrumented building to the Pingtung earthquake doublet are shown. This paper is intended to document our data processing procedures and the online archived data files, so that researchers can use the data efficiently. We also include two preliminary analyses: (1) a comparison of ground motions recorded by multiple accelerographs at a common site, the TAP 117 station in Taipei, and (2) attenuation of the horizontal ground motions (peak acceleration and response spectra at periods of 0.2, 1.0, and 3.0 s) with respect to distance. Our comparison study of multiple recordings at TAP 117 indicates that waveform coherence among 20- and 24-bit accelerograph records is much higher than among records from 16-bit or 12-bit accelerographs, suggesting that the former are of better quality. For the 20- and 24-bit accelerographs, waveform coherence is nearly 1 over the frequency range 1 to 8 Hz for all components, and is greater than about 0.9 from 8 to 20 Hz for the horizontal component, but only from 8 to 12 Hz for the vertical component. Plots of pseudo-acceleration response spectra (PSA) as a function of distance, however, show no clear indication of a difference related to the performance level of the accelerographs.
The ground motions of the first event (Mw = 7.0) are comparable to, or even somewhat lower than, those from the smaller second event (Mw = 6.9), consistent with the relative difference of the local magnitudes (ML = 6.96 and 6.99 for the first and second events, respectively). The ground motions from the first event are generally lower than those predicted from equations based on other in-slab subduction earthquakes, whereas the ground motions from the second event are closer to the predictions. Ground motions at soil sites are generally larger than those at rock sites.
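The waveform-coherence comparison described above can be reproduced with a standard magnitude-squared coherence estimate. The following is a minimal numpy sketch using Welch-style segment averaging, not the authors' exact processing; values near 1 mean two co-located recordings agree at that frequency.

```python
import numpy as np

def ms_coherence(x, y, fs, nperseg=256):
    """Magnitude-squared coherence of two records via segment averaging.

    Hann-windowed, half-overlapping segments; returns (frequencies, coherence).
    """
    step = nperseg // 2
    win = np.hanning(nperseg)
    Pxx = Pyy = Pxy = 0
    for start in range(0, len(x) - nperseg + 1, step):
        X = np.fft.rfft(win * x[start:start + nperseg])
        Y = np.fft.rfft(win * y[start:start + nperseg])
        Pxx = Pxx + (X * np.conj(X)).real   # auto-spectra accumulate
        Pyy = Pyy + (Y * np.conj(Y)).real
        Pxy = Pxy + X * np.conj(Y)          # cross-spectrum accumulates
    f = np.fft.rfftfreq(nperseg, 1 / fs)
    coh = np.abs(Pxy) ** 2 / (Pxx * Pyy + 1e-20)
    return f, coh
```

With a common signal buried in independent noise on the two channels, coherence is high only at the common signal's frequency, which is the property exploited when comparing 24-bit and 12-bit instruments at a shared site.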
NASA Astrophysics Data System (ADS)
Flynn, J. William; Goodfellow, Sebastian; Reyes-Montes, Juan; Nasseri, Farzine; Young, R. Paul
2016-04-01
Continuous acoustic emission (AE) data recorded during rock deformation tests facilitate the monitoring of fracture initiation and propagation due to applied stress changes. Changes in the frequency and energy content of AE waveforms have previously been observed and associated with microcrack coalescence and the induction or mobilisation of large fractures, which are naturally associated with larger-amplitude AE events and lower-frequency components. The shift from high to low dominant frequency components during the late stages of the deformation experiment, as the rate of AE events increases and the sample approaches failure, indicates a transition from the micro-cracking to the macro-cracking regime, where the large cracks generated result in material failure. The objective of this study is to extract information on the fracturing process from the acoustic records around sample failure, where the rapid succession of AE events does not allow individual AE events and phase arrivals to be identified. Standard AE event processing techniques are not suitable for extracting this information at these stages. Instead, the observed changes in the frequency content of the continuous record can be used to characterise and investigate the fracture process at the stage of microcrack coalescence and sample failure. To analyse and characterise these changes, a detailed non-linear and non-stationary time-frequency analysis of the continuous waveform data is required. Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) are two of the techniques used in this paper to analyse the acoustic records, providing a high-resolution temporal frequency distribution of the data. In this paper we present the results from our analysis of continuous AE data recorded during a laboratory triaxial deformation experiment using the combined EMD and HSA method.
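The Hilbert Spectral Analysis step can be illustrated on a single intrinsic mode function (IMF). In this sketch the EMD stage (splitting the record into IMFs) is assumed done upstream; the analytic signal is built by zeroing negative FFT frequencies (a standard Hilbert-transform construction), and the derivative of its unwrapped phase gives frequency versus time.

```python
import numpy as np

def instantaneous_frequency(imf, fs):
    """Instantaneous frequency (Hz) of one IMF via the analytic signal."""
    n = len(imf)
    X = np.fft.fft(imf)
    h = np.zeros(n)                  # spectral mask for the analytic signal
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    analytic = np.fft.ifft(X * h)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2 * np.pi)
```

Tracking this quantity through time on each IMF is what reveals the high-to-low dominant-frequency shift as the sample approaches failure.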
An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya
2016-04-01
Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs, and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-map application for hazard event mapping using Open Source WebGIS technologies such as the Postgres database, PostGIS, Leaflet, Cordova, and PhoneGap. The objectives of this prototype are: 1. an offline-online Android mobile application with advanced geospatial visualisation; 2. easy collection and storage of event information; 3. centralized data storage accessible from all services (smartphone, standard web browser); 4. improved data management through active participation in hazard event mapping and storage. This application has been implemented as a low-cost, rapid, and participatory method for recording impacts from hazard events and includes geolocation (GPS data and Internet), visualizing maps with overlay of satellite images, viewing uploaded images and events as cluster points, and drawing and adding event information. The data can be recorded offline (Android device) or online (any browser) and subsequently uploaded to the server whenever an internet connection is available. All events and records can be reviewed by an administrator and made public after approval. Different user levels can be defined to control access to the data for communicating the information.
This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazards such as flood, avalanche, etc. Keywords: Offline, Online, WebGIS Open source, Android, Hazard Event Mapping
Guinot, Guillaume; Adnet, Sylvain; Cappetta, Henri
2012-01-01
Modern selachians and their supposed sister group (hybodont sharks) have a long and successful evolutionary history. Yet, although selachian remains are considered relatively common in the fossil record in comparison with other marine vertebrates, little is known about the quality of their fossil record. Similarly, only a few works based on specific time intervals have attempted to identify major events that marked the evolutionary history of this group. Phylogenetic hypotheses concerning modern selachians' interrelationships are numerous but differ significantly, and no consensus has been reached. The aim of the present study is to take advantage of the range of recent phylogenetic hypotheses in order to assess the fit of the selachian fossil record to phylogenies, according to two different branching methods. Compilation of these data allowed the inference of an estimated range of diversity through time, and evolutionary events that marked this group over the past 300 Ma are identified. Results indicate that, with the exception of high taxonomic ranks (orders), the selachian fossil record is far from complete, particularly for genus-level and post-Triassic data. The timing and amplitude of the various identified events that marked the selachian evolutionary history are discussed. Some identified diversity events were mentioned in previous works using alternative methods (Early Jurassic, mid-Cretaceous, K/T boundary, and late Paleogene diversity drops), thus reinforcing the efficiency of the methodology presented here in inferring evolutionary events. Other events (Permian/Triassic, Early and Late Cretaceous diversifications; Triassic/Jurassic extinction) are newly identified. Relationships between these events and paleoenvironmental characteristics and other groups' evolutionary history are proposed.
Rescue, Archival and Discovery of Tsunami Events on Marigrams
NASA Astrophysics Data System (ADS)
Eble, M. C.; Wright, L. M.; Stroker, K. J.; Sweeney, A.; Lancaster, M.
2017-12-01
The Big Earth Data Initiative made possible the reformatting of paper marigram records on which were recorded measurements of the 1946, 1952, 1960, and 1964 tsunamis generated in the Pacific Ocean. Data contained within each record were determined to be invaluable for tsunami researchers and operational agencies with a responsibility for issuing warnings during a tsunami event. All marigrams were carefully digitized and metadata were generated to form numerical datasets in order to provide the tsunami and other research and application-driven communities with quality data. Data were then packaged as CF-compliant netCDF datafiles and submitted to the NOAA Centers for Environmental Information for long-term stewardship, archival, and public discovery of both original scanned images and data in digital netCDF and CSV formats. PNG plots of each time series were generated and included with the data packages to provide a visual representation of the numerical data sets. ISO-compliant metadata were compiled for the collection at the event level, and individual DOIs were minted for each of the four events included in this project. The procedure followed to reformat each record in this four-event subset of the larger NCEI scanned marigram inventory is presented and discussed. The practical use of these data is presented to highlight that even infrequent measurements of tsunamis hold information that may potentially help constrain earthquake rupture area, provide estimates of earthquake co-seismic slip distribution, identify subsidence or uplift, and significantly increase the holdings of in situ data available for tsunami model validation. These same data may also prove valuable to the broader global tide community for validation and further development of tide models and for investigation into the stability of tidal harmonic constants.
Data reformatted as part of this project are PARR compliant and meet the requirements for Data Management, Discoverability, Accessibility, Documentation, Readability, and Data Preservation and Stewardship as per the Big Earth Data Initiative.
Earthquake recording at the Stanford DAS Array with fibers in existing telecomm conduits
NASA Astrophysics Data System (ADS)
Biondi, B. C.; Martin, E. R.; Yuan, S.; Cole, S.; Karrenbach, M. H.
2017-12-01
The Stanford Distributed Acoustic Sensing Array (SDASA-1) has been continuously recording seismic data since September 2016 on 2.5 km of single mode fiber optics in existing telecommunications conduits under Stanford's campus. The array is figure-eight shaped and roughly 600 m along its widest side, with a channel spacing of roughly 8 m. This array is easy to maintain and is nonintrusive, making it well suited to urban environments, but it sacrifices some cable-to-ground coupling compared to more traditional seismometers. We have been testing its utility for earthquake recording, active seismic, and ambient noise interferometry. This talk will focus on earthquake observations. We will show comparisons between the strain rates measured throughout the DAS array and the particle velocities measured at the nearby Jasper Ridge Seismic Station (JRSC). In some of these events, we will point out directionality features specific to DAS that can require slight modifications in data processing. We also compare repeatability of DAS and JRSC recordings of blasts from a nearby quarry. Using existing earthquake databases, we have created a small catalog of DAS earthquake observations by pulling records of over 700 Northern California events spanning Sep. 2016 to Jul. 2017 from both the DAS data and JRSC. On these events we have tested common array methods for earthquake detection and location, including beamforming and STA/LTA analysis in time and frequency. We have analyzed these events to approximate thresholds on what distances and magnitudes are clearly detectable by the DAS array. Further analysis should be done on detectability with methods tailored to small events (for example, template matching). In creating this catalog, we have developed open source software available for free download that can manage large sets of continuous seismic data files (both existing files and files as they stream in).
This software can both interface with existing earthquake networks and efficiently extract earthquake recordings from the many continuous recordings saved on the user's machines.
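Of the detection methods mentioned, STA/LTA is simple enough to sketch directly. This is a generic short-term/long-term average ratio on squared amplitudes; the window lengths are illustrative defaults, not the values used on the Stanford array.

```python
import numpy as np

def sta_lta(x, fs, sta_win=0.5, lta_win=10.0):
    """Classic STA/LTA detector ratio, computed with cumulative sums.

    Returns a ratio trace (zero where the long window does not yet fit);
    a detection is typically declared when the ratio crosses a threshold.
    """
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    e = np.asarray(x, float) ** 2                    # sample energy
    c = np.concatenate(([0.0], np.cumsum(e)))
    idx = np.arange(nl, len(x) + 1)                  # window end positions
    sta = (c[idx] - c[idx - ns]) / ns                # short-term average
    lta = (c[idx] - c[idx - nl]) / nl                # long-term average
    ratio = np.zeros(len(x))
    ratio[nl - 1:] = sta / (lta + 1e-20)             # defined once both windows fit
    return ratio
```

On a noise record containing one high-amplitude burst, the ratio stays near 1 in the noise and spikes at the burst, which is the behavior used to build the event catalog described above.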
NASA Astrophysics Data System (ADS)
Haddleton, Graham P.
2001-04-01
In military research and development or testing there are various fast and dangerous events that need to be recorded and analyzed. High-speed cameras allow the capture of movement too fast to be recognized by the human eye, and provide data that is essential for the analysis and evaluation of such events. High-speed photography is often the only type of instrumentation that can be used to record the parameters demanded by our customers. I will show examples where this applied cinematography is used not only to provide a visual record of events, but also as an essential measurement tool.
Data Bookkeeping Service 3 - Providing Event Metadata in CMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giffels, Manuel; Guo, Y.; Riley, Daniel
The Data Bookkeeping Service 3 provides a catalog of event metadata for Monte Carlo and recorded data of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN, Geneva. It comprises all information necessary for tracking datasets, their processing history, and associations between runs, files, and datasets, on a large scale of about 200,000 datasets and more than 40 million files, which adds up to around 700 GB of metadata. The DBS is an essential part of the CMS Data Management and Workload Management (DMWM) systems [1]; all kinds of data processing, such as Monte Carlo production, processing of recorded event data, and physics analysis done by users, rely heavily on the information stored in DBS.
Plant microfossil record of the terminal Cretaceous event in the western United States and Canada
NASA Technical Reports Server (NTRS)
Nichols, D. J.; Fleming, R. F.
1988-01-01
Plant microfossils, principally pollen grains and spores produced by land plants, provide an excellent record of the terminal Cretaceous event in nonmarine environments. The record indicates regional devastation of the latest Cretaceous vegetation with the extinction of many groups, followed by a recolonization of the earliest Tertiary land surface, and development of a permanently changed land flora. The regional variations in depositional environments, plant communities, and paleoclimates provide insight into the nature and effects of the event, which were short-lived but profound. The plant microfossil data support the hypothesis that an abruptly initiated, major ecological crisis occurred at the end of the Cretaceous. Disruption of the Late Cretaceous flora ultimately contributed to the rise of modern vegetation. The plant microfossils together with geochemical and mineralogical data are consistent with an extraterrestrial impact having been the cause of the terminal Cretaceous event.
NASA Astrophysics Data System (ADS)
Corman, J. R.; Loken, L. C.; Oliver, S. K.; Collins, S.; Butitta, V.; Stanley, E. H.
2017-12-01
Extreme events can play powerful roles in shifting ecosystem processes. In lakes, heavy rainfall can transport large amounts of particulates and dissolved nutrients into the water column and, potentially, alter biogeochemical cycling. However, the impacts of extreme rainfall events are often difficult to study due to a lack of long-term records. In this paper, we combine daily discharge records with long-term lake water quality information collected by the North Temperate Lakes Long-Term Ecological Research (NTL LTER) site to investigate the impacts of extreme events on nutrient cycling in lakes. We focus on Lake Mendota, an urban lake within the Yahara River Watershed in Madison, Wisconsin, USA, where nutrient data are available at least seasonally from 1995 - present. In June 2008, precipitation amounts in the Yahara watershed were 400% above normal values, triggering the largest discharge event on record for the 40 years of monitoring at the streamgage station; hence, we are able to compare water quality records before and after this event as a case study of how extreme rain events couple or decouple lake nutrient cycling. Following the extreme event, the lake-wide mass of nitrogen and phosphorus increased in the summer of 2008 by 35% and 21%, respectively, shifting lake stoichiometry by increasing N:P ratios (Figure 1). Nitrogen concentrations remained elevated longer than phosphorus, suggesting (1) that nitrogen inputs into the lake were sustained longer than phosphorus (i.e., a "smear" versus "pulse" loading of nitrogen versus phosphorus, respectively, in response to the extreme event) and/or (2) that in-lake biogeochemical processing was more efficient at removing phosphorus compared to nitrogen. 
While groundwater loading data are currently unavailable to test the former hypothesis, preliminary data from surficial nitrogen and phosphorus loading to Lake Mendota (available for 2011-2013) suggest that nitrogen removal efficiency is lower than that of phosphorus, supporting the latter hypothesis. As climate change is expected to increase the frequency of extreme events, continued monitoring of lakes is needed to understand biogeochemical responses and when and how water quality threats may occur.
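The stoichiometric shift described (unequal relative increases in N and P mass raising the N:P ratio) follows directly from the arithmetic. A tiny illustration with made-up masses, not the paper's data:

```python
def molar_np_ratio(n_mass_kg, p_mass_kg):
    """Molar N:P ratio from elemental masses (atomic weights 14.007, 30.974)."""
    return (n_mass_kg / 14.007) / (p_mass_kg / 30.974)

# Hypothetical pre-event masses; a +35 % N and +21 % P increase (the
# percentages reported above) necessarily raises the molar N:P ratio.
before = molar_np_ratio(100.0, 10.0)
after = molar_np_ratio(100.0 * 1.35, 10.0 * 1.21)
assert after > before
```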
Sensor-Generated Time Series Events: A Definition Language
Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan
2012-01-01
There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an event definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid (that is, generally applicable and accurate) for identifying the events contained in the time series.
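The core notion of an event as a region of interest in a sensor time series can be sketched with a simple threshold-and-duration rule. The actual language proposed in the paper is far more expressive; this is only a schematic illustration of the kind of predicate such a language would compile down to.

```python
import numpy as np

def find_events(x, threshold, min_len=3):
    """Locate events in a sensor time series, a minimal sketch.

    An event is a maximal run of samples whose absolute deviation from the
    series median exceeds ``threshold``, lasting at least ``min_len``
    samples.  Returns (start, end) index pairs, end exclusive.
    """
    dev = np.abs(np.asarray(x, float) - np.median(x))
    above = np.concatenate(([False], dev > threshold, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    starts, ends = edges[::2], edges[1::2]
    return [(int(s), int(e)) for s, e in zip(starts, ends) if e - s >= min_len]
```

In the posturography setting, such predicates would be evaluated over each of the four interrelated pressure time series, with domain-specific thresholds.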
Astronaut Health Participant Summary Application
NASA Technical Reports Server (NTRS)
Johnson, Kathy; Krog, Ralph; Rodriguez, Seth; Wear, Mary; Volpe, Robert; Trevino, Gina; Eudy, Deborah; Parisian, Diane
2011-01-01
The Longitudinal Study of Astronaut Health (LSAH) Participant Summary software captures data based on a custom information model designed to gather all relevant, discrete medical events for its study participants. This software provides a summarized view of the study participant's entire medical record. The manual collapsing of all the data in a participant's medical record into a summarized form eliminates redundancy, and allows for the capture of entire medical events. The coding tool could be incorporated into commercial electronic medical record software for use in areas like public health surveillance, hospital systems, clinics, and medical research programs.
Chaudoin, Ambre L.; Feuerbacher, Olin; Bonar, Scott A.; Barrett, Paul J.
2015-01-01
The monitoring of threatened and endangered fishes in remote environments continues to challenge fisheries biologists. The endangered Devils Hole Pupfish Cyprinodon diabolis, which is confined to a single warm spring in Death Valley National Park, California–Nevada, has recently experienced record declines, spurring renewed conservation and recovery efforts. In February–December 2010, we investigated the timing and frequency of spawning in the species' native habitat by using three survey methods: underwater videography, above-water videography, and in-person surveys. Videography methods incorporated fixed-position, solar-powered cameras to record continuous footage of a shallow rock shelf that Devils Hole Pupfish use for spawning. In-person surveys were conducted from a platform placed above the water's surface. The underwater camera recorded more overall spawning throughout the year (mean ± SE = 0.35 ± 0.06 events/sample) than the above-water camera (0.11 ± 0.03 events/sample). Underwater videography also recorded more peak-season spawning (March: 0.83 ± 0.18 events/sample; April: 2.39 ± 0.47 events/sample) than above-water videography (March: 0.21 ± 0.10 events/sample; April: 0.9 ± 0.32 events/sample). Although the overall number of spawning events per sample did not differ significantly between underwater videography and in-person surveys, underwater videography provided a larger data set with much less variability than data from in-person surveys. Fixed videography was more cost efficient than in-person surveys ($1.31 versus $605 per collected data-hour), and underwater videography provided more usable data than above-water videography. Furthermore, video data collection was possible even under adverse conditions, such as the extreme temperatures of the region, and could be maintained successfully with few study site visits.
Our results suggest that self-contained underwater cameras can be efficient tools for monitoring remote and sensitive aquatic ecosystems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 CFR Part 563 (Event Data Recorders), § 563.7 Data elements. (a) Data elements required for all vehicles. Each vehicle equipped with an EDR must record all of the data elements listed in Table I, during...
NASA Astrophysics Data System (ADS)
Nakatani, T.; Inamura, Y.; Moriyama, K.; Ito, T.; Muto, S.; Otomo, T.
Neutron scattering can be a powerful probe in the investigation of many phenomena in the materials and life sciences. The Materials and Life Science Experimental Facility (MLF) at the Japan Proton Accelerator Research Complex (J-PARC) is a leading center of experimental neutron science and boasts one of the most intense pulsed neutron sources in the world. The MLF currently has 18 experimental instruments in operation that support a wide variety of users from across a range of research fields. The instruments include optical elements, sample environment apparatus and detector systems that are controlled and monitored electronically throughout an experiment. Signals from these components and those from the neutron source are converted into a digital format by the data acquisition (DAQ) electronics and recorded as time-tagged event data in the DAQ computers using "DAQ-Middleware". Operating in event mode, the DAQ system produces extremely large data files (~GB) under various measurement conditions. Simultaneously, the measurement meta-data indicating each measurement condition is recorded in XML format by the MLF control software framework "IROHA". These measurement event data and meta-data are collected in the MLF common storage and cataloged by the MLF Experimental Database (MLF EXP-DB) based on a commercial XML database. The system provides a web interface for users to manage and remotely analyze experimental data.
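The advantage of event-mode recording, as opposed to histogramming in hardware, is that binning decisions are deferred to analysis time: every detector hit is kept as a time-tagged record. A schematic illustration of post-hoc binning (field names and units are ours, not MLF's):

```python
import numpy as np

def histogram_events(time_tags_us, bin_width_us, t_max_us):
    """Bin time-tagged events into a time-of-flight style histogram.

    Because the raw event list is preserved, the same data can be re-binned
    with any width, or filtered by detector or sample condition, after the
    measurement ends.
    """
    edges = np.arange(0.0, t_max_us + bin_width_us, bin_width_us)
    counts, _ = np.histogram(time_tags_us, bins=edges)
    return edges, counts
```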
Hoffmire, Claire; Stephens, Brady; Morley, Sybil; Thompson, Caitlin; Kemp, Janet; Bossarte, Robert M
2016-11-01
The US Department of Veterans Affairs' Suicide Prevention Applications Network (SPAN) is a national system for suicide event tracking and case management. The objective of this study was to assess data on suicide attempts among people using Veterans Health Administration (VHA) services. We assessed the degree of data overlap on suicide attempters reported in SPAN and the VHA's medical records from October 1, 2010, to September 30, 2014: overall, by year, and by region. Data on suicide attempters in the VHA's medical records consisted of diagnoses documented with E95 codes from the International Classification of Diseases, Ninth Revision. Of 50,518 VHA patients who attempted suicide during the 4-year study period, data on fewer than half (41%) were reported in both SPAN and the medical records; nearly 65% of patients whose suicide attempt was recorded in SPAN had no data on attempted suicide in the VHA's medical records. Evaluation of administrative data suggests that use of SPAN substantially increases the collection of data on suicide attempters as compared with the use of medical records alone, but neither SPAN nor the VHA's medical records identify all suicide attempters. Further research is needed to better understand the strengths and limitations of both systems and how best to combine information across systems.
Identification of unusual events in multi-channel bridge monitoring data
NASA Astrophysics Data System (ADS)
Omenzetter, Piotr; Brownjohn, James Mark William; Moyo, Pilate
2004-03-01
Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to replace visual inspection for assessment of condition and soundness of civil infrastructure such as bridges. However, converting large amounts of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundation, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localising sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones, the wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.
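The outlier-detection step can be sketched with a Mahalanobis-distance score on first-differenced multi-sensor data. The differencing here is a crude stand-in for the wavelet detail coefficients the paper uses to isolate abrupt changes; everything else about this sketch (names, regularization) is our own.

```python
import numpy as np

def abrupt_event_scores(strains):
    """Outlier scores for multi-sensor strain records, a simplified sketch.

    ``strains`` has shape (time, sensors).  Each time step is scored by the
    squared Mahalanobis distance of its first-difference vector, so a
    sudden change common to several sensors stands out against slow drift.
    """
    d = np.diff(np.asarray(strains, float), axis=0)   # (time-1, sensors)
    mu = d.mean(axis=0)
    cov = np.cov(d, rowvar=False) + 1e-9 * np.eye(d.shape[1])
    inv = np.linalg.inv(cov)
    c = d - mu
    return np.einsum('ij,jk,ik->i', c, inv, c)        # squared distances
```

A simultaneous step change across the sensors produces the largest score at the step, which is the multivariate-outlier behavior the method relies on to flag events such as sudden settlement.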
Juillard, Catherine; Kouo Ngamby, Marquise; Ekeke Monono, Martin; Etoundi Mballa, Georges Alain; Dicker, Rochelle A; Stevens, Kent A; Hyder, Adnan A
2017-12-01
Road traffic injury surveillance systems are a cornerstone of organized efforts at injury control. Although high-income countries rely on established trauma registries and police databases, it is unclear which data source best captures road traffic injury events in low- and middle-income countries without mature surveillance systems. The objective of this study was to compare the information available on road traffic injuries in 3 data sources used for surveillance in the sub-Saharan African country of Cameroon, providing potential insight on data sources for road traffic injury surveillance in low- and middle-income countries. We assessed the number of events captured and the information available in Yaoundé, Cameroon, from 3 separate sources of data on road traffic injuries: trauma registry, police records, and newspapers. Data were collected from a single-hospital trauma registry, police records, and the 6 most widely circulated newspapers in Yaoundé during a 6-month period in 2009. The number of road traffic injury events, mortality, and other variables included commonly in injury surveillance systems were recorded. We compared these sources using descriptive analysis. Hospital, police, and newspaper sources recorded 1,686, 273, and 480 road traffic injuries, respectively. The trauma registry provided the most complete data for the majority of variables explored; however, the newspaper data source captured 2 mass-casualty train crash events unrecorded in the other sources. Police data provided the most complete information on first responders to the scene, missing in only 7% of records. Investing in the hospital-based trauma registry may yield the best surveillance for road traffic injuries in some low- and middle-income countries, such as Yaoundé, Cameroon; however, police and newspaper reports may serve as alternative data sources when specific information is needed.
Optimal filter parameters for low SNR seismograms as a function of station and event location
NASA Astrophysics Data System (ADS)
Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.
1999-06-01
Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few useable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method to provide optimal filters for low SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δf constant (constant Q). The SNR is calculated from the pre-event noise window and the signal window. The band-pass signals with high SNR are used to indicate the cutoff limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low SNR events. The method provides an optimum filter which can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
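The band-selection idea can be sketched as follows: split the record into constant-Q bands, compute per-band SNR from a pre-event noise window and a signal window, and keep the band with the highest SNR. This is a rough illustration; the method described above selects cutoff limits from all high-SNR bands rather than a single best band, and all names and defaults here are ours.

```python
import numpy as np

def optimal_band(x, fs, noise_end, sig_start, sig_end, q=4.0, fmin=0.5, snr_min=2.0):
    """Pick band-pass corners from per-band SNR, a rough sketch.

    Bands have constant centre/width ratio (constant Q); SNR is the RMS of
    the signal window over the RMS of the pre-event noise window in each
    band.  Returns (f_low, f_high) of the best band, or None if no band
    clears ``snr_min``.
    """
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    bands, fc = [], fmin
    while fc * (1 + 0.5 / q) < fs / 2:
        lo, hi = fc * (1 - 0.5 / q), fc * (1 + 0.5 / q)
        mask = (freqs >= lo) & (freqs < hi)
        xb = np.fft.irfft(np.where(mask, X, 0), n=len(x))   # band-pass by FFT masking
        noise = np.sqrt(np.mean(xb[:noise_end] ** 2))       # pre-event RMS
        sig = np.sqrt(np.mean(xb[sig_start:sig_end] ** 2))  # signal-window RMS
        bands.append((lo, hi, sig / (noise + 1e-20)))
        fc *= 1 + 1 / q                                     # geometric centre spacing
    lo, hi, snr = max(bands, key=lambda b: b[2])
    return (lo, hi) if snr > snr_min else None
```

For a record whose signal window contains a narrowband arrival buried in broadband noise, the returned corners bracket the arrival's frequency, mirroring the automated cutoff selection described in the abstract.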
ERIC Educational Resources Information Center
Rapp, John T.; Carroll, Regina A.; Stangeland, Lindsay; Swanson, Greg; Higgins, William J.
2011-01-01
The authors evaluated the extent to which interobserver agreement (IOA) scores, using the block-by-block method for events scored with continuous duration recording (CDR), were higher when the data from the same sessions were converted to discontinuous methods. Sessions with IOA scores of 89% or less with CDR were rescored using 10-s partial…
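For reference, the block-by-block IOA computation mentioned above follows a standard formula: score each block as the smaller of the two observers' scores divided by the larger, then average across blocks. The function below is a generic sketch of that formula, not the authors' implementation.

```python
def block_by_block_ioa(obs1, obs2):
    """Block-by-block interobserver agreement: per block, the smaller of
    the two observers' scores divided by the larger (1.0 when both are
    zero); the session IOA is the mean across blocks, as a percentage.
    obs1/obs2 are per-block durations (or counts) from two observers."""
    assert len(obs1) == len(obs2), "observers must score the same blocks"
    ratios = []
    for a, b in zip(obs1, obs2):
        ratios.append(1.0 if a == b == 0 else min(a, b) / max(a, b))
    return 100.0 * sum(ratios) / len(ratios)
```

For example, observers scoring (5, 0, 10) and (5, 0, 5) seconds across three blocks agree on two blocks and half-agree on the third, giving an IOA of about 83.3%.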
NASA Astrophysics Data System (ADS)
Tibi, R.; Young, C. J.; Koper, K. D.; Pankow, K. L.
2017-12-01
Seismic event discrimination methods exploit the differing characteristics, in terms of amplitude and/or frequency content, of the generated seismic phases among the event types to be classified. Most of the commonly used seismic discrimination methods are designed for regional data recorded at distances of about 200 to 2000 km. Relatively little attention has focused on discriminants for local distances (< 200 km), the range at which the smallest events are recorded. Short-period fundamental mode Rayleigh waves (Rg) are commonly observed on seismograms of man-made seismic events and shallow, naturally occurring tectonic earthquakes recorded at local distances. We leverage the well-known observation that Rg amplitude decreases dramatically with increasing event depth to propose a new depth discriminant based on Rg-to-Sg spectral amplitude ratios. The approach is successfully used to discriminate shallow events from deeper tectonic earthquakes in the Utah region recorded at local distances (< 150 km) by the University of Utah Seismographic Stations (UUSS) regional seismic network. Using Mood's median test, we obtained probabilities of nearly zero that the median Rg-to-Sg spectral amplitude ratios are the same between shallow events (including both shallow tectonic earthquakes and man-made events) and deeper earthquakes, indicating a statistically significant difference in the estimated Rg-to-Sg ratios between the two populations. We also observed consistent disparities between the different types of shallow events (e.g., explosions vs. mining-induced events), implying that it may be possible to separate the sub-populations that make up this group. This suggests that local-distance Rg-to-Sg spectral amplitude ratios can not only discriminate shallow from deeper events but may also separate different populations of shallow events.
We also experimented with Pg-to-Sg amplitude ratios in multi-frequency linear discriminant functions to classify man-made events and tectonic earthquakes in Utah. Initial results are very promising, showing probabilities of misclassification of only 2.4-14.3%.
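The statistical step described above (Mood's median test on Rg-to-Sg spectral amplitude ratios) can be reproduced with `scipy.stats.median_test`. The populations below are synthetic stand-ins for the Utah data, with made-up lognormal parameters chosen only to illustrate the workflow.

```python
import numpy as np
from scipy.stats import median_test

def rg_sg_ratio(rg_amp, sg_amp):
    # log10 spectral amplitude ratio used as the discriminant feature
    return np.log10(np.asarray(rg_amp) / np.asarray(sg_amp))

# Synthetic stand-ins: shallow events carry strong Rg relative to Sg,
# deeper earthquakes weak Rg (lognormal parameters are made up).
rng = np.random.default_rng(1)
shallow = rg_sg_ratio(rng.lognormal(0.5, 0.3, 200), rng.lognormal(0.0, 0.3, 200))
deep = rg_sg_ratio(rng.lognormal(-0.5, 0.3, 200), rng.lognormal(0.0, 0.3, 200))

# Mood's median test: is the median ratio the same in both populations?
stat, p_value, grand_median, table = median_test(shallow, deep)
```

With well-separated populations the returned `p_value` is effectively zero, matching the qualitative result reported above.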
NASA Astrophysics Data System (ADS)
Halliday, I.; Blackwell, A. T.; Griffin, A. A.
1989-04-01
Photographic records of the meteoritic fireballs observed between 1971 and 1985 with the Canadian camera network were used to obtain essential data for those events that are believed to have dropped significant meteorites. The mass of the largest surviving fragment for each event was estimated from the dynamic data near the end of the photographic trail. In 28 of these events, the mass of the largest fragment was found to be larger than 0.5 kg; these included the recovered Innisfree meteorite and three events with mass of about 10 kg. Sixteen events had mass estimates from 0.1 to 0.5 kg. Twelve other events were in the smaller mass range. Data are presented on height, velocity, brightness, ground location, and the orbit for each of the 44 events in the mass range 0.1-10 kg. Special attention is given to the probable degree of fragmentation during flight and the effects of flight geometry and the upper atmospheric winds on the expected 'ellipse of fall' distribution on the ground.
Properties of Repetitive Long-Period Seismicity at Villarrica Volcano, Chile
NASA Astrophysics Data System (ADS)
Richardson, J.; Waite, G. P.; Palma, J.; Johnson, J. B.
2011-12-01
Villarrica Volcano, Chile hosts a persistent lava lake and is characterized by degassing and long-period seismicity. In order to better understand the relationship between outgassing and seismicity, we recorded broadband seismic and acoustic data along with high-rate SO2 emission data. We used both a densely spaced linear array deployed on the northern flank of Villarrica during the austral summer of 2011, and a wider-aperture array of stations distributed around the volcano that was active in the austral summer of 2010. Both deployments consisted of three-component broadband stations and were augmented with broadband infrasound sensors. Of particular interest are repetitive, ~1 Hz seismic and coincident infrasound signals that occurred approximately every 2 minutes. Because these events are typically very low amplitude, we used a matched filter approach to identify them. We windowed several high-amplitude records of these events from broadband seismic stations near the vent. The record section of each event served as a template to compare with the entire dataset by cross-correlation. This approach identified ~20,000 nearly identical events during the ~7 day deployment of the linear array, which were otherwise difficult to identify in the raw records. Assuming that all of the events that we identified have identical source mechanisms and depths, we stack the large suite of events to produce low-noise records and particle motions at receivers farther than 5 km from the vent. We find that the records from stations near the edifice are dominated by tangential particle motion, suggesting the influence of near-field components. Correlation of these data with broadband acoustic data collected at the summit suggests that these repeatable seismic processes are linked to acoustic emissions, probably due to gas bubbles bursting at the magma free surface, as no eruptive products besides gas were being emitted by the volcano during the instrument deployment.
The acoustic signals affiliated with the repetitive seismic signals do not seem directly related to the continuous, well-correlated acoustic tremor observed both at the vent and at roughly 6 km away from small-aperture acoustic arrays (also reported by other groups in 2009, 2010). We also correlate the acoustic and repetitive seismic signals with high time resolution (~1 Hz sampling rate), sulfur dioxide emissions measured with an ultraviolet camera. Because a subset of stations operated during both 2010 and 2011, we could tie events from both deployments to generate a single stacked event at all 17 stations. We will present results of finite-difference modeling of this event stack using a simple homogeneous velocity structure.
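The matched-filter identification step described above amounts to sliding a template along the record and flagging lags where the normalized cross-correlation exceeds a threshold. A minimal sketch, with an assumed threshold and a hypothetical function name:

```python
import numpy as np

def matched_filter_detect(data, template, threshold=0.8):
    """Slide the template along the record; report every lag where the
    Pearson correlation between template and window exceeds the threshold
    (the 0.8 value is an assumption for illustration)."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        sd = w.std()
        if sd == 0.0:
            continue                     # flat window: correlation undefined
        cc = float(np.sum(t * (w - w.mean())) / sd)
        if cc >= threshold:
            detections.append((i, cc))   # (lag in samples, correlation)
    return detections
```

Because the correlation is amplitude-normalized, this picks out events far below the noise floor of a simple amplitude trigger, which is why it recovers low-amplitude repeating events from raw records.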
ERIC Educational Resources Information Center
Taylor, Matthew A.; Skourides, Andreas; Alvero, Alicia M.
2012-01-01
Interval recording procedures are used by persons who collect data through observation to estimate the cumulative occurrence and nonoccurrence of behavior/events. Although interval recording procedures can increase the efficiency of observational data collection, they can also induce error from the observer. In the present study, 50 observers were…
SEISMIC STUDY OF THE AGUA DE PAU GEOTHERMAL PROSPECT, SAO MIGUEL, AZORES.
Dawson, Phillip B.; Rodrigues da Silva, Antonio; Iyer, H.M.; Evans, John R.
1985-01-01
A 16-station array was operated over the 200 km² central portion of Sao Miguel, utilizing 8 permanent Instituto Nacional de Meterologia e Geofisica stations and 8 USGS portable stations. Forty-four local events with well-constrained solutions and 15 regional events were located. In addition, hundreds of unlocatable seismic events were recorded. The most interesting seismic activity occurred in a swarm on September 6 and 7, 1983, when over 200 events were recorded in a 16-hour period. The seismic activity around Agua de Pau was centered on the east and northeast slopes of the volcano. The data suggest a boiling hydrothermal system beneath the Agua de Pau volcano, consistent with a variety of other data.
NASA Astrophysics Data System (ADS)
Bouchard, R.; Locke, L.; Hansen, W.; Collins, S.; McArthur, S.
2007-12-01
DART systems are a critical component of the tsunami warning system as they provide the only real-time, in situ, tsunami detection before landfall. DART systems consist of a surface buoy that serves as a position locator and communications transceiver and a Bottom Pressure Recorder (BPR) on the seafloor. The BPR records temperature and pressure at 15-second intervals to a memory card for later retrieval for analysis and use by tsunami researchers, but the BPRs are normally recovered only once every two years. The DART systems also transmit subsets of the data, converted to an estimation of the sea surface height, in near real-time for use by the tsunami warning community. These data are available on NDBC's webpages, http://www.ndbc.noaa.gov/dart.shtml. Although not of the resolution of the data recorded to the BPR memory card, the near real-time data have proven to be of value in research applications [1]. Of particular interest are the DART data associated with geophysical events. The DART BPR continuously compares the measured sea height with a predicted sea height and, when the difference exceeds a threshold value, the BPR goes into Event Mode. Event Mode provides an extended, more frequent near real-time reporting of the sea surface heights for tsunami detection. The BPR can go into Event Mode because of geophysical triggers, such as tsunamis or seismic activity, which may or may not be tsunamigenic. The BPR can also go into Event Mode during recovery of the BPR as it leaves the seafloor, or when manually triggered by the Tsunami Warning Centers in advance of an expected tsunami. On occasion, the BPR will go into Event Mode without any associated tsunami or seismic activity or human intervention; these are considered "False" Events. Approximately one-third of all Events can be classified as "False". NDBC is responsible for the operations, maintenance, and data management of the DART stations.
Each DART station has a webpage with a drop-down list of all Events. NDBC maintains the non-geophysical Events in order to maintain the continuity of the time series records. In 2007, NDBC compiled all DART Events that occurred while under NDBC's operational control and made an assessment on their validity. The NDBC analysts performed the assessment using the characteristics of the data time series, triggering criteria, and associated seismic events. The compilation and assessments are catalogued in a NDBC technical document. The Catalog also includes a listing of the one-hour, high-resolution data, retrieved remotely from the BPRs that are not available on the web pages. The Events are classified by their triggering mechanism and listed by station location and, for those Events associated with geophysical triggers, they are listed by their associated seismic events. The Catalog provides researchers with a valuable tool in locating, assessing, and applying near real-time DART data to tsunami research and will be updated following DART Events. A link to the published Catalog can be found on the NDBC DART website, http://www.ndbc.noaa.gov/dart.shtml. Reference: [1] Gower, J. and F. González (2006), U.S. Warning System Detected the Sumatra Tsunami, Eos Trans. AGU, 87(10), 105-112.
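The Event Mode trigger described above reduces to a simple comparison of measured against predicted sea height. The sketch below is purely illustrative: the threshold value and function name are assumptions, not the BPR firmware's actual logic.

```python
def dart_event_trigger(measured_mm, predicted_mm, threshold_mm=30.0):
    """Return the index of the first sample whose measured sea height
    departs from the prediction by more than the threshold (Event Mode),
    or None if the record never triggers. The 30 mm threshold is an
    illustrative assumption, not the instrument's actual setting."""
    for i, (m, p) in enumerate(zip(measured_mm, predicted_mm)):
        if abs(m - p) > threshold_mm:
            return i
    return None
```

Note that any disturbance clearing the threshold triggers the same way, which is consistent with the "False" Events discussed above: the trigger itself cannot distinguish a tsunami from instrument recovery or other non-geophysical causes.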
Assessing natural hazard risk using images and data
NASA Astrophysics Data System (ADS)
Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.
2012-12-01
Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping, and historic records. Instrumental data reveal only a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of an event. Visual data are rapidly increasing as portable high-resolution cameras and video recorders become more widely available. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake, and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context to the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.
Peterfreund, Robert A; Driscoll, William D; Walsh, John L; Subramanian, Aparna; Anupama, Shaji; Weaver, Melissa; Morris, Theresa; Arnholz, Sarah; Zheng, Hui; Pierce, Eric T; Spring, Stephen F
2011-05-01
Efforts to assure high-quality, safe, clinical care depend upon capturing information about near-miss and adverse outcome events. Inconsistent or unreliable information capture, especially for infrequent events, compromises attempts to analyze events in quantitative terms, understand their implications, and assess corrective efforts. To enhance reporting, we developed a secure, electronic, mandatory system for reporting quality assurance data linked to our electronic anesthesia record. We used the capabilities of our anesthesia information management system (AIMS) in conjunction with internally developed, secure, intranet-based, Web application software. The application is implemented with a backend allowing robust data storage, retrieval, data analysis, and reporting capabilities. We customized a feature within the AIMS software to create a hard stop in the documentation workflow before the end of anesthesia care time stamp for every case. The software forces the anesthesia provider to access the separate quality assurance data collection program, which provides a checklist for targeted clinical events and a free text option. After completing the event collection program, the software automatically returns the clinician to the AIMS to finalize the anesthesia record. The number of events captured by the departmental quality assurance office increased by 92% (95% confidence interval [CI] 60.4%-130%) after system implementation. The major contributor to this increase was the new electronic system. This increase has been sustained over the initial 12 full months after implementation. Under our reporting criteria, the overall rate of clinical events reported by any method was 471 events out of 55,382 cases or 0.85% (95% CI 0.78% to 0.93%). The new system collected 67% of these events (95% confidence interval 63%-71%). We demonstrate the implementation in an academic anesthesia department of a secure clinical event reporting system linked to an AIMS. 
The system enforces entry of quality assurance information (either no clinical event or notification of a clinical event). System implementation resulted in capturing nearly twice the number of events at a relatively steady case load. © 2011 International Anesthesia Research Society
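The hard-stop workflow described above can be modeled as a small state machine: finalizing the record is blocked until a QA entry, which may be "no clinical event", has been made. All class and method names below are illustrative, not the actual AIMS API.

```python
from dataclasses import dataclass, field

@dataclass
class AnesthesiaCase:
    """Toy model of the hard-stop workflow: the record cannot be finalized
    until a QA entry (an empty list means 'no clinical event') exists."""
    case_id: str
    qa_events: list = field(default_factory=list)
    qa_complete: bool = False
    finalized: bool = False

    def report_qa(self, events):
        # events: checklist selections and/or free-text descriptions;
        # an empty list documents that no clinical event occurred
        self.qa_events = list(events)
        self.qa_complete = True

    def finalize_record(self):
        if not self.qa_complete:  # the hard stop before end-of-case
            raise RuntimeError("QA reporting required before finalizing")
        self.finalized = True
```

Making the "no event" report an explicit, mandatory action is the design choice that turns voluntary reporting into a complete denominator for event rates.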
Lively Earthquake Activity in North-Eastern Greenland
NASA Astrophysics Data System (ADS)
Larsen, Tine B.; Dahl-Jensen, Trine; Voss, Peter H.
2016-04-01
The seismograph at the Danish military outpost, Station Nord (NOR) in North East Greenland, records many regional/local earthquakes every day. Most of these events originate at the Arctic plate boundary between the Eurasian and the North American plates. The plate boundary has a particularly active segment approximately 200 km from the seismograph. Additionally, we find a seismically very active region 20-30 km from NOR on the Kronprins Christian Land peninsula. The BB seismograph at NOR was installed in 2002 and later upgraded with real-time telemetry as part of the GLISN project. Since late 2013, data from NOR have been included in routine processing at GEUS. Phase readings on some of the older data, primarily 2002-2003, have been carried out previously in connection with other projects. As a result, phase readings for more than 6000 local events, recorded exclusively at NOR, were found in the GEUS database. During the years 2004 to 2007, four locations were occupied by temporary BB seismographs on the north coast of Greenland as part of the Law of the Sea preparatory work. Data from these stations have not previously been analyzed for local and regional events. In this study we combine the recordings from NOR with phase readings from the temporary seismographs in Northern Greenland. The local events on Kronprins Christian Land range in magnitude from less than 2 to a magnitude 4.8 event that was widely recorded in the region and felt by the personnel at Station Nord on August 30, 2005. Station Nord is located in the seismically most active region of Greenland.
Building an Ontology for Identity Resolution in Healthcare and Public Health.
Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P; Clyde, Stephen; Thornton, Sidney; Staes, Catherine
2015-01-01
Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries, and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Our objectives were to: 1) identify components of identity and events subsequent to birth that result in creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage.
Atmospheric CO2 and abrupt climate change on submillennial timescales
NASA Astrophysics Data System (ADS)
Ahn, Jinho; Brook, Edward
2010-05-01
How atmospheric CO2 varies and is controlled on various time scales and under various boundary conditions is important for understanding how the carbon cycle and climate change are linked. Ancient air preserved in ice cores provides important information on past variations in atmospheric CO2. In particular, concentration records for intervals of abrupt climate change may improve understanding of mechanisms that govern atmospheric CO2. We present new multi-decadal CO2 records that cover Greenland stadial 9 (between Dansgaard-Oeschger (DO) events 8 and 9) and the abrupt cooling event at 8.2 ka. The CO2 records come from Antarctic ice cores but are well synchronized with Greenland ice core records using new high-resolution CH4 records, precisely defining the timing of CO2 change with respect to abrupt climate events in Greenland. Previous work showed that during stadial 9 (40-38 ka), CO2 rose by about 15-20 ppm over around 2,000 years, and at the same time temperatures in Antarctica increased. Dust proxies indicate a decrease in dust flux over the same period. With more detailed data and better age controls, we now find that approximately half of the CO2 increase during stadial 9 occurred abruptly, over the course of decades to a century at ~39.6 ka. The step increase of CO2 is synchronous with a similar step increase of Antarctic isotopic temperature and a small abrupt change in CH4, and lags the onset of the decrease in dust flux by ~400 years. New atmospheric CO2 records at the well-known ~8.2 ka cooling event were obtained from the Siple Dome ice core, Antarctica. Our preliminary CO2 data span 900 years and include 19 data points within the 8.2 ka cooling event, which persisted for ~160 years (Thomas et al., Quaternary Sci. Rev., 2007). We find that CO2 increased by 2-4 ppm during that cooling event.
Further analyses will improve the resolution and better constrain the CO2 variability during other times in the early Holocene to determine if the variations observed during the 8.2 ka event are significant.
Cognitive complexity of the medical record is a risk factor for major adverse events.
Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot
2014-01-01
Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
NASA Astrophysics Data System (ADS)
Karrenbach, M. H.; Cole, S.; Williams, J. J.; Biondi, B. C.; McMurtry, T.; Martin, E. R.; Yuan, S.
2017-12-01
Fiber-optic distributed acoustic sensing (DAS) uses conventional telecom fibers for a wide variety of monitoring purposes. Fiber-optic arrays can be located along pipelines for leak detection; along borders and perimeters to detect and locate intruders, or along railways and roadways to monitor traffic and identify and manage incidents. DAS can also be used to monitor oil and gas reservoirs and to detect earthquakes. Because thousands of such arrays are deployed worldwide and acquiring data continuously, they can be a valuable source of data for earthquake detection and location, and could potentially provide important information to earthquake early-warning systems. In this presentation, we show that DAS arrays in Mexico and the United States detected the M8.1 and M7.2 Mexico earthquakes in September 2017. At Stanford University, we have deployed a 2.4 km fiber-optic DAS array in a figure-eight pattern, with 600 channels spaced 4 meters apart. Data have been recorded continuously since September 2016. Over 800 earthquakes from across California have been detected and catalogued. Distant teleseismic events have also been recorded, including the two Mexican earthquakes. In Mexico, fiber-optic arrays attached to pipelines also detected these two events. Because of the length of these arrays and their proximity to the event locations, we can not only detect the earthquakes but also make location estimates, potentially in near real time. In this presentation, we review the data recorded for these two events recorded at Stanford and in Mexico. We compare the waveforms recorded by the DAS arrays to those recorded by traditional earthquake sensor networks. Using the wide coverage provided by the pipeline arrays, we estimate the event locations. Such fiber-optic DAS networks can potentially play a role in earthquake early-warning systems, allowing actions to be taken to minimize the impact of an earthquake on critical infrastructure components. 
While many such fiber-optic networks are already in place, new arrays can be created on demand, using existing fiber-optic telecom cables, for specific monitoring situations such as recording aftershocks of a large earthquake or monitoring induced seismicity.
Piller, Werner E.; Reuter, Markus; Harzhauser, Mathias
2015-01-01
During the Miocene, prominent oxygen isotope events (Mi‐events) reflect major changes in glaciation, while carbonate isotope maxima (CM‐events) reflect changes in organic carbon burial, particularly during the Monterey carbon isotope excursion. However, despite their importance to the global climate history, they have never been recorded in shallow marine carbonate successions. The Decontra section on the Maiella Platform (central Apennines, Italy) allows them to be resolved for the first time in such a setting during the early to middle Miocene. The present study improves the stratigraphic resolution of parts of the Decontra section via orbital tuning of high‐resolution gamma ray (GR) and magnetic susceptibility data to the 405 kyr eccentricity metronome. The tuning allows, within the established biostratigraphic, sequence stratigraphic, and isotope stratigraphic frameworks, a precise correlation of the Decontra section with pelagic records of the Mediterranean region, as well as the global paleoclimatic record and the global sea level curve. Spectral series analyses of GR data further indicate that the 405 kyr orbital cycle is particularly well preserved during the Monterey Event. Since GR is a direct proxy for authigenic uranium precipitation during increased burial of organic carbon in the Decontra section, it follows the same long‐term orbital pacing as observed in the carbon isotope records. The 405 kyr GR beat is thus correlated with the carbon isotope maxima observed during the Monterey Event. Finally, the Mi‐events can now be recognized in the δ18O record and coincide with plankton‐rich, siliceous, or phosphatic horizons in the lithology of the section. PMID:27546980
NASA Astrophysics Data System (ADS)
Auer, Gerald; Piller, Werner E.; Reuter, Markus; Harzhauser, Mathias
2015-04-01
During the Miocene, prominent oxygen isotope events (Mi-events) reflect major changes in glaciation, while carbonate isotope maxima (CM-events) reflect changes in organic carbon burial, particularly during the Monterey carbon isotope excursion. However, despite their importance to the global climate history, they have never been recorded in shallow marine carbonate successions. The Decontra section on the Maiella Platform (central Apennines, Italy) allows them to be resolved for the first time in such a setting during the early to middle Miocene. The present study improves the stratigraphic resolution of parts of the Decontra section via orbital tuning of high-resolution gamma ray (GR) and magnetic susceptibility data to the 405 kyr eccentricity metronome. The tuning allows, within the established biostratigraphic, sequence stratigraphic, and isotope stratigraphic frameworks, a precise correlation of the Decontra section with pelagic records of the Mediterranean region, as well as the global paleoclimatic record and the global sea level curve. Spectral series analyses of GR data further indicate that the 405 kyr orbital cycle is particularly well preserved during the Monterey Event. Since GR is a direct proxy for authigenic uranium precipitation during increased burial of organic carbon in the Decontra section, it follows the same long-term orbital pacing as observed in the carbon isotope records. The 405 kyr GR beat is thus correlated with the carbon isotope maxima observed during the Monterey Event. Finally, the Mi-events can now be recognized in the δ18O record and coincide with plankton-rich, siliceous, or phosphatic horizons in the lithology of the section.
NASA Astrophysics Data System (ADS)
Saylor, P. L.; Osterberg, E. C.; Kreutz, K. J.; Wake, C. P.; Winski, D.
2014-12-01
In May-June 2013, an NSF-funded team from Dartmouth College and the Universities of Maine and New Hampshire collected two 1000-year ice cores to bedrock from the summit plateau of Mount Hunter in Denali National Park, Alaska (62.940291, -151.087616, 3912 m). The snow accumulation record from these ice cores will provide key insight into late Holocene precipitation variability in central Alaska, and complement existing precipitation paleorecords from the Mt. Logan and Eclipse ice cores in coastal SE Alaska. However, correct interpretation of the Mt. Hunter accumulation record requires an understanding of the relationships between regional meteorological events and micrometeorological conditions at the Mt. Hunter ice core collection site. Here we analyze a three-month window of snow accumulation and meteorological conditions recorded by an Automatic Weather Station (AWS) at the Mt. Hunter site during the summer of 2013. Snow accumulation events are identified in the Mt. Hunter AWS dataset and compared on a storm-by-storm basis to AWS data collected from the adjacent Kahiltna glacier 2000 m lower in elevation, and to regional National Weather Service (NWS) station data. We also evaluate the synoptic conditions associated with each Mt. Hunter accumulation event using NWS surface maps, NCEP-NCAR Reanalysis data, and the NOAA HYSPLIT back trajectory model. We categorize each Mt. Hunter accumulation event as a pure snow accumulation, drifting snow, or blowing snow event based on snow accumulation, wind speed, and temperature data using the method of Knuth et al. (2009). We analyze the frequency and duration of events within each accumulation regime, in addition to the overall contribution of each event to the snowpack. Preliminary findings indicate that a majority of Mt. Hunter accumulation events are pure accumulation events (55.5%), whereas drifting (28.6%) and blowing (15.4%) snow events play a secondary role.
Our results will characterize the local accumulation dynamics on Mt. Hunter and quantify the relationship between alpine micrometeorological and regional precipitation dynamics, providing key insights into the interpretation of the Mt. Hunter paleoprecipitation record.
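The event categorization step described above can be sketched as a threshold classifier. Note that the wind-speed cutoffs below are placeholders, not the values published by Knuth et al. (2009), and the function name is hypothetical; the published method also incorporates temperature, omitted here for brevity.

```python
def classify_accumulation_event(accum_mm, wind_ms,
                                drift_thresh=5.0, blow_thresh=10.0):
    """Classify an accumulation event from accumulation amount (mm) and
    wind speed (m/s). Thresholds are illustrative placeholders, not the
    published values of Knuth et al. (2009)."""
    if accum_mm <= 0.0:
        return "none"
    if wind_ms < drift_thresh:
        return "pure accumulation"
    if wind_ms < blow_thresh:
        return "drifting snow"
    return "blowing snow"
```

Applying such a classifier event-by-event yields the regime frequencies (pure accumulation vs. drifting vs. blowing) reported in the abstract.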
NASA Astrophysics Data System (ADS)
Williams, E. F.; Martin, E. R.; Biondi, B. C.; Lindsey, N.; Ajo Franklin, J. B.; Wagner, A. M.; Bjella, K.; Daley, T. M.; Dou, S.; Freifeld, B. M.; Robertson, M.; Ulrich, C.
2016-12-01
We analyze the impact of identifying and removing coherent anthropogenic noise on synthetic Green's functions extracted from ambient noise recorded on a dense linear distributed acoustic sensing (DAS) array. Low-cost, low-impact urban seismic surveys are possible with DAS, which uses dynamic strain sensing to record seismic waves incident to a buried fiber optic cable. However, interferometry and tomography of ambient noise data recorded in urban areas include coherent noise from near-field infrastructure such as cars and trains passing the array, in some cases causing artifacts in estimated Green's functions and potentially incorrect surface wave velocities. Based on our comparison of several methods, we propose an automated, real-time data processing workflow to detect and reduce the impact of these events on data from a dense array in an urban environment. We utilize a recursive STA/LTA (short-term average/long-term average) algorithm on each channel to identify sharp amplitude changes typically associated with an event arrival. In order to distinguish between optical noise and physical events, an event is cataloged only if STA/LTA is triggered on enough channels across the array in a short time window. For each event in the catalog, a conventional semblance analysis is performed across a straight segment of the array to determine whether the event has a coherent velocity signature. Events that demonstrate a semblance peak at low apparent velocities (5-50 m/s) are assumed to represent coherent transportation-related noise and are down-weighted in the time domain before cross-correlation. We show the impact of removing such noise on estimated Green's functions from ambient noise data recorded in Richmond, CA in December 2014. 
This method has been developed for use on a continuous time-lapse ambient noise survey collected with DAS near Fairbanks, AK, and an upcoming ambient noise survey on the Stanford University campus using DAS with a re-purposed telecommunications fiber optic cable.
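The detection workflow above (per-channel recursive STA/LTA, then a multi-channel coincidence check to separate physical events from optical noise) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the smoothing constants, threshold, and coincidence window are placeholder values.

```python
import numpy as np

def recursive_sta_lta(x, c_sta=0.1, c_lta=0.001):
    """Recursive STA/LTA: exponential moving averages of signal energy.
    Both averages are seeded with the early-record energy to avoid
    spurious triggers during warm-up."""
    seed = float(np.mean(np.square(x[:100]))) + 1e-12
    sta = lta = seed
    ratio = np.empty(len(x))
    for i, v in enumerate(x):
        e = v * v
        sta = c_sta * e + (1.0 - c_sta) * sta
        lta = c_lta * e + (1.0 - c_lta) * lta
        ratio[i] = sta / lta
    return ratio

def catalog_events(channels, threshold=5.0, min_channels=3, window=50):
    """Catalog an event only if at least `min_channels` channels trigger
    within `window` samples of each other (coincidence check)."""
    triggers = []  # (sample index of first threshold crossing, channel)
    for ch, x in enumerate(channels):
        hits = np.flatnonzero(recursive_sta_lta(x) > threshold)
        if hits.size:
            triggers.append((int(hits[0]), ch))
    triggers.sort()
    events, i = [], 0
    while i < len(triggers):
        t0 = triggers[i][0]
        group = [t for t, _ in triggers if t0 <= t < t0 + window]
        if len(group) >= min_channels:
            events.append(t0)
            i += len(group)
        else:
            i += 1
    return events
```

A library such as ObsPy provides production-grade versions of both pieces; the sketch only shows the logic.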
The natural mathematics of behavior analysis.
Li, Don; Hautus, Michael J; Elliffe, Douglas
2018-04-19
Models that generate event records have very general scope regarding the dimensions of the target behavior that we measure. From a set of predicted event records, we can generate predictions for any dependent variable that we could compute from the event records of our subjects. In this sense, models that generate event records permit us a freely multivariate analysis. To explore this proposition, we conducted a multivariate examination of Catania's Operant Reserve on single VI schedules in transition using a Markov Chain Monte Carlo scheme for Approximate Bayesian Computation. Although we found systematic deviations between our implementation of Catania's Operant Reserve and our observed data (e.g., mismatches in the shape of the interresponse time distributions), the general approach that we have demonstrated represents an avenue for modelling behavior that transcends the typical constraints of algebraic models. © 2018 Society for the Experimental Analysis of Behavior.
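The idea of fitting a model that generates event records via Approximate Bayesian Computation can be illustrated with a toy sketch. The paper uses an MCMC scheme for Catania's Operant Reserve; the version below is simple rejection ABC on a hypothetical model that emits interresponse times (IRTs) from an exponential distribution, with the mean IRT as the summary statistic.

```python
import random

def simulate_event_record(rate, n_events, rng):
    """Toy generative model: an event record of n_events interresponse times."""
    return [rng.expovariate(rate) for _ in range(n_events)]

def abc_rejection(observed, n_draws=2000, eps=0.05, seed=1):
    """Rejection ABC: draw a response rate from a uniform prior, simulate an
    event record, and accept the rate if the simulated mean IRT falls
    within eps of the observed mean IRT."""
    rng = random.Random(seed)
    obs_mean = sum(observed) / len(observed)
    accepted = []
    for _ in range(n_draws):
        rate = rng.uniform(0.1, 5.0)  # uniform prior on response rate
        sim = simulate_event_record(rate, len(observed), rng)
        if abs(sum(sim) / len(sim) - obs_mean) < eps:
            accepted.append(rate)
    return accepted
```

Because the model produces full event records, any other dependent variable computable from those records (IRT quantiles, run lengths, etc.) could be swapped in as the summary statistic, which is the multivariate flexibility the abstract highlights.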
Increasing use of high-speed digital imagery as a measurement tool on test and evaluation ranges
NASA Astrophysics Data System (ADS)
Haddleton, Graham P.
2001-04-01
In military research and development or testing there are various fast and dangerous events that need to be recorded and analysed. High-speed cameras allow the capture of movement too fast to be recognised by the human eye, and provide data that is essential for the analysis and evaluation of such events. High-speed photography is often the only type of instrumentation that can be used to record the parameters demanded by our customers. I will show examples where this applied cinematography is used not only to provide a visual record of events, but also as an essential measurement tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, K.J.; Dawkins, M.S.; Baumstark, R.R.
1976-02-24
Short-period signals associated with the NTS Event 'LEYDEN' on 26 November 1975 were recorded at RK-ON and LASA. Station descriptions, arrival times, magnitudes of seismic waves, and seismic signatures are included.
Sammon, Cormac J; Petersen, Irene
2016-04-01
Studies using primary care databases often censor follow-up at the date data are last collected from clinical computer systems (last collection date (LCD)). We explored whether this results in the selective exclusion of events entered in the electronic health records after their date of occurrence, that is, backdated events. We used data from The Health Improvement Network (THIN). Using two versions of the database, we identified events that were entered into a later (THIN14) but not an earlier version of the database (THIN13) and investigated how the number of entries changed as a function of time since LCD. Times between events and the dates they were recorded were plotted as a function of time since the LCD in an effort to determine appropriate points at which to censor follow-up. There were 356 million eligible events in THIN14 and 355 million eligible events in THIN13. When comparing the two data sets, the proportion of missing events in THIN13 was highest in the month prior to the LCD (9.6%), decreasing to 5.2% at 6 months and 3.4% at 12 months. The proportion of missing events was largest for events typically diagnosed in secondary care such as neoplasms (28% in the month prior to LCD) and negligible for events typically diagnosed in primary care such as respiratory events (2% in the month prior to LCD). Studies using primary care databases, particularly those investigating events typically diagnosed outside primary care, should censor follow-up prior to the LCD to avoid underestimation of event rates. Copyright © 2016 John Wiley & Sons, Ltd.
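The core comparison in this study, counting events present in the later database version but missing from the earlier one as a function of time before the last collection date (LCD), can be sketched as below. Field layout and dates are illustrative, not THIN's actual schema.

```python
from datetime import date

def missing_proportion_by_month(events, earlier_cut, lcd):
    """events: list of (event_date, entry_date) pairs from the later
    database version. An event is missing from the earlier version if it
    was entered (i.e. backdated) after that version's collection date.
    Returns {months before LCD: proportion missing}."""
    totals, missing = {}, {}
    for event_date, entry_date in events:
        months_before = ((lcd.year - event_date.year) * 12
                         + lcd.month - event_date.month)
        totals[months_before] = totals.get(months_before, 0) + 1
        if entry_date > earlier_cut:
            missing[months_before] = missing.get(months_before, 0) + 1
    return {m: missing.get(m, 0) / totals[m] for m in totals}
```

As in the paper, the proportion is highest close to the LCD, which is what motivates censoring follow-up some months before it.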
Integrated Historical Tsunami Event and Deposit Database
NASA Astrophysics Data System (ADS)
Dunbar, P. K.; McCullough, H. L.
2010-12-01
The National Geophysical Data Center (NGDC) provides integrated access to historical tsunami event, deposit, and proxy data. The NGDC tsunami archive initially listed tsunami sources and locations with observed tsunami effects. Tsunami frequency and intensity are important for understanding tsunami hazards. Unfortunately, tsunami recurrence intervals often exceed the historic record. As a result, NGDC expanded the archive to include the Global Tsunami Deposits Database (GTD_DB). Tsunami deposits are the physical evidence left behind when a tsunami impacts a shoreline or affects submarine sediments. Proxies include co-seismic subsidence, turbidite deposits, changes in biota following an influx of marine water in a freshwater environment, etc. By adding past tsunami data inferred from the geologic record, the GTD_DB extends the record of tsunamis backward in time. Although the best methods for identifying tsunami deposits and proxies in the geologic record remain under discussion, developing an overall picture of where tsunamis have affected coasts, calculating recurrence intervals, and approximating runup height and inundation distance provides a better estimate of a region’s true tsunami hazard. Tsunami deposit and proxy descriptions in the GTD_DB were compiled from published data found in journal articles, conference proceedings, theses, books, conference abstracts, posters, web sites, etc. The database now includes over 1,200 descriptions compiled from over 1,100 citations. Each record in the GTD_DB is linked to its bibliographic citation where more information on the deposit can be found. The GTD_DB includes data for over 50 variables such as: event description (e.g., 2010 Chile Tsunami), geologic time period, year, deposit location name, latitude, longitude, country, associated body of water, setting during the event (e.g., beach, lake, river, deep sea), upper and lower contacts, underlying and overlying material, etc. 
If known, the tsunami source mechanism (e.g., earthquake, landslide, volcanic eruption, asteroid impact) is also specified. Observations (grain size, sedimentary structure, bed thickness, number of layers, etc.) are stored along with the conclusions drawn from the evidence by the author (wave height, flow depth, flow velocity, number of waves, etc.). Geologic time periods in the GTD_DB range from Precambrian to Quaternary, but the majority (70%) are from the Quaternary period. This period includes events such as: the 2004 Indian Ocean tsunami, the Cascadia subduction zone earthquakes and tsunamis, the 1755 Lisbon tsunami, the A.D. 79 Vesuvius tsunami, the 3500 BP Santorini caldera collapse and tsunami, and the 7000 BP Storegga landslide-generated tsunami. Prior to the Quaternary period, the majority of the paleotsunamis are due to impact events such as: the Tertiary Chesapeake Bay Bolide, Cretaceous-Tertiary (K/T) Boundary, Cretaceous Manson, and Devonian Alamo. The tsunami deposits are integrated with the historical tsunami event database where applicable. For example, users can search for articles describing deposits related to the 1755 Lisbon tsunami and view those records, as well as link to the related historic event record. The data and information may be viewed using tools designed to extract and display data (selection forms, Web Map Services, and Web Feature Services).
Nelson, Richard E; Grosse, Scott D; Waitzman, Norman J; Lin, Junji; DuVall, Scott L; Patterson, Olga; Tsai, James; Reyes, Nimia
2015-04-01
There are limitations to using administrative data to identify postoperative venous thromboembolism (VTE). We used a novel approach to quantify postoperative VTE events among Department of Veterans Affairs (VA) surgical patients during 2005-2010. We used VA administrative data to exclude patients with VTE during 12 months prior to surgery. We identified probable postoperative VTE events within 30 and 90 days post-surgery in three settings: 1) pre-discharge inpatient, using a VTE diagnosis code and a pharmacy record for anticoagulation; 2) post-discharge inpatient, using a VTE diagnosis code followed by a pharmacy record for anticoagulation within 7 days; and 3) outpatient, using a VTE diagnosis code and either anticoagulation or a therapeutic procedure code with natural language processing (NLP) to confirm acute VTE in clinical notes. Among 468,515 surgeries without prior VTE, probable VTEs were documented within 30 and 90 days in 3,931 (0.8%) and 5,904 (1.3%), respectively. Of probable VTEs within 30 or 90 days post-surgery, 47.8% and 62.9%, respectively, were diagnosed post-discharge. Among post-discharge VTE diagnoses, 86% resulted in a VA hospital readmission. Fewer than 25% of outpatient records with both VTE diagnoses and anticoagulation prescriptions were confirmed by NLP as acute VTE events. More than half of postoperative VTE events were diagnosed post-discharge; analyses of surgical discharge records are inadequate to identify postoperative VTE. The NLP results demonstrate that the combination of VTE diagnoses and anticoagulation prescriptions in outpatient administrative records cannot be used to validly identify postoperative VTE events. Copyright © 2015. Published by Elsevier Ltd.
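The three-setting decision rule described in the abstract can be expressed as a small classifier. The field names and record layout below are illustrative assumptions, not the authors' data model.

```python
from datetime import date, timedelta

def classify_probable_vte(patient):
    """Toy version of the paper's three-setting rule for probable
    postoperative VTE. patient: dict with 'setting' ('inpatient_pre',
    'inpatient_post', 'outpatient'), 'vte_dx_date', 'anticoag_date',
    and, for outpatient records, an 'nlp_confirmed' flag."""
    dx, rx = patient.get("vte_dx_date"), patient.get("anticoag_date")
    if dx is None or rx is None:
        return False
    if patient["setting"] == "inpatient_pre":
        return True  # VTE diagnosis code plus any anticoagulation record
    if patient["setting"] == "inpatient_post":
        # anticoagulation must follow the diagnosis within 7 days
        return timedelta(0) <= rx - dx <= timedelta(days=7)
    if patient["setting"] == "outpatient":
        # codes alone are unreliable: require NLP confirmation from notes
        return bool(patient.get("nlp_confirmed"))
    return False
```

The outpatient branch encodes the study's key finding: fewer than a quarter of code-plus-prescription outpatient records were confirmed as acute VTE, so the NLP check is what makes the rule valid there.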
The 3 April 2017 Botswana M6.5 Earthquake: Scientific Rapid Response
NASA Astrophysics Data System (ADS)
Midzi, V.; Jele, V.; Kwadiba, M. T.; Mantsha, R.; Manzunzu, B.; Mulabisana, T. F.; Ntibinyane, O.; Pule, T.; Saunders, I.; Tabane, L.; van Aswegen, G.; Zulu, B. S.
2017-12-01
An earthquake of magnitude M6.5 occurred in the evening of 3 April 2017 in Central Botswana. The event was well recorded by the regional network and located by both the Council for Geoscience (CGS) and United States Geological Survey (USGS). Its effects were felt widely in southern Africa and were especially pronounced for residents of Gauteng and the North West Province. In response to these events, the CGS, together with the Botswana Geoscience Institute (BGI), embarked on two scientific projects. The first involved the quick installation of a temporary network of six seismograph stations in and around the location of the main Botswana event with the purpose of detecting and recording its aftershocks. Initially the intention had been to record the events for a period of one month, but on realizing just how active the area was it was decided to extend the period to three months. Data recorded in the first month were collected and delivered to both the CGS and BGI for processing. To date, data recorded in April 2017 after the installation of the stations have been analysed and more than 500 located aftershocks identified. All are located at the eastern edge of the Central Kalahari Park near the location of the main event, in two clear clusters. The observed clusters imply that the source of these earthquakes is a segmented fault oriented in a NW-SE direction. The second scientific project involved a macroseismic survey to study the extent and nature of the effects of the event in southern Africa. This involved CGS and BGI scientists conducting interviews of members of the public to extract as much information as possible. Other data were collected from questionnaires submitted online by the public. In total 180 questionnaires were obtained through interviews and 141 online from South Africa, Zimbabwe and Namibia.
All collected data have been analysed to produce 76 intensity data points located all over the region, with maximum intensity values of VI (according to the Modified Mercalli Intensity scale) observed near the epicenter. These are quite low values of intensity for such a large event, but are to be expected given that the epicentral region is in a national park which is sparsely populated.
NASA Astrophysics Data System (ADS)
Dabrowa, A. L.; Green, D. N.; Johnson, J. B.; Phillips, J. C.; Rust, A. C.
2014-11-01
Local (hundreds of metres from the vent) monitoring of volcanic infrasound is a common tool at volcanoes characterized by frequent low-magnitude eruptions, but it is generally not safe or practical to have sensors so close to the vent during more intense eruptions. To investigate the potential and limitations of monitoring at near-regional ranges (tens of kilometres) we studied infrasound detection and propagation at Mount Erebus, Antarctica. This site has both a good local monitoring network and an additional International Monitoring System infrasound array, IS55, located 25 km away. We compared data recorded at IS55 with a set of 117 known Strombolian events that were recorded with the local network in January 2006. 75% of these events were identified at IS55 by an analyst looking for a pressure transient coincident with an F-statistic detection, which identifies coherent infrasound signals. With the data from January 2006, we developed and calibrated an automated signal-detection algorithm based on threshold values of both the F-statistic and the correlation coefficient. Application of the algorithm across IS55 data for all of 2006 identified infrasonic signals expected to be Strombolian explosions, and proved reliable for indicating trends in eruption frequency. However, detectability at IS55 of known Strombolian events depended strongly on the local signal amplitude: 90% of events with local amplitudes > 25 Pa were identified at IS55, compared to only 26% of events with local amplitudes < 25 Pa. Event detection was also affected by considerable variation in amplitude decay rates between the local and near-regional sensors. Amplitudes recorded at IS55 varied between 3% and 180% of the amplitude expected assuming hemispherical spreading, indicating that amplitudes recorded at near-regional ranges to Erebus are unreliable indicators of event magnitude.
Comparing amplitude decay rates with locally collected radiosonde data indicates a close relationship between recorded amplitude and lower atmosphere effective sound speed structure. At times of increased sound speed gradient, higher amplitude decay rates are observed, consistent with increased upward refraction of acoustic energy along the propagation path. This study indicates that whilst monitoring activity levels at near-regional ranges can be successful, variable amplitude decay rate means quantitative analysis of infrasound data for eruption intensity and magnitude is not advisable without the consideration of local atmospheric sound speed structure.
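The calibrated detector described above thresholds two quantities at once. A minimal sketch, with placeholder thresholds rather than the study's calibrated values, plus the amplitude-dependent detection-rate summary:

```python
import numpy as np

def detect_infrasound(fstat, corr, f_thresh=2.0, c_thresh=0.5):
    """Indices of time windows where both the array F-statistic and the
    inter-sensor correlation coefficient exceed their thresholds."""
    fstat, corr = np.asarray(fstat, float), np.asarray(corr, float)
    return np.flatnonzero((fstat > f_thresh) & (corr > c_thresh))

def detection_rate(local_amplitudes_pa, detected, threshold_pa=25.0):
    """Fraction of ground-truth events detected, split at a local-amplitude
    threshold (25 Pa in the study)."""
    big = [d for a, d in zip(local_amplitudes_pa, detected) if a > threshold_pa]
    small = [d for a, d in zip(local_amplitudes_pa, detected) if a <= threshold_pa]
    rate = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return rate(big), rate(small)
```

Requiring both statistics to exceed a threshold is what separates coherent explosion signals from incoherent wind noise in this kind of array processing.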
NASA Astrophysics Data System (ADS)
Omenzetter, Piotr; Brownjohn, James M. W.; Moyo, Pilate
2003-08-01
Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to replace visual inspection for assessment of condition and soundness of civil infrastructure. However, converting large amount of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure in Singapore and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundation, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localizing sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.
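The combination described above (a wavelet transform to isolate abrupt changes, then multivariate outlier detection to flag anomalous events) can be sketched with a one-level Haar transform and Mahalanobis distances. This is a simplified stand-in for the authors' method, not a reproduction of it.

```python
import numpy as np

def haar_detail(x):
    """One-level Haar wavelet detail coefficients: large for abrupt
    changes, near zero for slowly varying strain trends."""
    x = np.asarray(x, float)
    n = len(x) - len(x) % 2
    return (x[1:n:2] - x[0:n:2]) / np.sqrt(2.0)

def mahalanobis_outliers(X, threshold=3.0):
    """Rows of X (time windows x sensors) whose Mahalanobis distance
    from the sample mean exceeds `threshold`."""
    X = np.asarray(X, float)
    mu = X.mean(axis=0)
    inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.einsum("ij,jk,ik->i", X - mu, inv, X - mu))
    return np.flatnonzero(d > threshold)
```

Stacking the detail coefficients of all sensors into one matrix and flagging outlier rows localizes a sudden event in time while exploiting the correlation between sensors, which is the essence of the multivariate approach.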
Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise
NASA Astrophysics Data System (ADS)
Ziegler, Abra E.
The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. 
The performance of the AST algorithm was quantitatively evaluated during a variety of noise conditions and seismic detections were identified using AST and compared to ancillary injection data. During a period of CO2 injection in a well near the monitoring array, 82% of seismic events were accurately detected, 13% of events were missed, and 5% of detections were determined to be false. Additionally, seismic risk was evaluated from the stress field and faulting regime at the Farnsworth Unit (FWU) to determine the likelihood of pressure perturbations triggering slip on previously mapped faults. Faults oriented NW-SE were identified as requiring the smallest pore pressure changes to trigger slip; faults oriented N-S may also be reactivated, although this is less likely.
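The neighborhood-consistency rule that guides AST can be illustrated with one tuning sweep. This is a simplified caricature of the idea (the actual algorithm uses neuro-dynamic programming trained on historic events); station names, step size, and the majority rule are assumptions.

```python
def tune_thresholds(detections, thresholds, neighbors, step=0.1):
    """One simplified adaptive-tuning sweep: a station that triggers while
    most neighbors stay quiet is made less sensitive (threshold raised);
    a quiet station whose neighbors mostly trigger is made more sensitive.
    detections: {station: bool}; neighbors: {station: [station, ...]}."""
    new = dict(thresholds)
    for s, fired in detections.items():
        agree = sum(detections[n] for n in neighbors[s]) / len(neighbors[s])
        if fired and agree < 0.5:
            new[s] = thresholds[s] + step  # probable false detection
        elif not fired and agree > 0.5:
            new[s] = thresholds[s] - step  # probable missed detection
    return new
```

Iterating such sweeps over ground-truthed data drives each station toward settings that produce consistent agreement within its neighborhood, which is the metric the abstract identifies as key.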
van Staa, Tjeerd-Pieter; Leufkens, Hubert G; Zhang, Bill; Smeeth, Liam
2009-12-01
Data on absolute risks of outcomes and patterns of drug use in cost-effectiveness analyses are often based on randomised clinical trials (RCTs). The objective of this study was to evaluate the external validity of published cost-effectiveness studies by comparing the data used in these studies (typically based on RCTs) to observational data from actual clinical practice. Selective Cox-2 inhibitors (coxibs) were used as an example. The UK General Practice Research Database (GPRD) was used to estimate the exposure characteristics and individual probabilities of upper gastrointestinal (GI) events during current exposure to nonsteroidal anti-inflammatory drugs (NSAIDs) or coxibs. A basic cost-effectiveness model was developed evaluating two alternative strategies: prescription of a conventional NSAID or coxib. Outcomes included upper GI events as recorded in GPRD and hospitalisation for upper GI events recorded in the national registry of hospitalisations (Hospital Episode Statistics) linked to GPRD. Prescription costs were based on the prescribed number of tablets as recorded in GPRD and the 2006 cost data from the British National Formulary. The study population included over 1 million patients prescribed conventional NSAIDs or coxibs. Only a minority of patients used the drugs long-term and daily (34.5% of conventional NSAIDs and 44.2% of coxibs), whereas coxib RCTs required daily use for at least 6-9 months. The mean cost of preventing one upper GI event as recorded in GPRD was US$104k (ranging from US$64k with long-term daily use to US$182k with intermittent use) and US$298k for hospitalisations. The mean costs (for GPRD events) over calendar time were US$58k during 1990-1993 and US$174k during 2002-2005. Using RCT data rather than GPRD data for event probabilities, the mean cost was US$16k with the VIGOR RCT and US$20k with the CLASS RCT.
The published cost-effectiveness analyses of coxibs lacked external validity, did not represent patients in actual clinical practice, and should not have been used to inform prescribing policies. External validity should be an explicit requirement for cost-effectiveness analyses.
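The headline figures above come from a standard incremental cost-effectiveness calculation. A minimal sketch with illustrative numbers (not the study's inputs):

```python
def cost_per_event_prevented(cost_coxib, cost_nsaid,
                             p_event_nsaid, p_event_coxib):
    """Mean cost of preventing one upper GI event when prescribing a
    coxib instead of a conventional NSAID: incremental cost per patient
    divided by events avoided per patient."""
    extra_cost = cost_coxib - cost_nsaid          # per patient
    events_avoided = p_event_nsaid - p_event_coxib  # per patient
    return extra_cost / events_avoided
```

The sensitivity of this ratio to the event probabilities is exactly why substituting RCT-derived probabilities for observational ones moved the estimate from roughly US$104k to US$16-20k per event prevented.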
NASA Astrophysics Data System (ADS)
Gwiazda, R.; Paull, C. K.; Kieft, B.; Bird, L.; Klimov, D.; Herlien, R.; Sherman, A.; McCann, M. P.; Sumner, E.; Talling, P.; Xu, J.; Parsons, D. R.; Maier, K. L.; Barry, J.
2017-12-01
Over a period of 18 months the Coordinated Canyon Experiment documented the passage of at least 15 sediment density flows in Monterey Canyon, offshore California, with an array of moorings and sensors placed from 200 m to 1,850 m water depths. Free-standing 'smart' boulders (Benthic Event Detectors, BED) and a 1,000 kg tripod with an Acoustic Monitoring Transponder (AMT) and a BED attached to it were deployed in the upper canyon to detect seabed motions during sediment density flows. BEDs consist of spheres made of a combination of metal, plastic and syntactic foam ballasted to 2.1 g/cm3 density, containing accelerometers along three orthogonal axes, a time recorder, and a pressure sensor inside a pressure case rated to 500 m water depth. Acceleration of ≥ 0.008 g triggers data collection at a recording rate of 50 Hz until motion stops. Built-in acoustic beacons and modems allow for BEDs to be relocated, and data to be downloaded, even when BEDs are buried in sediment to depths of >1 m. Over the course of the study, depth changes and velocities of 24 BED movements during 9 events were recorded. BEDs moved at the velocity of the propagation of the flows down canyon, as documented by the time of arrival of the flow at successive sensors, but sometimes travelled at lower speeds. Seven movements of the AMT tripod were also recorded. In the largest of these, the heavy AMT tripod was transported over a distance of 4.1 km. For at least four of these seven motions the AMT temperature record indicates that the movements were initiated while the tripod was buried. In one particular event simultaneous movements of five BEDs over a 100 m depth range indicate that the entire seabed was in motion at the same time over a canyon distance of 3.5 km. Reconstructions of instrument motions in this event from their internally recorded acceleration data show that the AMT displacement was at the front of the event and had no rotational component.
In contrast, free standing BEDs at the same depth advanced through a combination of translational and rotational motion. These data are consistent with sediment density flows involving fluidization and motion of a segment of the seafloor over long distances.
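The BED trigger logic (record at 50 Hz once acceleration exceeds 0.008 g, stop when motion ceases) can be sketched as a small state machine. The stop rule below, a run of consecutive sub-threshold samples, is an assumption; the abstract only says recording continues "until motion stops".

```python
def bed_samples(accel_stream, trigger_g=0.008, quiet_samples=100):
    """Record full-rate samples only while the detector is moving:
    start when |acceleration| >= trigger_g, stop after `quiet_samples`
    consecutive sub-threshold samples."""
    recording, quiet, out = False, 0, []
    for a in accel_stream:
        if not recording:
            if abs(a) >= trigger_g:
                recording, quiet = True, 0
                out.append(a)
        else:
            out.append(a)
            quiet = quiet + 1 if abs(a) < trigger_g else 0
            if quiet >= quiet_samples:
                recording = False
    return out
```

Event-triggered recording like this is what lets a battery-powered instrument survive an 18-month deployment while still capturing flows at 50 Hz.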
Signal Analysis of Helicopter Blade-Vortex-Interaction Acoustic Noise Data
NASA Technical Reports Server (NTRS)
Rogers, James C.; Dai, Renshou
1998-01-01
Blade-Vortex-Interaction (BVI) produces annoying high-intensity impulsive noise. NASA Ames collected several sets of BVI noise data during in-flight and wind tunnel tests. The goal of this work is to extract the essential features of the BVI signals from the in-flight data and examine the feasibility of extracting those features from BVI noise recorded inside a large wind tunnel. BVI noise generating mechanisms and BVI radiation patterns are considered, and a simple mathematical-physical model is presented. It allows the construction of simple synthetic BVI events that are comparable to free flight data. The boundary effects of the wind tunnel floor and ceiling are identified and more complex synthetic BVI events are constructed to account for features observed in the wind tunnel data. It is demonstrated that improved recording of BVI events can be attained by changing the geometry of the rotor hub, floor, ceiling and microphone. The Euclidean distance measure is used to align BVI events from each blade and improved BVI signals are obtained by time-domain averaging the aligned data. The differences between BVI events for individual blades are then apparent. Removal of wind tunnel background noise by optimal Wiener-filtering is shown to be effective provided representative noise-only data have been recorded. Elimination of wind tunnel reflections by cepstral and optimal filtering deconvolution is examined. It is seen that the cepstral method is not applicable but that a pragmatic optimal filtering approach gives encouraging results. Recommendations for further work include: altering measurement geometry, real-time data observation and evaluation, examining reflection signals (particularly those from the ceiling) and performing further analysis of expected BVI signals for flight conditions of interest so that microphone placement can be optimized for each condition.
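The alignment-and-average step (find the shift minimizing Euclidean distance to a reference event, then average in the time domain to suppress incoherent noise) can be sketched as follows. Window lengths and the padding convention are illustrative.

```python
import numpy as np

def best_lag(ref, x, max_lag):
    """Lag (in samples) minimizing the Euclidean distance between a
    window of x shifted by that lag and the reference event. x must be
    padded by max_lag samples on each side of the reference window."""
    n = len(ref)
    best, best_d = 0, np.inf
    for lag in range(-max_lag, max_lag + 1):
        seg = x[max_lag + lag: max_lag + lag + n]
        d = np.linalg.norm(seg - ref)
        if d < best_d:
            best, best_d = lag, d
    return best

def aligned_average(ref, events, max_lag):
    """Align each recorded BVI event to the reference, then average in
    the time domain; incoherent background noise averages toward zero."""
    acc = np.array(ref, float)
    for x in events:
        lag = best_lag(ref, x, max_lag)
        acc += x[max_lag + lag: max_lag + lag + len(ref)]
    return acc / (len(events) + 1)
```

Averaging N aligned events reduces uncorrelated noise power by a factor of N, which is why the per-blade differences only become apparent after this step.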
NASA Astrophysics Data System (ADS)
Testorf, M. E.; Jobst, B. C.; Kleen, J. K.; Titiz, A.; Guillory, S.; Scott, R.; Bujarski, K. A.; Roberts, D. W.; Holmes, G. L.; Lenck-Santini, P.-P.
2012-10-01
Time-frequency transforms are used to identify events in clinical EEG data. Data are recorded as part of a study for correlating the performance of human subjects during a memory task with pathological events in the EEG, called spikes. The spectrogram and the scalogram are reviewed as tools for evaluating spike activity. A statistical evaluation of the continuous wavelet transform across trials is used to quantify phase-locking events. For simultaneously improving the time and frequency resolution, and for representing the EEG of several channels or trials in a single time-frequency plane, a multichannel matching pursuit algorithm is used. Fundamental properties of the algorithm are discussed as well as preliminary results, which were obtained with clinical EEG data.
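The spectrogram used to review spike activity is a magnitude short-time Fourier transform. A minimal self-contained sketch (window and hop lengths are placeholders, not the study's analysis settings):

```python
import numpy as np

def spectrogram(x, fs, win=64, hop=32):
    """Magnitude spectrogram via a Hann-windowed short-time Fourier
    transform; rows are frequency bins, columns are time frames."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    S = np.abs(np.fft.rfft(frames, axis=1)).T
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    return freqs, S
```

The scalogram and matching pursuit mentioned in the abstract trade the fixed time-frequency tiling of this transform for adaptive resolution; the spectrogram is the baseline they improve on.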
EARS : Repositioning data management near data acquisition.
NASA Astrophysics Data System (ADS)
Sinquin, Jean-Marc; Sorribas, Jordi; Diviacco, Paolo; Vandenberghe, Thomas; Munoz, Raquel; Garcia, Oscar
2016-04-01
The EU FP7 Projects Eurofleets and Eurofleets2 are a Europe-wide alliance of marine research centers that aim to share their research vessels, to improve information sharing on planned, current and completed cruises, on details of ocean-going research vessels and specialized equipment, and to durably improve cost-effectiveness of cruises. Within this context, logging how, when and where anything happens on board the vessel provides crucial information for data users at a later stage. This forms an essential step in the process of data quality control, as it can assist in the understanding of anomalies and unexpected trends recorded in the acquired data sets. In this way completeness of the metadata is improved, as it is recorded accurately at the origin of the measurement. The collection of this crucial information has been done in very different ways, using different procedures, formats and pieces of software in the context of the European Research Fleet. At the time the Eurofleets project started, every institution and country had adopted different strategies and approaches, which complicated the task of users who need to log general purpose information and events on board whenever they access a different platform, losing the opportunity to produce this valuable metadata on board. Among the many goals of the Eurofleets project, a very important task is the development of an "event log software" called EARS (Eurofleets Automatic Reporting System) that enables scientists and operators to record what happens during a survey. EARS will allow users to fill, in a standardized way, the gap existing at the moment in metadata description, which only very seldom links data with its history. Events generated automatically by acquisition instruments will also be handled, enhancing the granularity and precision of the event annotation.
The adoption of a common procedure to log survey events and a common terminology to describe them is crucial to providing a user-friendly and successful on-board metadata creation procedure for the whole European Fleet. The possibility of automatically reporting metadata and general purpose data will simplify the work of scientists and data managers with regards to data transmission. An improved accuracy and completeness of metadata is expected when events are recorded at acquisition time. This will also enhance multiple uses of the data, as it allows verification of the different requirements existing in different disciplines.
Cognitive Complexity of the Medical Record Is a Risk Factor for Major Adverse Events
Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot
2014-01-01
Context: Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood because “patient complexity” has been difficult to quantify. Objective: We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. Design: The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. Main Outcome Measures: The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Results: Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. Conclusions: CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time. PMID:24626065
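The abstract does not specify how the Complexity Ruler counts bits, so the sketch below uses compressed size as one standard, crude estimator of the information content of a record; the cutoff-based flagging mirrors the paper's use of empirically derived CCMR cutoffs. All names and numbers are illustrative assumptions.

```python
import zlib

def record_bits(record_text):
    """Crude proxy for the cognitive complexity of a medical record
    (CCMR): compressed size of the record text, in bits."""
    return 8 * len(zlib.compress(record_text.encode("utf-8")))

def flag_high_complexity(records, cutoff_bits):
    """Patient IDs whose record exceeds an empirical complexity cutoff,
    analogous to flagging at-risk patients in real time."""
    return [pid for pid, text in records.items()
            if record_bits(text) > cutoff_bits]
```

Compression-based estimates undercount structure a clinician must actually track, but they capture the basic intuition that a longer, less redundant record carries more bits to hold in mind.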
Building an Ontology for Identity Resolution in Healthcare and Public Health
Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P.; Clyde, Stephen; Thornton, Sidney; Staes, Catherine
2015-01-01
Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Objectives: Our objectives were to: 1) identify components of identity and events subsequent to birth that result in creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. Methods: We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. Results: We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. Conclusion: The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage. PMID:26392849
Bruland, Philipp; McGilchrist, Mark; Zapletal, Eric; Acosta, Dionisio; Proeve, Johann; Askin, Scott; Ganslandt, Thomas; Doods, Justin; Dugas, Martin
2016-11-22
Data capture is one of the most expensive phases during the conduct of a clinical trial, and the increasing use of electronic health records (EHR) offers significant savings to clinical research. To facilitate these secondary uses of routinely collected patient data, it is beneficial to know what data elements are captured in clinical trials. Our aim here is therefore to determine the most commonly used data elements in clinical trials and their availability in hospital EHR systems. Case report forms for 23 clinical trials in differing disease areas were analyzed. Through an iterative and consensus-based process of medical informatics professionals from academia and trial experts from the European pharmaceutical industry, data elements were compiled for all disease areas, with special focus on the reporting of adverse events. Afterwards, data elements were identified and statistics acquired from hospital sites providing data to the EHR4CR project. The analysis identified 133 unique data elements. Fifty elements were congruent with a published data inventory for patient recruitment, and 83 new elements were identified for clinical trial execution, including adverse event reporting. Demographic and laboratory elements lead the list of available elements in hospital EHR systems. For the reporting of serious adverse events, only very few elements could be identified in the patient records. Common data elements in clinical trials have been identified and their availability in hospital systems elucidated. Several elements, often those related to reimbursement, are frequently available, whereas more specialized elements are ranked at the bottom of the data inventory list. Hospitals that want to obtain the benefits of reusing data for research from their EHR are now able to prioritize their efforts based on this common data element list.
A strong-motion database from the Central American subduction zone
NASA Astrophysics Data System (ADS)
Arango, Maria Cristina; Strasser, Fleur O.; Bommer, Julian J.; Hernández, Douglas A.; Cepeda, Jose M.
2011-04-01
Subduction earthquakes along the Pacific Coast of Central America generate considerable seismic risk in the region. The quantification of the hazard due to these events requires the development of appropriate ground-motion prediction equations, for which purpose a database of recordings from subduction events in the region is indispensable. This paper describes the compilation of a comprehensive database of strong ground-motion recordings obtained during subduction-zone events in Central America, focusing on the region from 8 to 14° N and 83 to 92° W, including Guatemala, El Salvador, Nicaragua and Costa Rica. More than 400 accelerograms recorded by the networks operating across Central America during the last decades have been added to data collected by NORSAR in two regional projects for the reduction of natural disasters. The final database consists of 554 triaxial ground-motion recordings from events of moment magnitudes between 5.0 and 7.7, including 22 interface and 58 intraslab-type events for the time period 1976-2006. Although the database presented in this study is not sufficiently complete in terms of magnitude-distance distribution to serve as a basis for the derivation of predictive equations for interface and intraslab events in Central America, it considerably expands the Central American subduction data compiled in previous studies and used in early ground-motion modelling studies for subduction events in this region. Additionally, the compiled database will allow the assessment of the existing predictive models for subduction-type events in terms of their applicability for the Central American region, which is essential for an adequate estimation of the hazard due to subduction earthquakes in this region.
Gradual onset and recovery of the Younger Dryas abrupt climate event in the tropics.
Partin, J W; Quinn, T M; Shen, C-C; Okumura, Y; Cardenas, M B; Siringan, F P; Banner, J L; Lin, K; Hu, H-M; Taylor, F W
2015-09-02
Proxy records of temperature from the Atlantic clearly show that the Younger Dryas was an abrupt climate change event during the last deglaciation, but records of hydroclimate are underutilized in defining the event. Here we combine a new hydroclimate record from Palawan, Philippines, in the tropical Pacific, with previously published records to highlight a difference between hydroclimate and temperature responses to the Younger Dryas. Although the onset and termination are synchronous across the records, tropical hydroclimate changes are more gradual (>100 years) than the abrupt (10-100 years) temperature changes in the northern Atlantic Ocean. The abrupt recovery of Greenland temperatures likely reflects changes in regional sea ice extent. Proxy data and transient climate model simulations support the hypothesis that freshwater forced a reduction in the Atlantic meridional overturning circulation, thereby causing the Younger Dryas. However, changes in ocean overturning may not produce the same effects globally as in Greenland.
Identification of major cardiovascular events in patients with diabetes using primary care data.
Pouwels, Koen Bernardus; Voorham, Jaco; Hak, Eelko; Denig, Petra
2016-04-02
Routine primary care data are increasingly being used for evaluation and research purposes but there are concerns about the completeness and accuracy of diagnoses and events captured in such databases. We evaluated how well patients with major cardiovascular disease (CVD) can be identified using primary care morbidity data and drug prescriptions. The study was conducted using data from 17,230 diabetes patients of the GIANTT database and Dutch Hospital Data register. To estimate the accuracy of the different measures, we analyzed the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) relative to hospitalizations and/or records with a diagnosis indicating major CVD, including ischaemic heart diseases and cerebrovascular events. Using primary care morbidity data, 43% of major CVD hospitalizations could be identified. Adding drug prescriptions to the search increased the sensitivity up to 94%. A proxy of at least one prescription of either a platelet aggregation inhibitor, vitamin K antagonist or nitrate could identify 85% of patients with a history of major CVD recorded in primary care, with an NPV of 97%. Using the same proxy, 57% of incident major CVD recorded in primary or hospital care could be identified, with an NPV of 99%. A substantial proportion of major CVD hospitalizations was not recorded in primary care morbidity data. Drug prescriptions can be used in addition to diagnosis codes to identify more patients with major CVD, and also to identify patients without a history of major CVD.
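As a reminder of how the reported accuracy measures relate to confusion-matrix counts, here is a minimal sketch; the counts are hypothetical, chosen only to echo the 94% sensitivity figure:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true cases correctly flagged
        "specificity": tn / (tn + fp),   # non-cases correctly cleared
        "ppv": tp / (tp + fp),           # flagged patients who are true cases
        "npv": tn / (tn + fn),           # cleared patients who are truly case-free
    }

# Hypothetical counts, not the study's data.
metrics = diagnostic_metrics(tp=94, fp=30, fn=6, tn=870)
print(round(metrics["sensitivity"], 2))  # -> 0.94
```

Note how a high NPV can coexist with a modest sensitivity when the condition is relatively rare, which is why the paper reports both.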
History of Fire Events in the U.S. Commercial Nuclear Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bijan Najafi; Joglar-Biloch, Francisco; Kassawara, Robert P.
2002-07-01
Over the past decade, interest in performance-based fire protection has increased within the nuclear industry. In support of this growing interest, in 1997 the Electric Power Research Institute (EPRI) developed a long-range plan to develop/improve data and tools needed to support Risk-Informed/Performance-Based fire protection. This plan calls for continued improvement in collection and use of information obtained from fire events at nuclear plants. The data collection process has the objectives of improving the insights gained from such data and reducing the uncertainty in fire risk and fire modeling methods in order to make them a more reliable basis for performance-based fire protection programs. In keeping with these objectives, EPRI continues to collect, review and analyze fire events in support of the nuclear industry. EPRI collects these records in cooperation with the Nuclear Electric Insurance Limited (NEIL), by compiling public fire event reports and by direct solicitation of U.S. nuclear facilities. The EPRI fire data collection project is based on the principle that the understanding of history is one of the cornerstones of improving fire protection technology and practice. Therefore, the goal has been to develop and maintain a comprehensive database of fire events with flexibility to support various aspects of fire protection engineering. With more than 1850 fire records over a period of three decades and 2400 reactor years, this is the most comprehensive database of nuclear power industry fire events in existence today. In general, the frequency of fires in the U.S. commercial nuclear industry remains constant. In a few cases, e.g., transient fires and fires in BWR offgas/recombiner systems, where either increasing or decreasing trends are observed, these trends tend to slow after 1980. The key issues in improving the quality of the data remain the consistency of recording and reporting of fire events and the difficulty of collecting records.
EPRI has made significant progress towards improving the quality of the fire events data through the use of multiple collection methods as well as its review and verification. To date EPRI has used this data to develop a generic fire ignition frequency model for the U.S. nuclear power industry (Ref. 1, 4 and 5) as well as to support other models within the EPRI Fire Risk Methods, such as a cable fire manual suppression model. EPRI will continue its effort to collect and analyze operating data to support risk-informed/performance-based fire safety engineering, including collection and analysis of impairment data for fire protection systems and features. This paper provides details on the collection and application of fire events data to risk-informed/performance-based fire protection. The paper also provides valuable insights into improving both the collection and use of fire events data. (authors)
High-Rate Data-Capture for an Airborne Lidar System
NASA Technical Reports Server (NTRS)
Valett, Susan; Hicks, Edward; Dabney, Philip; Harding, David
2012-01-01
A high-rate data system was required to capture the data for an airborne lidar system. A data system was developed that achieved up to 22 million (64-bit) events per second sustained data rate (1408 million bits per second), as well as short bursts (less than 4 s) at higher rates. All hardware used for the system was off the shelf, but carefully selected to achieve these rates. The system was used to capture laser fire, single-photon detection, and GPS data for the Slope Imaging Multi-polarization Photon-counting Lidar (SIMPL). However, the system has applications for other laser altimeter systems (waveform-recording), mass spectroscopy, x-ray radiometry imaging, high-background-rate ranging lidar, and other similar areas where very high-speed data capture is needed. The data capture software was used for the SIMPL instrument, which employs a micropulse, single-photon ranging measurement approach and has 16 data channels. The detected single photons are from two sources: those reflected from the target and solar background photons. The instrument is non-gated, so background photons are acquired for a range window of 13 km and can comprise many times the number of target photons. The highest background rate occurs when the atmosphere is clear, the Sun is high, and the target is a highly reflective surface such as snow. Under these conditions, the total data rate for the 16 channels combined is expected to be approximately 22 million events per second. For each photon detection event, the data capture software reads the relative time of receipt, with respect to a one-per-second absolute time pulse from a GPS receiver, from an event timer card with 0.1-ns precision, and records that information to a RAID (Redundant Array of Independent Disks) storage device. The relative time of laser pulse firings must also be read and recorded with the same precision. Each of the four event timer cards handles the throughput from four of the channels.
For each detection event, a flag is recorded that indicates the source channel. To accommodate the expected maximum count rate and also handle the other extreme of very low rates occurring during nighttime operations, the software requests a set amount of data from each of the event timer cards and buffers the data. The software notes if any of the cards did not return all the data requested and then accommodates that lower rate. The data is buffered to minimize the I/O overhead of writing the data to storage. Care was taken to optimize the reads from the cards, the speed of the I/O bus, and RAID configuration.
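The request-and-buffer strategy described above can be sketched in a few lines; this is a simplified in-memory model (simulated "cards" and storage), not the actual SIMPL software:

```python
import io

def capture(cards, request_size, buffer_limit, storage):
    """Request a fixed amount of data from each card, tolerate short
    reads during low-rate periods, and buffer before writing to storage
    so that writes are large and sequential (lower I/O overhead)."""
    buffer = bytearray()
    for card in cards:
        data = card.read(request_size)   # may return fewer bytes than asked
        buffer.extend(data)              # a short read simply yields less data
        if len(buffer) >= buffer_limit:
            storage.write(bytes(buffer)) # one large write to the RAID device
            buffer.clear()
    if buffer:
        storage.write(bytes(buffer))     # flush the remainder

# Four simulated "event timer cards", each holding 100 bytes of events.
cards = [io.BytesIO(bytes(100)) for _ in range(4)]
storage = io.BytesIO()
capture(cards, request_size=64, buffer_limit=128, storage=storage)
print(len(storage.getvalue()))  # -> 256
```

A real acquisition loop would poll the cards repeatedly; the sketch shows only one pass to make the short-read and buffering logic explicit.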
Meteorite Falls Observed in U.S. Weather Radar Data in 2015 and 2016 (To Date)
NASA Technical Reports Server (NTRS)
Fries, Marc; Fries, Jeffrey; Hankey, Mike; Matson, Robert
2016-01-01
To date, over twenty meteorite falls have been located in the weather radar imagery of the National Oceanic and Atmospheric Administration (NOAA)'s NEXRAD radar network. We present here the most prominent events recorded since the last Meteoritical Society meeting, covering most of 2015 and early 2016. Meteorite Falls: The following events produced evidence of falling meteorites in radar imagery and resulted in meteorites recovered at the fall site. Creston, CA (24 Oct 2015 0531 UTC): This event generated 218 eyewitness reports submitted to the American Meteor Society (AMS) and is recorded as event #2635 for 2015 on the AMS website. Witnesses reported a bright fireball with fragmentation terminating near the city of Creston, CA, north of Los Angeles. Sonic booms and electrophonic noise were reported in the vicinity of the event. Weather radar imagery records signatures consistent with falling meteorites in data from the KMUX, KVTX, KHNX and KVBX radars. The Meteoritical Society records the Creston fall as an L6 meteorite with a total recovered mass of 688 g. Osceola, FL (24 Jan 2016 1527 UTC): This daytime fireball generated 134 eyewitness reports as AMS event #266 for 2016, with one credible sonic boom report. The fireball traveled roughly NE to SW with a terminus location north of Lake City, FL in sparsely populated, forested countryside. Radar imagery shows distinct and prominent evidence of a significant meteorite fall, with radar signatures seen in data from the KJAX and KVAX radars. Searchers at the fall site found that recoveries were restricted to road sites by the difficult terrain, and yet several meteorites were recovered. Evidence indicates that this was a relatively large meteorite fall where most of the meteorites are unrecoverable due to terrain. Osceola is an L6 meteorite with 991 g total mass recovered to date. Mount Blanco, TX (18 Feb 2016 0343 UTC): This event produced only 39 eyewitness reports and is recorded as AMS event #635 for 2016.
No reports of sonic booms or electrophonic noise are recorded in the AMS eyewitness reports, but videos of the event show a relatively long-lasting fireball with fragmentation. Evidence of falling meteorites is seen in radar imagery from the KAMA and KLBB radars defining a roughly WNW to ESE trend with the dominant wind direction. This event featured favorable search ground composed mostly of farmland and ranchland and was extensively searched. Rather surprisingly, only a single L5 chondrite of 36.2g has been recovered to date.
Use of the Hadoop structured storage tools for the ATLAS EventIndex event catalogue
NASA Astrophysics Data System (ADS)
Favareto, A.
2016-09-01
The ATLAS experiment at the LHC collects billions of events each data-taking year, and processes them to make them available for physics analysis in several different formats. An even larger number of events is in addition simulated according to physics and detector models and then reconstructed and analysed to be compared to real events. The EventIndex is a catalogue of all events in each production stage; it includes for each event a few identification parameters, some basic non-mutable information coming from the online system, and the references to the files that contain the event in each format (plus the internal pointers to the event within each file for quick retrieval). Each EventIndex record is logically simple, but the system has to hold many tens of billions of records, all equally important. The Hadoop technology was selected at the start of the EventIndex project development in 2012 and proved to be robust and flexible enough to accommodate this kind of information; both the insertion and query response times are acceptable for the continuous and automatic operation that started in Spring 2015. This paper describes the EventIndex data input and organisation in Hadoop and explains the operational challenges that were overcome in order to achieve the expected performance.
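A toy illustration of the catalogue idea; the field names, run/event numbers, and file names below are invented, not the actual ATLAS EventIndex schema. Each record holds a few identification parameters plus, per format, the file and internal pointer needed for quick retrieval:

```python
# In-memory stand-in for the catalogue; the real system holds tens of
# billions of such records in Hadoop.
event_index = {}

def add_record(run, event, trigger, refs):
    """Store a few identification parameters and per-format references."""
    event_index[(run, event)] = {"trigger": trigger, "refs": refs}

def locate(run, event, fmt):
    """Return the (file, internal pointer) holding this event in a format."""
    return event_index[(run, event)]["refs"][fmt]

# Invented run/event numbers, trigger name and file names.
add_record(run=358031, event=1234567,
           trigger=["HLT_mu26"],
           refs={"AOD": ("data18.AOD.pool.root", 8812),
                 "DAOD": ("data18.DAOD.pool.root", 412)})
print(locate(358031, 1234567, "AOD"))  # -> ('data18.AOD.pool.root', 8812)
```

The point of the sketch is the access pattern: a lookup keyed on event identity returning a direct pointer into the right file, rather than a scan of the data itself.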
Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras
Harris, A.J.L.; Thornber, C.R.
1999-01-01
GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatic alerting of major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
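The spike criterion used above (radiance elevated more than two standard deviations above the mean of the time series) is easy to state precisely; here is a minimal sketch with synthetic radiance values, not GOES data:

```python
def radiance_spikes(series, nsigma=2.0):
    """Indices where the value exceeds the series mean by more than
    nsigma standard deviations (population std of the whole series)."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    return [i for i, x in enumerate(series) if x > mean + nsigma * std]

# Synthetic background radiance with one short-burst spike at index 8.
background = [1.0, 1.1, 0.9, 1.0, 1.1, 0.9, 1.0, 1.1]
print(radiance_spikes(background + [5.0] + background))  # -> [8]
```

In an operational setting the mean and standard deviation would typically be computed over a trailing window rather than the whole record, so that gradual flow-field trends do not inflate the threshold.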
Praet, Nore; Moernaut, Jasper; Van Daele, Maarten; Boes, Evelien; Haeussler, Peter J.; Strupler, Michael; Schmidt, Sabine; Loso, Michael G.; De Batist, Marc
2017-01-01
Sublacustrine landslide stratigraphy is considered useful for quantitative paleoseismology in low-seismicity settings. However, as the recharging of underwater slopes with sediments is one of the factors that governs the recurrence of slope failures, it is not clear if landslide deposits can provide continuous paleoseismic records in settings of frequent strong shaking. To test this, we selected three lakes in south-central Alaska that experienced a strong historical megathrust earthquake (the 1964 Mw9.2 Great Alaska Earthquake) and exhibit high sedimentation rates in their main basins (0.2-1.0 cm yr-1). We present high-resolution reflection seismic data (3.5 kHz) and radionuclide data from sediment cores in order to investigate factors that control the establishment of a reliable landslide record. Seismic stratigraphy analysis reveals the presence of several landslide deposits in the lacustrine sedimentary infill. Most of these landslide deposits can be attributed to specific landslide events, as multiple landslide deposits sourced from different lacustrine slopes occur on a single stratigraphic horizon. We identify numerous events in the lakes: Eklutna Lake proximal basin (14 events), Eklutna Lake distal basin (8 events), Skilak Lake (7 events) and Kenai Lake (7 events). The most recent event in each basin corresponds to the historic 1964 megathrust earthquake. All events are characterized by multiple landslide deposits, which hints at a regional trigger mechanism, such as an earthquake (the synchronicity criterion). This means that the landslide record in each basin represents a record of past seismic events. Based on extrapolation of sedimentation rates derived from radionuclide dating, we roughly estimate mean recurrence intervals in the Eklutna Lake proximal basin, Eklutna Lake distal basin, Skilak Lake and Kenai Lake at ~ 250 yrs, ~ 450 yrs, ~ 900 yrs and ~ 450 yrs, respectively.
This distinct difference in recording can be explained by variations in preconditioning factors like slope angle, slope recharging (sedimentation rate) and the sediment source area: faster slope recharging and a predominance of delta and alluvial fan failures increase the sensitivity and lower the intensity threshold for slope instability. Also, the seismotectonic setting of the lakes has to be taken into account. This study demonstrates that sublacustrine landslides in several Alaskan lakes can be used as reliable recorders of strong earthquake shaking when a multi-lake approach is used, and can enhance the temporal and spatial resolution of the paleoseismic record of south-central Alaska.
Identification and analysis of long duration low frequency events from microseismic data
NASA Astrophysics Data System (ADS)
Hu, H.; Li, A.
2016-12-01
Long duration low frequency (LDLF) earthquakes, which are commonly present in volcanic fields and subduction zones, have been observed in microseismic data. In this research, we have identified and located several LDLF events in a microseismic dataset acquired by surface receivers in the Eagle Ford Shale. The LDLF events are clearly identified on frequency-time plots, with central frequencies at 5-25 Hz and durations from tens of seconds up to 100 seconds. We pick the arrival times of the events using the envelopes of the filtered data and apply a grid search method to find the source locations. These events are located at depths around 1500 m, close to the horizontal treatment well for hydraulic fracturing. The associated phase arrivals show typical P-wave moveout trends. In addition, these events tend to migrate away from the horizontal well with time. Furthermore, these events are recorded only during the times when the rock is breaking according to the treating pressure records. Considering all these observations, we conclude that the observed LDLF events are caused by pressure changes related to fluid flow in fractures. The time-dependent source locations could have an important application in characterizing the fluid path inside fractures.
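The grid search described above can be sketched as follows; this is a simplified 2-D, constant-velocity illustration with invented geometry, not the authors' implementation:

```python
import math

def travel_time(src, rcv, v=3000.0):
    """Straight-ray travel time (coordinates in m, velocity in m/s)."""
    return math.dist(src, rcv) / v

def grid_search(receivers, picks, grid, v=3000.0):
    """Return the grid node whose predicted arrivals best fit the picked
    times in a least-squares sense; a constant shift per node absorbs
    the unknown origin time."""
    best, best_misfit = None, float("inf")
    for node in grid:
        pred = [travel_time(node, r, v) for r in receivers]
        shift = (sum(picks) - sum(pred)) / len(picks)
        misfit = sum((p - (t + shift)) ** 2 for p, t in zip(picks, pred))
        if misfit < best_misfit:
            best, best_misfit = node, misfit
    return best

# Surface receivers (x, depth) and a hypothetical source at 1500 m depth.
receivers = [(0.0, 0.0), (1000.0, 0.0), (2000.0, 0.0), (3000.0, 0.0)]
source = (1500.0, 1500.0)
picks = [travel_time(source, r) for r in receivers]
grid = [(float(x), float(z)) for x in range(0, 3001, 500)
                             for z in range(0, 3001, 500)]
print(grid_search(receivers, picks, grid))  # -> (1500.0, 1500.0)
```

A field implementation would use a 3-D grid, a layered velocity model, and picks from envelope functions rather than noise-free synthetic times.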
Viking-2 Seismometer Measurements on Mars: PDS Data Archive and Meteorological Applications
NASA Astrophysics Data System (ADS)
Lorenz, Ralph D.; Nakamura, Yosio; Murphy, James R.
2017-11-01
A data product has been generated and archived on the NASA Planetary Data System (Geosciences Node), which presents the seismometer readings of Viking Lander 2 in an easy-to-access form, for both the raw ("high rate") waveform records and the compressed ("event mode") amplitude and frequency records. In addition to the records themselves, a separate summary file for each instrument mode lists key statistics of each record together with the meteorological measurements made closest in time to the seismic record. This juxtaposition facilitates correlation of the seismometer instrument response to different meteorological conditions, or the selection of seismic data during which wind disturbances can be expected to be small. We summarize data quality issues and also discuss lander-generated seismic signals, due to operation of the sampling arm or other systems, which may be of interest for prospective missions to other bodies. We review wind-seismic correlation, the "Martian solar day (sol) 80" candidate seismic event, and identify the seismic signature of a probable dust devil vortex on sol 482: the seismometer data allow an estimate of the peak wind, occurring between coarsely spaced meteorology measurements. We present code to generate the plots in this paper to illustrate use of the data product.
NASA Astrophysics Data System (ADS)
Damaschke, Magret; Cronin, Shane J.; Bebbington, Mark S.
2018-01-01
Robust time-varying volcanic hazard assessments are difficult to develop, because they depend upon having a complete and extensive eruptive activity record. Missing events in eruption records are endemic, due to poor preservation or erosion of tephra and other volcanic deposits. Even with many stratigraphic studies, underestimation or overestimation of eruption numbers is possible due to mis-matching tephras with similar chemical compositions or problematic age models. It is also common to have gaps in event coverage due to sedimentary records not being available in all directions from the volcano, especially downwind. Here, we examine the sensitivity of probabilistic hazard estimates using a suite of four new and two existing high-resolution tephra records located around Mt. Taranaki, New Zealand. Previous estimates were made using only single, or two correlated, tephra records. In this study, tephra data from six individual sites in lake and peat bogs covering an arc of 120° downwind of the volcano provided an excellent temporal high-resolution event record. The new data confirm a previously identified semi-regular pattern of variable eruption frequency at Mt. Taranaki. Eruption intervals exhibit a bimodal distribution, with eruptions being an average of 65 years apart, and in 2% of cases, centuries separate eruptions. The long intervals are less common than seen in earlier studies, but they have not disappeared with the inclusion of our comprehensive new dataset. Hence, the latest long interval of quiescence, since AD 1800, is unusual, but not out of character with the volcano. The new data also suggest that one of the tephra records (Lake Rotokare) used in earlier work had an old carbon effect on age determinations. This shifted ages of the affected tephras so that they were not correlated to other sites, leading to an artificially high eruption frequency in the previous combined record. 
New modelled time-varying frequency estimates suggest a 33-42% probability of an explosive eruption from Mt. Taranaki in the next 50 years, which is significantly lower than suggested by previous studies. This work also demonstrates some of the pitfalls to be avoided in combining stratigraphic records for eruption forecasting.
The Use of Intensity Scales In Exploiting Tsunami Historical Databases
NASA Astrophysics Data System (ADS)
Barberopoulou, A.; Scheele, F.
2015-12-01
Post-disaster assessments for historical tsunami events (>15 years old) are either scarce or contain limited information. In this study, we assess ways to examine tsunami impacts by utilizing data from old events, but more importantly we examine how to best utilize information contained in tsunami historical databases in order to provide meaningful products that describe the impact of an event. As such, a tsunami intensity scale was applied to two historical events observed in New Zealand (one local and one distant), in order to utilize the largest possible number of observations in our dataset. This is especially important for countries like New Zealand, where the tsunami historical record is short, going back only to the 19th century, and where instrument recordings are only available for the most recent events. We found that, despite a number of challenges in using intensities (uncertainties partly due to limitations of historical event data), these data, with the help of GIS tools, can be used to produce hazard maps and offer an alternative way to exploit tsunami historical records. Most importantly, the assignment of intensities at each point of observation allows for the utilization of many more observations than if one depends on physical information alone, such as water heights. We hope these results may be used towards developing a well-defined methodology for hazard assessments, to refine our knowledge of past tsunami events for which the tsunami sources are largely unknown, and for cases when physical quantities describing the tsunami (e.g. water height, flood depth, run-up) are scarce.
Luan, Shiwei; Gude, Dana; Prakash, Punit; Warren, Steve
2014-01-01
Behavior tracking with severely disabled children can be a challenge, since dealing directly with a child's behavior is more immediately pressing than the need to record an event for tracking purposes. By the time a paraeducator ('para') is able to break away and record events, behavior counts can be forgotten. This paper presents a paraeducator glove design that can help to track behaviors with minimal distraction by allowing a paraeducator to touch their thumb to one of their other four fingers, where each finger represents a different behavior. Count data are packaged by a microcontroller board on the glove and then sent wirelessly to a smart phone via a Bluetooth Low Energy (BLE) link. A customized BLE profile was designed for this application to promote real-time recording. These data can be forwarded to a database for further analysis. This para glove design addresses the basic needs of a wearable device that employs BLE, including local data collection, BLE data transmission, and remote data recording. More functional sensors can be added to this platform to support other wearable scenarios.
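As an illustration of the kind of count packet the glove might send over BLE, here is a hypothetical payload layout; the field sizes and ordering are assumptions, not the paper's actual custom profile:

```python
import struct

# Hypothetical packet layout (an assumption, not the paper's profile):
# a little-endian 32-bit millisecond timestamp followed by one byte per
# finger count.
def pack_counts(timestamp_ms, counts):
    """counts: four behavior tallies, one per finger touched to the thumb."""
    return struct.pack("<I4B", timestamp_ms, *counts)

def unpack_counts(payload):
    timestamp_ms, *counts = struct.unpack("<I4B", payload)
    return timestamp_ms, list(counts)

packet = pack_counts(123456, [2, 0, 5, 1])    # 8-byte, BLE-friendly payload
print(unpack_counts(packet))  # -> (123456, [2, 0, 5, 1])
```

Keeping the payload to a few bytes matters here because a standard BLE characteristic carries only a small amount of data per notification.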
Leveraging Data Intensive Computing to Support Automated Event Services
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Freeman, Shawn M.; Kuo, Kwo-Sen
2012-01-01
A large portion of Earth Science investigations is phenomenon- or event-based, such as the studies of Rossby waves, mesoscale convective systems, and tropical cyclones. However, except for a few high-impact phenomena, e.g. tropical cyclones, comprehensive records are absent for the occurrences or events of these phenomena. Phenomenon-based studies therefore often focus on a few prominent cases while the lesser ones are overlooked. Without an automated means to gather the events, comprehensive investigation of a phenomenon is at least time-consuming if not impossible. An Earth Science event (ES event) is defined here as an episode of an Earth Science phenomenon. A cumulus cloud, a thunderstorm shower, a rogue wave, a tornado, an earthquake, a tsunami, a hurricane, or an El Niño is each an episode of a named ES phenomenon, and, from the small and insignificant to the large and potent, all are examples of ES events. An ES event has a finite duration and an associated geolocation as a function of time; it is therefore an entity in four-dimensional (4D) spatiotemporal space. The interests of Earth scientists typically rivet on Earth Science phenomena with the potential to cause massive economic disruption or loss of life, but broader scientific curiosity also drives the study of phenomena that pose no immediate danger. We generally gain understanding of a given phenomenon by observing and studying individual events - usually beginning by identifying the occurrences of these events. Once representative events are identified or found, we must locate associated observed or simulated data prior to commencing analysis and concerted studies of the phenomenon. Knowledge concerning the phenomenon can accumulate only after analysis has started. However, except for a few high-impact phenomena, such as tropical cyclones and tornadoes, finding events and locating associated data currently may take a prohibitive amount of time and effort on the part of an individual investigator.
And even for these high-impact phenomena, the availability of comprehensive records is still only a recent development. A major reason for the lack of comprehensive records for the majority of ES phenomena is the perception that they do not pose an immediate and/or severe threat to life and property and are thus not consistently tracked, monitored, and catalogued. Many phenomena even lack commonly accepted criteria for their definitions. However, the lack of comprehensive records is also due to the increasingly prohibitive volume of observations and model data that must be examined. The NASA Earth Observing System Data Information System (EOSDIS) alone archives several petabytes (PB) of satellite remote sensing data, and that volume steadily increases. All of these factors contribute to the difficulty of methodically identifying events corresponding to a given phenomenon and significantly impede systematic investigations. In the following we present a couple of motivating scenarios demonstrating the issues faced by Earth scientists studying ES phenomena.
Sjulson, Lucas; Miesenböck, Gero
2007-02-01
Optical imaging of physiological events in real time can yield insights into biological function that would be difficult to obtain by other experimental means. However, the detection of all-or-none events, such as action potentials or vesicle fusion events, in noisy single-trial data often requires a careful balance of tradeoffs. The analysis of such experiments, as well as the design of optical reporters and instrumentation for them, is aided by an understanding of the principles of signal detection. This review illustrates these principles, using as an example action potential recording with optical voltage reporters.
Event-Based Processing of Neutron Scattering Data
Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; ...
2015-09-16
Many of the world's time-of-flight spallation neutron sources are migrating to the recording of individual neutron events. This provides new opportunities in data processing, not least of which is the ability to filter events by correlating them with logs of the sample environment and other ancillary equipment. This paper will describe techniques for processing neutron scattering data acquired in event mode that preserve event information all the way to a final spectrum, including any necessary corrections or normalizations. This results in smaller final errors, while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques will be shown for comparison.
The Quaternary fossil-pollen record and global change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimm, E.C.
Fossil pollen provide one of the most valuable records of vegetation and climate change during the recent geological past. Advantages of the fossil-pollen record are that deposits containing fossil pollen are widespread, especially in areas having natural lakes, that fossil pollen occurs in continuous stratigraphic sequences spanning millennia, and that fossil pollen occurs in quantitative assemblages permitting a multivariate approach for reconstructing past vegetation and climates. Because of stratigraphic continuity, fossil pollen records climate cycles on a wide range of scales, from annual to the 100 ka Milankovitch cycles. Receiving particular emphasis recently are decadal to century scale changes, possible from the sediments of varved lakes, and late Pleistocene events on a 5-10 ka scale possibly correlating with the Heinrich events in the North Atlantic marine record or the Dansgaard-Oeschger events in the Greenland ice-core record. Researchers have long reconstructed vegetation and climate by qualitative interpretation of the fossil-pollen record. Recently, quantitative interpretation has developed with the aid of large fossil-pollen databases and sophisticated numerical models. In addition, fossil pollen are important climate proxy data for validating General Circulation Models, which are used for predicting the possible magnitude of future climate change. Fossil-pollen data also contribute to an understanding of ecological issues associated with global climate change, including questions of how and how rapidly ecosystems might respond to abrupt climate change.
Hanskamp-Sebregts, Mirelle; Zegers, Marieke; Vincent, Charles; van Gurp, Petra J; de Vet, Henrica C W; Wollersheim, Hub
2016-01-01
Objectives Record review is the most used method to quantify patient safety. We systematically reviewed the reliability and validity of adverse event detection with record review. Design A systematic review of the literature. Methods We searched PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Library from their inception through February 2015. We included all studies that aimed to describe the reliability and/or validity of record review. Two reviewers conducted data extraction. We pooled kappa values (κ) and analysed the differences in subgroups according to number of reviewers, reviewer experience and training level, adjusted for the prevalence of adverse events. Results In 25 studies, the psychometric data of the Global Trigger Tool (GTT) and the Harvard Medical Practice Study (HMPS) were reported, and 24 studies were included for statistical pooling. The inter-rater reliability of the GTT and HMPS showed a pooled κ of 0.65 and 0.55, respectively. The inter-rater agreement was statistically significantly higher when the group of reviewers within a study consisted of a maximum of five reviewers. We found no studies reporting on the validity of the GTT and HMPS. Conclusions The reliability of record review is moderate to substantial and improved when a small group of reviewers carried out record review. The validity of the record review method has never been evaluated, while clinical data registries, autopsy or direct observations of patient care are potential reference methods that can be used to test concurrent validity. PMID:27550650
Thorn, Joanna C; Turner, Emma L; Hounsome, Luke; Walsh, Eleanor; Down, Liz; Verne, Julia; Donovan, Jenny L; Neal, David E; Hamdy, Freddie C; Martin, Richard M; Noble, Sian M
2016-01-01
Objectives To evaluate the accuracy of routine data for costing inpatient resource use in a large clinical trial and to investigate costing methodologies. Design Final-year inpatient cost profiles were derived using (1) data extracted from medical records mapped to the National Health Service (NHS) reference costs via service codes and (2) Hospital Episode Statistics (HES) data using NHS reference costs. Trust finance departments were consulted to obtain costs for comparison purposes. Setting 7 UK secondary care centres. Population A subsample of 292 men identified as having died at least a year after being diagnosed with prostate cancer in Cluster randomised triAl of PSA testing for Prostate cancer (CAP), a long-running trial to evaluate the effectiveness and cost-effectiveness of prostate-specific antigen (PSA) testing. Results Both inpatient cost profiles showed a rise in costs in the months leading up to death, and were broadly similar. The difference in mean inpatient costs was £899, with HES data yielding ∼8% lower costs than medical record data (differences compatible with chance, p=0.3). Events were missing from both data sets. 11 men (3.8%) had events identified in HES that were all missing from medical record review, while 7 men (2.4%) had events identified in medical record review that were all missing from HES. The response from finance departments to requests for cost data was poor: only 3 of 7 departments returned adequate data sets within 6 months. Conclusions Using HES routine data coupled with NHS reference costs resulted in mean annual inpatient costs that were very similar to those derived via medical record review; therefore, routinely available data can be used as the primary method of costing resource use in large clinical trials. Neither HES nor medical record review represent gold standards of data collection. Requesting cost data from finance departments is impractical for large clinical trials. 
Trial registration number ISRCTN92187251; Pre-results. PMID:27130167
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexeyev, E. N., E-mail: alexeyev@ms2.inr.ac.r
A possible explanation of the time correlations between the data from underground detectors (Baksan telescope, LSD, IMB, Kamiokande II) and from the Rome and Maryland gravitational-wave antennas obtained during the Supernova 1987A explosion is proposed. It is shown that the synchronization of the events recorded by various underground facilities could be produced by gravitational radiation from the Supernova.
Use of Synchronized Phasor Measurements for Model Validation in ERCOT
NASA Astrophysics Data System (ADS)
Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill
2013-05-01
This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMU) have been installed in the ERCOT grid in recent years, phasor data with a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software "RTDMS"® enables ERCOT to analyze small signal stability conditions by monitoring the phase angles and oscillations. The recorded phasor data enable ERCOT to validate the existing dynamic models of conventional and/or wind generators.
A Hot-Deck Multiple Imputation Procedure for Gaps in Longitudinal Recurrent Event Histories
Wang, Chia-Ning; Little, Roderick; Nan, Bin; Harlow, Siobán D.
2012-01-01
Summary We propose a regression-based hot deck multiple imputation method for gaps of missing data in longitudinal studies, where subjects experience a recurrent event process and a terminal event. Examples are repeated asthma episodes and death, or menstrual periods and the menopause, as in our motivating application. Research interest concerns the onset time of a marker event, defined by the recurrent-event process, or the duration from this marker event to the final event. Gaps in the recorded event history make it difficult to determine the onset time of the marker event, and hence, the duration from onset to the final event. Simple approaches such as jumping gap times or dropping cases with gaps have obvious limitations. We propose a procedure for imputing information in the gaps by substituting information in the gap from a matched individual with a completely recorded history in the corresponding interval. Predictive Mean Matching is used to incorporate information on longitudinal characteristics of the repeated process and the final event time. Multiple imputation is used to propagate imputation uncertainty. The procedure is applied to an important data set for assessing the timing and duration of the menopausal transition. The performance of the proposed method is assessed by a simulation study. PMID:21361886
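The donor-matching step of the proposed procedure can be illustrated with a minimal sketch. The function name, the `(predicted_mean, observed_history)` donor representation, and the pool size `k` are all illustrative assumptions, not the authors' implementation; the sketch shows only the core idea of predictive mean matching, i.e. drawing a donor from the cases whose model predictions are closest to the recipient's.

```python
import random

def pmm_hot_deck(recipient_pred, donors, k=5, rng=None):
    """Predictive-mean-matching donor draw (illustrative sketch).

    recipient_pred: predicted mean for the case with a gap in its history.
    donors: list of (predicted_mean, observed_history) pairs taken from
            subjects with completely recorded histories in the interval.
    k: size of the matching pool of nearest donors.

    Returns the observed history of one donor drawn at random from the
    k donors whose predicted means are closest to the recipient's.
    """
    rng = rng or random.Random(0)
    # Rank donors by closeness of their predictions to the recipient's.
    pool = sorted(donors, key=lambda d: abs(d[0] - recipient_pred))[:k]
    # Random draw within the pool propagates imputation uncertainty
    # across multiple imputations.
    return rng.choice(pool)[1]
```

Repeating the draw with different random seeds yields the multiple imputations whose between-draw variability reflects the uncertainty in the gap.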
Gender Differences in Memory Processing: Evidence from Event-Related Potentials to Faces
ERIC Educational Resources Information Center
Guillem, F.; Mograss, M.
2005-01-01
This study investigated gender differences on memory processing using event-related potentials (ERPs). Behavioral data and ERPs were recorded in 16 males and 10 females during a recognition memory task for faces. The behavioral data results showed that females performed better than males. Gender differences on ERPs were evidenced over anterior…
Trend Detection and Bivariate Frequency Analysis for Nonstationary Rainfall Data
NASA Astrophysics Data System (ADS)
Joo, K.; Kim, H.; Shin, J. Y.; Heo, J. H.
2017-12-01
Multivariate frequency analysis has been developed for hydro-meteorological data such as rainfall, flood, and drought. In particular, the copula has been a useful tool for multivariate probability modeling because it places no limitation on the choice of marginal distributions. Time-series rainfall data can be partitioned into rainfall events by the inter-event time definition (IETD), and each rainfall event has a rainfall depth and a rainfall duration. In addition, nonstationarity in rainfall events has been studied recently due to climate change, and trend detection is important for determining whether the data are nonstationary. Using the rainfall depth and duration of each rainfall event, trend detection and nonstationary bivariate frequency analysis were performed in this study. Hourly data recorded over 30 years at 62 stations of the Korea Meteorological Association (KMA) were used, and the suitability of a nonstationary copula for rainfall events was examined by goodness-of-fit tests.
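A common concrete choice for jointly modeling rainfall depth and duration is the bivariate Gumbel copula; the sketch below evaluates its CDF. This is an illustrative example of the copula approach in general, not the specific model or parameters fitted in the study above.

```python
import math

def gumbel_copula(u, v, theta):
    """Bivariate Gumbel copula CDF C(u, v) for 0 < u, v < 1.

    theta >= 1 controls upper-tail dependence; theta = 1 reduces to
    the independence copula C(u, v) = u * v. The marginals of depth
    and duration can be chosen freely and plugged in as u = F_D(d),
    v = F_T(t), which is the copula's key advantage.
    """
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

# Joint non-exceedance probability of a rainfall event whose depth
# and duration each sit at their marginal medians, under dependence:
p = gumbel_copula(0.5, 0.5, theta=2.0)
```

With `theta = 1.0` the call returns exactly `0.5 * 0.5 = 0.25`; larger `theta` raises the joint probability, reflecting positive dependence between depth and duration.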
NASA Astrophysics Data System (ADS)
Yenier, E.; Baturan, D.; Karimi, S.
2016-12-01
Monitoring of seismicity related to oil and gas operations is routinely performed nowadays using a number of different surface and downhole seismic array configurations and technologies. Here, we provide a hydraulic fracture (HF) monitoring case study that compares the data set generated by a sparse local surface network of broadband seismometers to a data set generated by a single downhole geophone string. Our data were collected during a 5-day single-well HF operation by a temporary surface network consisting of 10 stations deployed within 5 km of the production well. The downhole data were recorded by a 20-geophone string deployed in an observation well located 15 m from the production well. Surface network data processing included standard STA/LTA event triggering enhanced by template-matching subspace detection, grid-search location improved using the double-differencing relocation technique, as well as Richter (ML) and moment (Mw) magnitude computations for all detected events. In addition, moment tensors were computed from first-motion polarities and amplitudes for the subset of highest-SNR events. The resulting surface event catalog shows a very weak spatio-temporal correlation to HF operations, with only 43% of recorded seismicity occurring during HF stage times. This, along with the source mechanisms, shows that the surface-recorded seismicity delineates the activation of several pre-existing structures striking NNE-SSW, consistent with regional stress conditions as indicated by the orientation of SHmax. Comparison of the sparse-surface and single downhole string datasets allows us to perform a cost-benefit analysis of the two monitoring methods. Our findings show that although the downhole array recorded ten times as many events, the surface network provides a more coherent delineation of the underlying structure and more accurate magnitudes for larger magnitude events.
We attribute this to the enhanced focal coverage provided by the surface network and the use of broadband instrumentation. The results indicate that sparse surface networks of high quality instruments can provide rich and reliable datasets for evaluation of the impact and effectiveness of hydraulic fracture operations in regions with favorable surface noise, local stress and attenuation characteristics.
Shiogama, Hideo; Imada, Yukiko; Mori, Masato; ...
2016-08-07
Here, we describe two unprecedentedly large (100-member), long-term (61-year) ensembles based on MRI-AGCM3.2, which were driven by historical and non-warming climate forcing. These ensembles comprise the "Database for Policy Decision making for Future climate change (d4PDF)". We compare these ensembles to large ensembles based on another climate model, as well as to observed data, to investigate the influence of anthropogenic activities on historical changes in the numbers of record-breaking events, including: the annual coldest daily minimum temperature (TNn), the annual warmest daily maximum temperature (TXx) and the annual most intense daily precipitation event (Rx1day). These two climate model ensembles indicate that human activity has already had statistically significant impacts on the number of record-breaking extreme events worldwide, mainly over Northern Hemisphere land. Specifically, human activities have altered the likelihood that a wider area globally would suffer record-breaking TNn, TXx and Rx1day events than that observed over the 2001-2010 period by a factor of at least 0.6, 5.4 and 1.3, respectively. However, we also find that the estimated spatial patterns and amplitudes of anthropogenic impacts on the probabilities of record-breaking events are sensitive to the climate model and/or natural-world boundary conditions used in the attribution studies.
Proxy records of Holocene storm events in coastal barrier systems: Storm-wave induced markers
NASA Astrophysics Data System (ADS)
Goslin, Jérôme; Clemmensen, Lars B.
2017-10-01
Extreme storm events in the coastal zone are one of the main forcing agents of short-term coastal system behavior. As such, storms represent a major threat to human activities concentrated along the coasts worldwide. In order to better understand the frequency of extreme events like storms, climate science must rely on longer-time records than the century-scale records of instrumental weather data. Proxy records of storm-wave or storm-wind induced activity in coastal barrier system deposits have been widely used worldwide in recent years to document past storm events during the last millennia. This review provides a detailed state-of-the-art compilation of the proxies available from coastal barrier systems to reconstruct Holocene storm chronologies (paleotempestology). The present paper aims (i) to describe the erosional and depositional processes caused by storm-wave action in barrier and back-barrier systems (i.e. beach ridges, storm scarps and washover deposits), (ii) to understand how storm records can be extracted from barrier and back-barrier sedimentary bodies using stratigraphical, sedimentological, micro-paleontological and geochemical proxies and (iii) to show how to obtain chronological control on past storm events recorded in the sedimentary successions. The challenges that paleotempestology studies still face in the reconstruction of representative and reliable storm chronologies using these various proxies are discussed, and future research prospects are outlined.
NASA Astrophysics Data System (ADS)
Ramsey, M.; Nytch, C. J.; Branoff, B.
2016-12-01
Socio-hydrological studies that explore feedbacks between social and biophysical processes related to flood risk can help managers identify strategies that increase a community's freshwater security. However, knowledge uncertainty due to coarse spatio-temporal coverage of hydrological monitoring data, missing riverine discharge and precipitation records, assumptions of flood risk models, and effects of urbanization, can limit the ability of these studies to isolate hydrological responses to social drivers of flooding and a changing climate. Local experiential knowledge can provide much needed information about 1) actual flood spatio-temporal patterns, 2) human impacts and perceptions of flood events, and 3) mechanisms to validate flood risk studies and understand key social elements of the system. We addressed these knowledge gaps by comparing the location and timing of flood events described in resident interviews and resident drawn maps (total = 97) from two San Juan communities with NOAA and USGS precipitation and riverine discharge data archives, and FEMA flood maps. Analyses of five focal flood events revealed 1) riverine monitoring data failed to record a major flood event caused by localized blockage of the river, 2) residents did not mention multiple extreme riverine discharge events, 3) resident and FEMA flood maps matched closely but resident maps provided finer spatial information about frequency of flooding, and 4) only a small percentage of residents remembered the dates of flood events. Local knowledge provided valuable social data about flood impacts on human economic and physical/psychological wellbeing, perceptions about factors causing flooding, and what residents use as sources of flood information. A simple mechanism or tool for residents to record their flood experiences in real-time will address the uncertainties in local knowledge and improve social memory. 
The integration of local experiential knowledge with simulated and empirical hydro-meteorological data can be a powerful approach to increase the quality of socio-hydrological studies about flooding and freshwater security.
NASA Astrophysics Data System (ADS)
Timi, Purnota Hannan; Shermin, Saima; Rahman, Asifur
2017-06-01
The flight data recorder is one of the most important sources of flight data in the event of an aviation disaster; it records a wide range of flight parameters including altitude, airspeed, and heading, and also helps in monitoring and analyzing aircraft performance. The cockpit voice recorder records radio microphone transmissions and sounds in the cockpit. These devices help investigators find and understand the root causes of aircraft crashes and help in building better aircraft systems and technical solutions to prevent similar crashes in the future, leading to improvements in the safety of aircraft and passengers. Other devices also enhance aircraft safety and assist in emergency or catastrophic situations. This paper discusses the concepts of the Flight Data Recorder (FDR), Cockpit Voice Recorder (CVR), Underwater Locator Beacon (ULB), data logger, and FLARM collision avoidance system for aircraft, and their applications in aviation.
NASA Pioneer: Venus reverse playback telemetry program TR 78-2
NASA Technical Reports Server (NTRS)
Modestino, J. W.; Daut, D. G.; Vickers, A. L.; Matis, K. R.
1978-01-01
During the entry of the Pioneer Venus Atmospheric Probes into the Venus atmosphere, there were several events (RF blackout and data rate changes) which caused the ground receiving equipment to lose lock on the signal. This caused periods of data loss immediately following each one of these disturbing events, lasting until all the ground receiving units (receiver, subcarrier demodulator, symbol synchronizer, and sequential decoder) acquired lock once more. A scheme to recover these data by off-line data processing was implemented. This scheme consisted of receiving the S-band signals from the probes with an open-loop receiver (requiring no lock-up on the signal) in parallel with the closed-loop receivers of the real-time receiving equipment, down-converting the signals to baseband, and recording them on an analog recorder. The off-line processing consisted of playing the analog recording in the reverse direction (starting with the end of the tape), up-converting the signal to S-band, feeding the signal into the "real time" receiving system, and recording the soft decisions from the symbol synchronizer on digital tape.
Myrow, P.M.; Strauss, J.V.; Creveling, J.R.; Sicard, K.R.; Ripperdan, R.; Sandberg, C.A.; Hartenfels, S.
2011-01-01
New carbon isotopic data from upper Famennian deposits in the western United States reveal two previously unrecognized major positive isotopic excursions. The first is an abrupt ~3‰ positive excursion, herein referred to as ALFIE (A Late Famennian Isotopic Excursion), recorded in two sections of the Pinyon Peak Limestone of north-central Utah. Integration of detailed chemostratigraphic and biostratigraphic data suggests that ALFIE is the Laurentian record of the Dasberg Event, which has been linked to transgression in Europe and Morocco. Sedimentological data from the Chaffee Group of western Colorado also record transgression at a similar biostratigraphic position, with a shift from restricted to open-marine lithofacies. ALFIE is not evident in chemostratigraphic data from age-equivalent strata in Germany studied herein and in southern Europe, either because it is a uniquely North American phenomenon, or because the German sections are too condensed relative to those in Laurentia. A second positive carbon isotopic excursion from the upper Chaffee Group of Colorado is recorded in transgressive strata deposited directly above a previously unrecognized paleokarst interval. The age of this excursion, and the duration of the associated paleokarst hiatus, are not well constrained, although the events occurred sometime after the Late Famennian Middle expansa Zone. The high positive values recorded in this excursion are consistent with those associated with the youngest Famennian Middle to Late praesulcata Hangenberg Isotopic Excursion in Europe, the isotopic expression of the Hangenberg Event, which included mass extinction, widespread black shale deposition, and a glacio-eustatic fall and rise. If correct, this would considerably revise the age of the Upper Chaffee Group strata of western Colorado. © 2011 Elsevier B.V.
Data-Driven Information Extraction from Chinese Electronic Medical Records
Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q.
2015-01-01
Objective This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Materials and Methods Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. Results The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. Discussion In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). Conclusions The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica. PMID:26295801
Data-Driven Information Extraction from Chinese Electronic Medical Records.
Xu, Dong; Zhang, Meizhuo; Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q
2015-01-01
This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica.
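The Normalized Google Distance used by the NGD SVM above has a standard closed form based on term frequencies. The sketch below implements that standard formula from raw counts; the function name and the count-based interface are assumptions for illustration, and the paper's exact variant (applied to medical event and description terms in the EMR corpus) may differ.

```python
import math

def ngd(fx, fy, fxy, n):
    """Normalized Google Distance between two terms x and y.

    fx, fy: number of documents (or records) containing x and y;
    fxy:    number containing both; n: total corpus size.
    Returns 0 when the terms always co-occur with equal frequency,
    and grows as their co-occurrence becomes rarer relative to
    their individual frequencies.
    """
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(n) - min(lx, ly))

# Terms that always appear together are at distance 0; a smaller
# distance indicates a stronger event-description correlation.
d_same = ngd(1000, 1000, 1000, 10**6)
d_rare = ngd(1000, 1000, 10, 10**6)
```

A classifier such as an SVM can then use these distances as features to score candidate event-description pairs.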
Benchmarking dairy herd health status using routinely recorded herd summary data.
Parker Gaddis, K L; Cole, J B; Clay, J S; Maltecca, C
2016-02-01
Genetic improvement of dairy cattle health through the use of producer-recorded data has been determined to be feasible. Low estimated heritabilities indicate that genetic progress will be slow. Variation observed in lowly heritable traits can largely be attributed to nongenetic factors, such as the environment. More rapid improvement of dairy cattle health may be attainable if herd health programs incorporate environmental and managerial aspects. More than 1,100 herd characteristics are regularly recorded on farm test-days. We combined these data with producer-recorded health event data, and parametric and nonparametric models were used to benchmark herd and cow health status. Health events were grouped into 3 categories for analyses: mastitis, reproductive, and metabolic. Both herd incidence and individual incidence were used as dependent variables. Models implemented included stepwise logistic regression, support vector machines, and random forests. At both the herd and individual levels, random forest models attained the highest accuracy for predicting health status in all health event categories when evaluated with 10-fold cross-validation. Accuracy (SD) ranged from 0.61 (0.04) to 0.63 (0.04) when using random forest models at the herd level. Accuracy of prediction (SD) at the individual cow level ranged from 0.87 (0.06) to 0.93 (0.001) with random forest models. Highly significant variables and key words from logistic regression and random forest models were also investigated. All models identified several of the same key factors for each health event category, including movement out of the herd, size of the herd, and weather-related variables. We concluded that benchmarking health status using routinely collected herd data is feasible. Nonparametric models were better suited to handle this complex data with numerous variables. 
These data mining techniques were able to perform prediction of health status and could add evidence to personal experience in herd management. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Love, Jeffrey J.
2012-01-01
Statistical analysis is made of rare, extreme geophysical events recorded in historical data, counting the number of events $k$ with sizes that exceed chosen thresholds during specific durations of time $\tau$. Under transformations that stabilize data and model-parameter variances, the most likely Poisson-event occurrence rate, $k/\tau$, applies for frequentist inference and, also, for Bayesian inference with a Jeffreys prior that ensures posterior invariance under changes of variables. Frequentist confidence intervals and Bayesian (Jeffreys) credibility intervals are approximately the same and easy to calculate: $(1/\tau)[(\sqrt{k} - z/2)^{2}, (\sqrt{k} + z/2)^{2}]$, where $z$ is a parameter that specifies the width, $z=1$ ($z=2$) corresponding to $1\sigma$, $68.3\%$ ($2\sigma$, $95.4\%$). If only a few events have been observed, as is usually the case for extreme events, then these "error-bar" intervals might be considered to be relatively wide. From historical records, we estimate most likely long-term occurrence rates, 10-yr occurrence probabilities, and intervals of frequentist confidence and Bayesian credibility for large earthquakes, explosive volcanic eruptions, and magnetic storms.
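The interval formula quoted in the abstract is simple enough to compute directly. The sketch below evaluates it for an assumed example count and record length (the function name and the example numbers are illustrative, not taken from the paper).

```python
import math

def poisson_rate_interval(k, tau, z=1.0):
    """Approximate confidence/credibility interval for a Poisson
    occurrence rate, per the variance-stabilized form
    (1/tau) * [(sqrt(k) - z/2)^2, (sqrt(k) + z/2)^2].

    k:   number of events exceeding the threshold.
    tau: duration of the historical record (e.g. years).
    z:   interval width; z=1 gives ~68.3%, z=2 gives ~95.4%.
    """
    low = max(math.sqrt(k) - z / 2.0, 0.0) ** 2 / tau
    high = (math.sqrt(k) + z / 2.0) ** 2 / tau
    return low, high

# Hypothetical example: 4 extreme events in a 100-year record.
# Most likely rate is k/tau = 0.04 events per year; the 1-sigma
# interval is (1.5^2 / 100, 2.5^2 / 100) = (0.0225, 0.0625).
low, high = poisson_rate_interval(4, 100.0, z=1.0)
```

The `max(..., 0.0)` clamp handles small $k$, where $\sqrt{k} - z/2$ can go negative; the resulting lower bound of zero matches the intuition that very few observed events cannot rule out a vanishing rate.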
Schweizer, C; Ramseyer, A; Gerber, V; Christen, G; Burger, D; Wohlfender, F D
2016-11-01
Racetrack injuries are of welfare concern and the prevention of injuries is an important goal in many racing jurisdictions. Over the years this has led to more detailed recording of clinical events on racecourses. However, risk factor analyses of clinical events at race meetings have not been previously reported for Switzerland. The aim was to identify discipline-specific factors that influence the occurrence of clinical events during race meetings, with the ultimate goal of improving the monitoring and safety of racetracks in Switzerland and optimising racehorse welfare. This was a retrospective study of horse race data collected by the Swiss horse racing association. All race starts (n = 17,670, including 6198 flat, 1257 obstacle and 10,215 trot race starts) recorded over a period of 4 years (2009-2012) were analysed in multivariable mixed effect logistic regression models including horse- and racecourse-related data. The models were designed to identify discipline-specific factors influencing the occurrence of clinical events on racecourses in Switzerland. Factors influencing the risk of clinical events during races differed for each discipline. The risk of a clinical event in trot racing was lower for racing on a Porphyre sand track than on grass tracks. Horses whose driver was also their trainer had an approximately 2-fold higher risk of clinical events. In obstacle races, longer distances (2401-3300 m and 3301-5400 m, respectively) had a protective effect compared with racing over shorter distances. In flat racing, 5 racecourses reported significantly fewer clinical events. In all 3 disciplines, finishing in 8th place or later was associated with clinical events. Changes in management that aim to improve the safety and welfare of racehorses, such as racetrack adaptations, need to be individualised for each discipline.
The data acquisition of OLGA II; An application of the PSI TANDEM system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jost, D.T.; Vermeulen, D.
1992-04-01
This paper describes the data acquisition of the on-line gas-chemistry apparatus (OLGA II). OLGA II is used to investigate the chemical behavior of volatile molecules of short-lived isotopes. Special emphasis is given to the presentation of the PSI tandem data acquisition system used in these experiments. Time-stamped event-by-event recording is used to follow radioactive decay chains.
49 CFR 229.135 - Event recorders.
Code of Federal Regulations, 2010 CFR
2010-10-01
... an event recorder with a certified crashworthy event recorder memory module that meets the... certified crashworthy event recorder memory module that meets the requirements of Appendix D of this part. The certified event recorder memory module shall be mounted for its maximum protection. (Although...
Index files for Belle II - very small skim containers
NASA Astrophysics Data System (ADS)
Sevior, Martin; Bloomfield, Tristan; Kuhr, Thomas; Ueda, I.; Miyake, H.; Hara, T.
2017-10-01
The Belle II experiment[1] employs the root file format[2] for recording data and is investigating the use of “index-files” to reduce the size of data skims. These files contain pointers to the location of interesting events within the total Belle II data set and reduce the size of data skims by 2 orders of magnitude. We implement this scheme on the Belle II grid by recording the parent file metadata and the event location within the parent file. While the scheme works, it is substantially slower than a normal sequential read of standard skim files using default root file parameters. We investigate the performance of the scheme by adjusting the “splitLevel” and “autoflushsize” parameters of the root files in the parent data files.
NASA Astrophysics Data System (ADS)
Erhardt, T.; Capron, E.; Rasmussen, S.; Schuepbach, S.; Bigler, M.; Fischer, H.
2017-12-01
During the last glacial period, proxy records throughout the Northern Hemisphere document a succession of rapid millennial-scale warming events, called Dansgaard-Oeschger (DO) events. Marine proxy records from the Atlantic also reveal that some of the warming events were preceded by large ice-rafting events, referred to as Heinrich events. Different mechanisms have been proposed that can produce DO-like warming in model experiments; however, the progression and plausible trigger of the events, and their possible interplay with the Heinrich events, are still unknown. Because of their fast nature, the progression is challenging to reconstruct from paleoclimate data due to the temporal resolution achievable in many archives and cross-dating uncertainties between records. We use new high-resolution multi-proxy records of sea-salt and terrestrial aerosol concentrations over the period 10-60 ka from two Greenland deep ice cores, in conjunction with local precipitation and temperature proxy records from one of the cores, to investigate the progression of environmental changes at the onset of the individual warming events. The timing differences are then used to explore whether the DO warming events that terminate Heinrich stadials progressed differently from those that terminate non-Heinrich stadials. Our analysis indicates no difference in the progression of the warming terminating Heinrich stadials and non-Heinrich stadials. Combining the evidence from all warming events in the period, our analysis shows a consistent lead of the changes in both local precipitation and terrestrial dust aerosol concentrations over the change in sea-salt aerosol concentrations and local temperature by approximately one decade. This implies that both the moisture transport to Greenland and the intensity of the Asian winter monsoon changed before the sea-ice cover in the North Atlantic was reduced, rendering a collapse of the sea-ice cover unlikely as a trigger for the DO events.
The ATLAS EventIndex: data flow and inclusion of other metadata
NASA Astrophysics Data System (ADS)
Barberis, D.; Cárdenas Zárate, S. E.; Favareto, A.; Fernandez Casani, A.; Gallas, E. J.; Garcia Montoro, C.; Gonzalez de la Hoz, S.; Hrivnac, J.; Malon, D.; Prokoshin, F.; Salt, J.; Sanchez, J.; Toebbicke, R.; Yuan, R.; ATLAS Collaboration
2016-10-01
The ATLAS EventIndex is the catalogue of the event-related metadata for the information collected from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event as well as trigger decision information. The main use case for the EventIndex is event picking, as well as data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data management system and information on production jobs from the ATLAS production system. The ATLAS production system is also used for the collection of event information from the Grid jobs. EventIndex developments started in 2012 and in the middle of 2015 the system was commissioned and started collecting event metadata, as a part of ATLAS Distributed Computing operations.
Extreme event statistics in a drifting Markov chain
NASA Astrophysics Data System (ADS)
Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur
2017-07-01
We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
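A minimal simulation of record counting in a drifting chain might look as follows. The Gaussian step distribution here is an assumption chosen for illustration, not the exponential distance distribution the authors extract from their atomic traces:

```python
import random

def count_records(walk):
    """Number of record values (new running maxima) in a trajectory."""
    records, best = 0, float("-inf")
    for x in walk:
        if x > best:
            records, best = records + 1, x
    return records

def random_walk(n, drift=0.0, seed=None):
    """One realization of a drifting chain with Gaussian steps."""
    rng = random.Random(seed)
    pos, walk = 0.0, []
    for _ in range(n):
        pos += rng.gauss(drift, 1.0)
        walk.append(pos)
    return walk

# mean record count over many drift-free walks; for a symmetric walk of
# n steps this grows roughly like sqrt(n)
mean_records = sum(count_records(random_walk(100, seed=s))
                   for s in range(500)) / 500
```

Re-running with a nonzero `drift` shows the sensitivity of record statistics to even small drifts that the abstract describes.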
Serebruany, Victor L; Atar, Dan
2012-09-01
Central adjudication in randomised controlled outcome-driven trials represents a traditional approach to maintain data integrity by applying uniformed rules for assessment of clinical events. It was the purpose of this investigation to determine the patterns of myocardial infarction (MI) adjudication in the TRITON, RECORD, and PLATO trials. We were matching centrally-adjudicated MI's (CAMI's) from the official trial publication with the site-reported MI (SRMI's) count from the Food and Drug Administration's secondary analyses for the investigational compounds prasugrel (TRITON), rosiglitazone (RECORD), and ticagrelor (PLATO). CAMI numbers showed a remarkable discrepancy to SRMI's by more than a doubling of the difference: from 72 to 145 events in TRITON favoring prasugrel (from a hazard ratio [HR]=0.76, p=0.08; to a HR=0.76, p<0.001), and from 44 to 89 events in favour of ticagrelor in PLATO (from a HR=0.94, p=0.095; to a HR=0.84, p<0.001). In contrast, in the RECORD trial, the CAMI count was less than the SRMI count (from 24 to 8 events, from a HR=1.42, p=0.93; to a HR=1.14, p=0.96), in this case diminishing cardiovascular hazards in favour of rosiglitazone. In conclusion, central adjudication in the TRITON, the RECORD, and the PLATO trial turned out to have a critical impact on study outcomes. Trial publications should in the future include site-reported major efficacy and safety endpoints to preserve data integrity. The regulatory authorities should consider independent audits when there is a major disagreement between centrally adjudicated and site reported events influencing the results of a major clinical trial.
Distributed Data Collection for the ATLAS EventIndex
NASA Astrophysics Data System (ADS)
Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.
2015-12-01
The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then is unpacked, sorted and reformatted in order to be stored and catalogued into a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture to convey this information from the running jobs through the messaging system to the Hadoop server.
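The Producer/Consumer flow can be sketched with an in-process queue standing in for the ActiveMQ broker and a plain dict standing in for the Hadoop catalogue. The payload fields and names below are illustrative, not the actual EventIndex message schema:

```python
import json
import queue

broker = queue.Queue()  # stands in for the central ActiveMQ broker

def producer(job_id, events):
    # each job packs one snippet of (file GUID, in-file pointer) pairs
    # per permanent output file and sends it to the broker
    snippet = {"job": job_id,
               "events": [{"guid": g, "offset": o} for g, o in events]}
    broker.put(json.dumps(snippet))

def consumer(catalogue, n_messages):
    # unpack, sort and store the event records, as the central
    # Hadoop-side consumer would
    for _ in range(n_messages):
        snippet = json.loads(broker.get())
        for rec in sorted(snippet["events"], key=lambda r: r["offset"]):
            catalogue.setdefault(rec["guid"], []).append(rec["offset"])

catalogue = {}
producer("job-1", [("file-A", 12), ("file-A", 3)])
producer("job-2", [("file-B", 7)])
consumer(catalogue, n_messages=2)
# catalogue now maps file GUIDs to sorted in-file event offsets
```

The real system decouples the two sides across the Grid and CERN; the sketch only shows the pack/transfer/unpack-sort-store shape of the pipeline.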
Site effects in Avcilar, West of Istanbul, Turkey, from strong- and weak-motion data
Ozel, O.; Cranswick, E.; Meremonte, M.; Erdik, M.; Safak, E.
2002-01-01
Approximately 1000 people were killed in the collapse of buildings in Istanbul, Turkey, during the 17 August 1999 İzmit earthquake, whose epicenter was roughly 90 km east of the city. Most of the fatalities and damage occurred in the suburb of Avcilar, which is 20 km further west of the epicenter than the city proper. To investigate this pattern of damage, the U.S. Geological Survey, in cooperation with the Kandilli Observatory and Earthquake Research Institute (KOERI), deployed portable digital seismographs at seven free-field sites in western Istanbul to record aftershocks during the period from 24 August to 2 September. The primary objective of this deployment was to study the site effects by comparing the aftershock ground motions recorded at sites inside and outside the damaged area, and to correlate site effects with the distribution of the damaged buildings. In addition to the weak-motion data, mainshock and aftershock acceleration records from the KOERI permanent strong-motion array were also used in estimating the site effects. Site effects were estimated using S waves from both types of records. For the weak-motion data set, 22 events were selected according to the criteria of signal-to-noise (S/N) ratio and the number of stations recording the same event. The magnitudes of these events ranged from 3.0 to 5.2. The acceleration data set consisted of 12 events with magnitudes ranging from 4.3 to 5.8 and included two mainshock events. Results show that the amplifying frequency band is, in general, less than 4 Hz, and the physical properties of the geologic materials are capable of amplifying the motions by a factor of 5-10. In this frequency band, there is good agreement among the spectral ratios obtained from the two mainshocks and their aftershocks. The damage pattern for the 17 August İzmit earthquake is determined by several factors.
However, our study suggests that the site effects in Avcilar played an important role in contributing to the damage.
NASA Astrophysics Data System (ADS)
Barnet, J.; Littler, K.; Kroon, D.; Leng, M. J.; Westerhold, T.; Roehl, U.; Zachos, J. C.
2017-12-01
The "greenhouse" world of the latest Cretaceous-Early Paleogene (~70-34 Ma) was characterised by multi-million year variability in climate and the carbon cycle. Throughout this interval the pervasive imprint of orbital cyclicity, particularly eccentricity and precession, is visible in elemental and stable isotope data obtained from multiple deep-sea sites. Periodic "hyperthermal" events, occurring largely in step with these orbital cycles, have proved particularly enigmatic, and may be the closest, albeit imperfect, analogues for anthropogenic climate change. This project utilises CaCO3-rich marine sediments recovered from ODP Site 1262 at a paleo-depth of 3600 m on the Walvis Ridge, South Atlantic, of late Maastrichtian-mid Paleocene age (~67-60 Ma). We have derived high-resolution (2.5-4 kyr) carbon and oxygen isotope data from the epifaunal benthic foraminifera species Nuttallides truempyi. Combining the new record with the existing Late Paleocene-Early Eocene record generated from the same site by Littler et al. (2014) yields a single-site reference curve detailing 13.5 million years of orbital cyclicity in paleoclimate and the carbon cycle, from the latest Cretaceous to near the peak warmth of the Early Paleogene greenhouse. Spectral analysis of this new combined dataset allows us to identify long (405-kyr) eccentricity, short (100-kyr) eccentricity, and precession (19-23-kyr) as the principal forcing mechanisms governing pacing of the background climate and carbon cycle during this time period, with a comparatively weak obliquity (41-kyr) signal. Cross-spectral analysis suggests that changes in climate lead the carbon cycle throughout most of the record, emphasising the role of the release of temperature-sensitive carbon stores as a positive feedback to an initial warming induced by changes in orbital configuration.
The expression of comparatively understudied Early Paleocene events, including the Dan-C2 Event, Latest Danian Event, and Danian/Selandian Transition Event, are also identified within this new record, confirming the global nature and orbital pacing of the Latest Danian Event and Danian/Selandian Transition Event, but questioning the Dan-C2 event as a global hyperthermal.
NASA Astrophysics Data System (ADS)
O'Loingsigh, T.; McTainsh, G. H.; Tews, E. K.; Strong, C. L.; Leys, J. F.; Shinkfield, P.; Tapper, N. J.
2014-03-01
Wind erosion of soils is a natural process that has shaped the semi-arid and arid landscapes for millennia. This paper describes the Dust Storm Index (DSI); a methodology for monitoring wind erosion using Australian Bureau of Meteorology (ABM) meteorological observational data since the mid-1960s (long-term), at continental scale. While the 46 year length of the DSI record is its greatest strength from a wind erosion monitoring perspective, there are a number of technical challenges to its use because when the World Meteorological Organisation (WMO) recording protocols were established the use of the data for wind erosion monitoring was never intended. Data recording and storage protocols are examined, including the effects of changes to the definition of how observers should interpret and record dust events. A method is described for selecting the 180 long-term ABM stations used in this study and the limitations of variable observation frequencies between stations are in part resolved. The rationale behind the DSI equation is explained and the examples of temporal and spatial data visualisation products presented include; a long term national wind erosion record (1965-2011), continental DSI maps, and maps of the erosion event types that are factored into the DSI equation. The DSI is tested against dust concentration data and found to provide an accurate representation of wind erosion activity. As the ABM observational records used here were collected according to WMO protocols, the DSI methodology could be used in all countries with WMO-compatible meteorological observation and recording systems.
NASA Astrophysics Data System (ADS)
Yoo, Chulsang; Park, Minkyu; Kim, Hyeon Jun; Choi, Juhee; Sin, Jiye; Jun, Changhyun
2015-01-01
In this study, documentary records on storm events in the Annals of the Choson Dynasty, covering the entire period of 519 years from 1392 to 1910, were analysed. By applying various key words related to storm events, a total of 556 documentary records could be identified. The main objective of this study was to develop rules of classification for the documentary records on storm events in the Annals of the Choson Dynasty. The results were also compared with the rainfall data of the traditional Korean rain gauge, named Chukwooki, which are available from 1777 to 1910 (about 130 years). The analysis is organized as follows. First, the frequency of the documents, their length, comments about the size of the inundated area, the number of casualties, the number of property losses, and the size of the countermeasures were considered to determine the magnitude of the events. To this end, rules of classification of the storm events were developed. Cases in which the word 'disaster' was used along with detailed information about the casualties and property damages were classified as high-level storm events. The high-level storm events were additionally sub-categorized into catastrophic, extreme, and severe events. Second, by applying the developed rules of classification, a total of 326 events were identified as high-level storm events during the 519 years of the Choson Dynasty. Among these high-level storm events, only 19 events were classified as catastrophic, 106 as extreme, and 201 as severe. The mean return period of these storm events was found to be about 30 years for the catastrophic events, 5 years for the extreme events, and 2-3 years for the severe events.
Third, the classification results were verified considering the records of the traditional Korean rain gauge; it was found that the catastrophic events are strongly distinguished from other events with a mean total rainfall and a storm duration equal to 439.8 mm and 49.3 h, respectively. The return period of these catastrophic events was also estimated to be in the range 100-500 years.
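The stated return periods follow directly from dividing the record length by the event counts, as a quick check:

```python
# event counts from the 519-year record (1392-1910) of the
# Annals of the Choson Dynasty, as classified in the study
record_years = 519
counts = {"catastrophic": 19, "extreme": 106, "severe": 201}

# mean return period = record length / number of events
return_periods = {k: record_years / n for k, n in counts.items()}
# catastrophic ~27 yr (reported as ~30), extreme ~4.9 yr (~5),
# severe ~2.6 yr (reported as 2-3)
```

The simple quotients reproduce the abstract's approximate return periods for all three classes.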
An iterative matching and locating technique for borehole microseismic monitoring
NASA Astrophysics Data System (ADS)
Chen, H.; Meng, X.; Niu, F.; Tang, Y.
2016-12-01
Microseismic monitoring has been proven to be an effective and valuable technology to image hydraulic fracture geometry. The success of hydraulic fracturing monitoring relies on the detection and characterization (i.e., location and focal mechanism estimation) of a maximum number of induced microseismic events. All the events are important to quantify the stimulated reservoir volume (SRV) and characterize the newly created fracture network. Detecting and locating low magnitude events, however, are notoriously difficult, particularly in a noisy production environment. Here we propose an iterative matching and locating technique (iMLT) to obtain a maximum detection of small events and the best determination of their locations from continuous data recorded by a single-azimuth downhole geophone array. Because the downhole array spans a single azimuth, regular M&L using P-wave cross-correlation alone cannot resolve the location of a matched event relative to the template event. We thus introduce the polarization direction into the matching, which significantly improves the lateral resolution of the M&L method, based on numerical simulations with synthetic data. Our synthetic tests further indicate that the inclusion of S-wave cross-correlation data can help better constrain the focal depth of the matched events. We apply this method to a dataset recorded during hydraulic fracturing treatment of a pilot horizontal well within the shale play in southwest China. Our approach yields a more than fourfold increase in the number of located events, compared with the original event catalog from traditional downhole processing.
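The template-matching core of M&L reduces to scanning a continuous trace with a normalized cross-correlation and flagging lags above a threshold. This single-channel sketch omits the polarization and S-wave constraints the authors add, and the waveforms are synthetic:

```python
import math

def normalized_xcorr(template, trace, lag):
    """Normalized cross-correlation of a template against the trace
    window starting at the given lag (1.0 = perfect scaled match)."""
    win = trace[lag:lag + len(template)]
    et = sum(t * t for t in template)
    ew = sum(w * w for w in win)
    if et == 0 or ew == 0:
        return 0.0
    return sum(t * w for t, w in zip(template, win)) / math.sqrt(et * ew)

def match_events(template, trace, threshold=0.8):
    """Scan the continuous trace and return lags where the template
    event matches above the detection threshold."""
    return [lag for lag in range(len(trace) - len(template) + 1)
            if normalized_xcorr(template, trace, lag) >= threshold]

template = [0.0, 1.0, -1.0, 0.5]                 # template event waveform
trace = [0.0] * 5 + [0.0, 2.0, -2.0, 1.0] + [0.0] * 5  # scaled copy at lag 5
lags = match_events(template, trace)              # -> [5]
```

Because the correlation is amplitude-normalized, the scaled copy of the template is detected with coefficient 1.0, which is how matched filtering recovers events too small for conventional triggering.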
Sensor readout detector circuit
Chu, Dahlon D.; Thelen, Jr., Donald C.
1998-01-01
A sensor readout detector circuit is disclosed that is capable of detecting sensor signals down to a few nanoamperes or less in a high (microampere) background noise level. The circuit operates at a very low standby power level and is triggerable by a sensor event signal that is above a predetermined threshold level. A plurality of sensor readout detector circuits can be formed on a substrate as an integrated circuit (IC). These circuits can operate to process data from an array of sensors in parallel, with only data from active sensors being processed for digitization and analysis. This allows the IC to operate at a low power level with a high data throughput for the active sensors. The circuit may be used with many different types of sensors, including photodetectors, capacitance sensors, chemically-sensitive sensors or combinations thereof to provide a capability for recording transient events or for recording data for a predetermined period of time following an event trigger. The sensor readout detector circuit has applications for portable or satellite-based sensor systems.
2011-06-27
Topics listed include generic hull testing, airbag and sensor technology development, blast data recorder specifications and fielding, and numerical model improvement; seat designs, airbag and restraint systems, and energy-absorbing flooring solutions; vehicle event data recorders for collecting highly accurate data; airbag or comparable technologies such as bolsters; and sensors that can detect and deploy/trigger interior treatments within the timeframe of a ...
Coulter, D M
2001-12-01
The purpose of this paper is to describe how the New Zealand (NZ) Intensive Medicines Monitoring Programme (IMMP) functions in relation to NZ privacy laws and to describe the attitudes of patients to drug safety monitoring and the privacy of their personal and health information. The IMMP undertakes prospective observational event monitoring cohort studies on new drugs. The cohorts are established from prescription data and the events are obtained using prescription event monitoring and spontaneous reporting. Personal details, prescribing history of the monitored drugs and adverse events data are stored in databases long term. The NZ Health Information Privacy Code is outlined and the monitoring of sumatriptan is used to illustrate how the IMMP functions in relation to the Code. Patient responses to the programme are described. Sumatriptan was monitored in 14,964 patients and 107,646 prescriptions were recorded. There were 2344 reports received describing 3987 adverse events. A majority of the patients were involved in the recording of events data either personally or by telephone interview. There were no objections to the monitoring process on privacy grounds. Given the fact that all reasonable precautions are taken to ensure privacy, patients perceive drug safety to have greater priority than any slight risk of breach of confidentiality concerning their personal details and health information.
NASA Astrophysics Data System (ADS)
Fovet, O.; Humbert, G.; Dupas, R.; Gascuel-Odoux, C.; Gruau, G.; Jaffrezic, A.; Thelusma, G.; Faucheux, M.; Gilliet, N.; Hamon, Y.; Grimaldi, C.
2018-04-01
The response of stream chemistry to storms is of major interest for understanding the export of dissolved and particulate species from catchments. The related challenge is to identify the hydrological flow paths that are active during these events and the sources of the chemical elements for which these events are hot moments of export. An original four-year data set that combines high-frequency records of stream flow, turbidity, nitrate and dissolved organic carbon concentrations, and piezometric levels was used to characterize storm responses in a headwater agricultural catchment. The data set was used to test to what extent the shallow groundwater affects the variability of storm responses. A total of 177 events were described using a set of quantitative and functional descriptors related to precipitation, stream and groundwater pre-event status and event dynamics, and to the relative dynamics between water quality parameters and flow via hysteresis indices. This approach identified, for each water quality parameter, different types of response whose occurrence can be quantified and related to the seasonal functioning of the catchment. This study demonstrates that high-frequency records of water quality are unique in their ability to reveal the variability of catchment storm responses.
NASA Astrophysics Data System (ADS)
Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo
2017-06-01
The surface-related multiple elimination (SRME) method is based on feedback formulation and has become one of the most widely used multiple suppression methods. However, differences are apparent between the predicted multiples and those in the source seismic records, which can leave conventional adaptive multiple subtraction methods barely able to suppress multiples effectively in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and short-time-window FK filtering. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on the optimized event tracing method and the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples, with the advantage of minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.
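In its simplest form, adaptive subtraction is a least-squares matching filter. The single-coefficient sketch below illustrates the idea on synthetic traces; the paper's extended Wiener filtering is multi-channel and windowed, so this is only the core principle, not the authors' method:

```python
def adaptive_subtract(record, predicted_multiple):
    """Least-squares scalar matching filter: find the amplitude a that
    minimizes ||record - a * predicted||^2, then subtract the matched
    multiple to estimate the primary."""
    num = sum(r * m for r, m in zip(record, predicted_multiple))
    den = sum(m * m for m in predicted_multiple)
    a = num / den if den else 0.0
    return [r - a * m for r, m in zip(record, predicted_multiple)]

primary = [1.0, 0.0, 0.5, 0.0]            # synthetic primary events
multiple = [0.0, 2.0, 0.0, 1.0]           # SRME-style predicted multiple
record = [p + 0.8 * m for p, m in zip(primary, multiple)]  # observed trace
cleaned = adaptive_subtract(record, multiple)  # recovers the primary
```

Because the predicted multiple here differs from the recorded one only by a scale factor, a single coefficient suffices; the amplitude, phase, and timing mismatches described in the abstract are what force the longer, windowed Wiener filters used in practice.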
NASA Astrophysics Data System (ADS)
Thorndycraft, V. R.; Benito, G.; Barriendos, M.; Rico, M.; Sánchez-Moya, Y.; Sopeña, A.; Casas, A.
2009-09-01
Palaeoflood hydrology is the reconstruction of flood magnitude and frequency using geomorphological flood evidence and is particularly valuable for extending the record of extreme floods prior to the availability of instrumental data series. This paper will provide a review of recent developments in palaeoflood hydrology and will be presented in three parts: 1) an overview of the key methodological approaches used in palaeoflood hydrology and the use of historical documentary evidence for reconstructing extreme events; 2) a summary of the Llobregat River palaeoflood case study (Catalonia, NE Spain); and 3) analysis of the AD 1617 flood and its impacts across Catalonia (including the rivers Llobregat, Ter and Segre). The key findings of the Llobregat case study were that at least eight floods occurred with discharges significantly larger than events recorded in the instrumental record, for example at the Pont de Vilomara study reach the palaeodischarges of these events were 3700-4300 m3/s compared to the 1971 flood, the largest on record, of 2300 m3/s. Five of these floods were dated to the last 3000 years and the three events directly dated by radiocarbon all occurred during cold phases of global climate. Comparison of the palaeoflood record with documentary evidence indicated that one flood, radiocarbon dated to cal. AD 1540-1670, was likely to be the AD 1617 event, the largest flood of the last 700 years. Historical records indicate that this event was caused by rainfall occurring from the 2nd to 6th November and the resultant flooding caused widespread socio-economic impacts including the destruction of at least 389 houses, 22 bridges and 17 water mills. 
Discharges estimated from palaeoflood records and historical flood marks indicate that the Llobregat (4680 m3/s) and Ter (2700-4500 m3/s) rivers witnessed extreme discharges in comparison to observed floods in the instrumental record (2300 and 2350 m3/s, respectively); whilst further east in the Segre River there was no geomorphic evidence of any flooding of greater magnitude than 2000 m3/s, or the 1982 event.
Watemberg, Nathan; Tziperman, Barak; Dabby, Ron; Hasan, Mariana; Zehavi, Liora; Lerman-Sagie, Tally
2005-05-01
To report on the usefulness of adding video recording to routine EEG studies of infants and children with frequent paroxysmal events. We analyzed the efficacy of this diagnostic means during a 4-year period. The decision whether to add video recording was made by the pediatric EEG interpreter at the time of the study. Studies were planned to last between 20 and 30 min and, if needed, were extended by the EEG interpreter. For most studies, video recording was added from the beginning of EEG recording. In a minority of cases, the addition of video was implemented during the first part of the EEG test, as clinical events became obvious. In these cases, a new study (file) was begun. The success rate was analyzed according to the indications for the EEG study: paroxysmal eye movements, tremor, suspected seizures, myoclonus, staring episodes, suspected stereotypias and tics, absence epilepsy follow-up, cyanotic episodes, and suspected psychogenic nonepileptic events. Video recording was added to 137 of 666 routine studies. Mean patient age was 4.8 years. The nature of the event was determined in 61 (45%) of the EEG studies. Twenty-eight percent were hospitalized patients. The average study duration was 26 min. This diagnostic means was particularly useful for paroxysmal eye movements, staring spells, myoclonic jerks, stereotypias, and psychogenic nonepileptic events. Cognitive data were available for 116 patients, of whom about 46% (53 patients) were cognitively impaired. EEG with added video recording was successfully performed in all 116 of these cases and provided useful information in 29 (55%) of the 53 cognitively impaired patients. Adding video recording to routine EEG was helpful in 45% of cases referred for frequent paroxysmal events. This technique proved useful for hospitalized children as well as for outpatients. Moreover, it was successfully applied in cognitively impaired patients.
Infants and children with paroxysmal eye movements, staring spells, myoclonic jerks, stereotypias, and pseudoseizures especially benefited from this diagnostic means. Because of its low cost and the little discomfort imposed on the patient and his or her family, this technique should be considered as a first diagnostic step in children with frequent paroxysmal events.
Su, Kyung-Min; Hairston, W David; Robbins, Kay
2018-01-01
In controlled laboratory EEG experiments, researchers carefully mark events and analyze subject responses time-locked to these events. Unfortunately, such markers may not be available or may come with poor timing resolution for experiments conducted in less-controlled naturalistic environments. We present an integrated event-identification method for identifying particular responses that occur in unlabeled continuously recorded EEG signals based on information from recordings of other subjects potentially performing related tasks. We introduce the idea of timing slack and timing-tolerant performance measures to deal with jitter inherent in such non-time-locked systems. We have developed an implementation available as an open-source MATLAB toolbox (http://github.com/VisLab/EEG-Annotate) and have made test data available in a separate data note. We applied the method to identify visual presentation events (both target and non-target) in data from an unlabeled subject using labeled data from other subjects with good sensitivity and specificity. The method also identified actual visual presentation events in the data that were not previously marked in the experiment. Although the method uses traditional classifiers for initial stages, the problem of identifying events based on the presence of stereotypical EEG responses is the converse of the traditional stimulus-response paradigm and has not been addressed in its current form. In addition to identifying potential events in unlabeled or incompletely labeled EEG, these methods also allow researchers to investigate whether particular stereotypical neural responses are present in other circumstances. Timing tolerance has the added benefit of accommodating inter- and intra-subject timing variations. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
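The timing-slack idea described above can be sketched as a scoring routine: a predicted event counts as a hit if it lies within a tolerance window of a true event. This is an illustrative reconstruction, not code from the EEG-Annotate toolbox; the function name and the greedy one-to-one matching rule are assumptions.

```python
# Minimal sketch of a timing-tolerant scorer: a predicted event counts as a
# hit if it falls within +/- `slack` samples of a true (unlabeled) event.
# Greedy closest-match pairing is an illustrative choice, not the toolbox's.

def timing_tolerant_score(true_times, predicted_times, slack):
    """Return (hits, misses, false_alarms) under a symmetric timing slack."""
    unmatched = sorted(true_times)
    hits = 0
    for p in sorted(predicted_times):
        # find the closest still-unmatched true event within the slack window
        best = None
        for t in unmatched:
            if abs(t - p) <= slack and (best is None or abs(t - p) < abs(best - p)):
                best = t
        if best is not None:
            unmatched.remove(best)
            hits += 1
    false_alarms = len(predicted_times) - hits
    misses = len(true_times) - hits
    return hits, misses, false_alarms

print(timing_tolerant_score([100, 300, 500], [105, 290, 700], slack=20))
# -> (2, 1, 1): jitter of 5 and 10 samples is tolerated; the event at 500 is missed
```

Sweeping the slack parameter and re-scoring is one way to produce the timing-tolerant performance curves the abstract alludes to.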
Stochastic generation of hourly rainstorm events in Johor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nojumuddin, Nur Syereena; Yusof, Fadhilah; Yusop, Zulkifli
2015-02-03
Engineers and researchers in water-related studies often face the problem of insufficiently long rainfall records. Practical and effective methods must be developed to generate unavailable data from the limited data available. This paper therefore presents a Monte-Carlo-based stochastic hourly rainfall generation model to complement the unavailable data. The Monte Carlo simulation used in this study is based on the best fit of storm characteristics. Using Maximum Likelihood Estimation (MLE) and the Anderson-Darling goodness-of-fit test, the lognormal distribution appeared to fit the rainfall best, so the Monte Carlo simulation was based on the lognormal distribution. The proposed model was verified by comparing the statistical moments of rainstorm characteristics from the combination of observed rainstorm events (10 years) and simulated rainstorm events (30 years) with those from the entire 40 years of observed hourly rainfall data at station J1 in Johor over the period 1972-2011. The absolute percentage errors of duration-depth, duration-inter-event time and depth-inter-event time were used as the accuracy test. The results showed that the first four product-moments of the observed rainstorm characteristics were close to those of the simulated rainstorm characteristics. The proposed model can be used as a basis to derive rainfall intensity-duration-frequency relationships in Johor.
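The MLE-plus-Monte-Carlo step described above can be sketched in a few lines: for a lognormal with location zero, the maximum-likelihood fit reduces to fitting a normal to the log-depths. The data and variable names are invented stand-ins, not the Johor station J1 record, and the Anderson-Darling distribution-selection step is omitted.

```python
# Sketch of lognormal MLE fitting followed by Monte Carlo generation of
# synthetic rainstorm depths, verified by comparing product-moments.
import numpy as np

rng = np.random.default_rng(42)
observed = rng.lognormal(mean=2.0, sigma=0.6, size=200)  # stand-in for observed depths

# MLE for a lognormal (location 0): fit a normal to the log-depths
mu = np.log(observed).mean()
sigma = np.log(observed).std()  # ddof=0 is the MLE

# Monte Carlo: draw synthetic rainstorm depths from the fitted distribution
simulated = rng.lognormal(mean=mu, sigma=sigma, size=10_000)

# Verification mirrors the paper's moment comparison (here: first two moments)
for k in (1, 2):
    obs_m, sim_m = (observed**k).mean(), (simulated**k).mean()
    print(f"moment {k}: observed {obs_m:.1f}, simulated {sim_m:.1f}")
```

The same pattern extends to the other rainstorm characteristics (duration, inter-event time) once each has been assigned a best-fit distribution.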
Rainfall continuous time stochastic simulation for a wet climate in the Cantabric Coast
NASA Astrophysics Data System (ADS)
Rebole, Juan P.; Lopez, Jose J.; Garcia-Guzman, Adela
2010-05-01
Rain is the result of a series of complex atmospheric processes influenced by numerous factors. This complexity makes its simulation from a physical basis practically unfeasible, favouring the use of stochastic schemes. These schemes, which are based on observed characteristics (Todorovic and Woolhiser, 1975), allow the introduction of alternating renewal processes that account for the occurrence of rainfall at different time lapses (Markov chains are a particular case, in which the lapses can be described by exponential distributions). Thus, a sequential rainfall process can be defined as a temporal series in which rainfall events (periods in which rainfall is recorded) alternate with non-rain events (periods in which no rainfall is recorded). The variables of a temporal rain sequence have been characterized (duration of the rainfall event, duration of the non-rainfall event, average intensity of the rain in the rain event, and temporal distribution of the amount of rain in the rain event) in a wet climate such as that of the coastal area of Guipúzcoa. The study was performed on two series recorded at the meteorological stations of Igueldo-San Sebastián and Fuenterrabia/Airport (data every ten minutes and their hourly aggregation). As a result of this work, the variables satisfactorily fitted the following distribution functions: the duration of the rain event, an exponential function; the duration of the dry event, a truncated mixed exponential distribution; the average intensity, a Weibull distribution; and the distribution of the rain fallen, a Beta distribution. The characterization was made for an hourly aggregation of the recorded ten-minute interval. The parameters of the fitting functions were better estimated by the maximum likelihood method than by the method of moments.
The parameters obtained from the characterization were used to develop a stochastic rainfall simulation model by means of a three-state Markov chain (Hutchinson, 1990), performed on an hourly basis by García-Guzmán (1993) and Castro et al. (1997, 2005). Simulation results were valid in the hourly case for all four described variables, with all variables slightly better simulated in Fuenterrabia than in Igueldo. The Fuenterrabia data series is shorter and has longer sequences without missing data than the Igueldo series, which shows a higher number of missing-data gaps, although their mean duration is longer in Fuenterrabia.
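The three-state Markov chain mentioned above can be sketched as an hourly occurrence simulator. The state set (dry / light / heavy) and the transition probabilities below are invented for illustration; they are not fitted values from García-Guzmán (1993) or Castro et al.

```python
# Illustrative three-state Markov chain for hourly rainfall occurrence,
# in the spirit of the Hutchinson (1990) scheme cited above.
import random

STATES = ("dry", "light", "heavy")
# rows: current state; entries: P(next state) in STATES order, each row sums to 1
P = {
    "dry":   (0.90, 0.08, 0.02),
    "light": (0.50, 0.40, 0.10),
    "heavy": (0.20, 0.45, 0.35),
}

def simulate_hours(n, start="dry", seed=1):
    """Simulate n hourly occurrence states from the chain."""
    rng = random.Random(seed)
    state, out = start, []
    for _ in range(n):
        state = rng.choices(STATES, weights=P[state])[0]
        out.append(state)
    return out

seq = simulate_hours(24 * 365)
print(f"wet fraction: {sum(s != 'dry' for s in seq) / len(seq):.2f}")
```

In a full generator, each wet hour would then be assigned an intensity drawn from the fitted Weibull and Beta distributions described in the abstract.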
A 2000-year palaeoflood record from northwest England from lake sediments
NASA Astrophysics Data System (ADS)
Schillereff, Daniel; Chiverrell, Richard; Macdonald, Neil; Hooke, Janet
2014-05-01
Greater insight into the relationship between climatic fluctuations and the frequency and magnitude of precipitation events over recent centuries is crucial in the context of future warming and projected intensification of hydrological extremes. However, the detection of trends in flood frequency and intensity is not a straightforward task, as conventional flood series derived from instrumental sources rarely span sufficiently long timescales to capture the most extreme events. Usefully, the geomorphic effects of extreme hydrological events can be effectively recorded in upland lake basins, where efficient sediment trapping preserves discharge-related proxy indicators (e.g., particle size). Provided distinct sedimentary signatures of historic floods are discernible and the sediment sequence can be well constrained in time, these lacustrine archives offer a valuable data resource. We demonstrate that a series of sediment cores (3-5 m length) from Brotherswater, northwest England, contain numerous coarse-grained laminations, discerned by applying high-resolution (0.5 cm) laser granulometry, which are interpreted as a palaeoflood record extending to ~2000 yr BP. The presence of thick facies that exhibit inverse grading underlying normal grading, most likely reflecting the waxing and waning of flood-induced hyperpycnal flows, supports our palaeoflood interpretation. Data from an ongoing sediment-trapping protocol at Brotherswater, which show a relationship between river discharge (recorded via short-term lake-level change representing flood events) and the calibre of particles captured in the traps, lend further support to our interpretation. Well-constrained chronologies were constructed for the cores by integrating radionuclide (210Pb, 137Cs, 241Am, 14C) dating within a Bayesian age-depth modelling protocol.
Geochemical markers of known age that reflect phases of local point-source lead (Pb) mining were used to resolve time periods where radiocarbon dates returned multiple possible age solutions. We subsequently built a regression model using the time window where recorded river discharge and the sedimentary record overlap (1961-2013) in order to reconstruct discharge estimates for the palaeoflood laminations. These quantitative palaeoflood data can thus be inserted into statistical flood frequency analyses and compared with outputs using instrumental data and regional flood information.
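The calibration step described above amounts to a simple regression over the instrumental overlap window, then inverting it for older laminations. A minimal sketch, with invented grain-size/discharge pairs rather than the Brotherswater data:

```python
# Calibrate peak discharge against flood-layer grain size over the overlap
# window, then apply the fit to a palaeoflood lamination. All values invented.
import numpy as np

# calibration pairs from the instrumental overlap:
# grain size (microns) vs recorded peak discharge (m3/s)
grain = np.array([35.0, 42.0, 50.0, 58.0, 66.0, 75.0])
discharge = np.array([18.0, 24.0, 31.0, 39.0, 46.0, 55.0])

slope, intercept = np.polyfit(grain, discharge, 1)

def estimate_discharge(grain_size_um):
    """Discharge estimate for a palaeoflood lamination of given grain size."""
    return slope * grain_size_um + intercept

print(f"Q(60 um) ~ {estimate_discharge(60.0):.0f} m3/s")
```

In practice one would also propagate the regression uncertainty into the flood-frequency analysis rather than use point estimates alone.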
NASA Astrophysics Data System (ADS)
Gutierrez-Pastor, J.; Nelson, C. H.; Goldfinger, C.; Johnson, J.
2005-05-01
Marine turbidite stratigraphy, onshore paleoseismic records of tsunami sand beds and co-seismic subsidence (Atwater and Hemphill-Haley, 1997; Kelsey et al., 2002; Witter et al., 2003) and tsunami sands of Japan (Satake et al., 1996) all show evidence for great earthquakes (M ~ 9) on the Cascadia Subduction Zone. When a great earthquake shakes 1000 kilometers of the Cascadia margin, sediment failures occur in all tributary canyons and the resulting turbidity currents travel down the canyon systems and deposit synchronous turbidites in abyssal seafloor channels. These turbidite records provide a deepwater paleoseismic record of great earthquakes. An onshore paleoseismic record develops from rapid coseismic subsidence, resulting in buried marshes and drowned forests, and subsequent tsunami sand layer deposition. The Cascadia Basin provides the longest paleoseismic record of great earthquakes presently available for a subduction zone. A total of 17 synchronous turbidites have been deposited along ~700 km of the Cascadia margin during the Holocene time of ~10,000 cal yr. Because the youngest paleoseismic event in all turbidite and onshore records is ~300 years old (the AD 1700 earthquake), the average recurrence interval of great earthquakes is ~600 yr. At least 6 smaller events have also ruptured shorter margin segments. Linkage of the rupture length of these events comes from relative dating tools such as the "confluence test" of Adams (1990), radiocarbon ages of onshore and offshore events, and physical property correlation of individual event "signatures". We use both 14C ages and analysis of the hemipelagic sediment thickness between turbidites (H), where H/sedimentation rate = time between turbidite events, to develop two recurrence histories. Utilizing the most reliable 14C and hemipelagic data sets from turbidites for the past ~5000 yr, the minimum recurrence time is ~300 yr and the maximum is ~1300 yr.
There is also a recurrence pattern through the entire Holocene, apparent in the land records as well, that consists of a long time interval followed by 2 to 5 short intervals. This pattern has repeated five times in the Holocene. Both onshore paleoseismic records and turbidite synchroneity over hundreds of kilometers suggest that the Holocene turbidite record of the Cascadia Subduction Zone is caused dominantly by triggering from great earthquakes similar in rupture length to the Sumatra 2004 earthquake. The recent Sumatra subduction zone great earthquake of 2004 and the 1700 AD Cascadia tsunami sand of 3 m height preserved in Japan (Satake et al., 1996) show that ocean-basin-wide tsunami deposits result from these great earthquakes, which rupture the seafloor for hundreds of kilometers. Cascadia and Sumatra share many geological and physiographic similarities that favor the deposition of turbidites from great earthquakes and tend to filter non-earthquake turbidites from the record. Thus the paleoseismic methods developed in Cascadia could be applied to the Sumatran Subduction Zone, and we expect that the turbidite record would yield a similar record ~10,000 yr in length. In Sumatra, the dearth of such records led to a lack of widespread recognition of the hazard, particularly in the northern Sumatra and Andaman-Nicobar region, where geodetic data suggested weak plate locking. Evidence from satellite imagery of a tsunami similar to the 2004 event suggests the previous such event was in the relatively recent past.
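The hemipelagic-thickness arithmetic quoted above (H / sedimentation rate = time between turbidite events) is simple enough to state directly; the thickness and rate values below are illustrative, not measurements from the cores.

```python
# Inter-event time from hemipelagic thickness, per the relation quoted in the
# abstract: H / sedimentation rate = time between turbidite events.

def interevent_time(hemipelagic_cm, sed_rate_cm_per_kyr):
    """Years between successive turbidites from hemipelagic thickness."""
    return hemipelagic_cm / sed_rate_cm_per_kyr * 1000.0

# e.g. 6 cm of hemipelagic mud at 10 cm/kyr -> 600 yr, matching the average
# Cascadia recurrence interval quoted in the abstract
print(interevent_time(6.0, 10.0))  # -> 600.0
```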
The 03 April 2017 Botswana M6.5 earthquake: Preliminary results
NASA Astrophysics Data System (ADS)
Midzi, Vunganai; Saunders, I.; Manzunzu, B.; Kwadiba, M. T.; Jele, V.; Mantsha, R.; Marimira, K. T.; Mulabisana, T. F.; Ntibinyane, O.; Pule, T.; Rathod, G. W.; Sitali, M.; Tabane, L.; van Aswegen, G.; Zulu, B. S.
2018-07-01
An earthquake of magnitude Mw 6.5 occurred on the evening of 3 April 2017 in Central Botswana, southern Africa. The event was well recorded by the regional seismic networks. The location by the Council for Geoscience (CGS) placed it near the Central Kgalagadi Game Reserve. Its effects were felt widely in southern Africa and were pronounced for residents of Gauteng and the North West Province in South Africa. In response to this event, the CGS, together with the Botswana Geoscience Institute (BGI), embarked on two scientific projects. The first project involved a macroseismic survey to study the extent and nature of the effects of the event in southern Africa. This involved CGS and BGI scientists soliciting information from members of the public through questionnaire surveys. More information was collected through questionnaires submitted online by the public. In total, 181 questionnaires were obtained through interviews and 151 online from South Africa, Zimbabwe and Namibia through collaboration between the CGS, the Meteorological Services Department of Zimbabwe and the Geological Survey of Namibia. All collected data were analysed to produce 79 intensity data points (IDPs) located all over the region, with maximum intensity values of VI (according to the Modified Mercalli Intensity scale) observed near the epicentre. This is quite a low value of maximum intensity for such a large event, but was expected given that the epicentral region is in a national park which is sparsely populated. The second scientific project involved the rapid installation of a temporary network of six seismograph stations in and around the location of the main event with the purpose of detecting and recording its aftershocks over a period of three months. Data recorded in the first month of April 2017 were collected and delivered to both the CGS and BGI for processing. More than 500 aftershock events of magnitude ML ≥ 0.8 were recorded and analysed for this period. 
All the events were located at the eastern edge of the Central Kgalagadi Park near the location of the main event in two clear clusters. The observed clusters imply that a segmented fault is the source of these earthquakes and is oriented in a NW-SE direction, similar to the direction inferred from the fault plane solution of the main event.
NASA Astrophysics Data System (ADS)
Klimasewski, A.; Sahakian, V. J.; Baltay, A.; Boatwright, J.; Fletcher, J. B.; Baker, L. M.
2017-12-01
A large source of epistemic uncertainty in Ground Motion Prediction Equations (GMPEs) derives from the path term, currently represented as a simple geometric spreading and intrinsic attenuation term. Including additional physical relationships between path properties and predicted ground motions would produce more accurate and precise, region-specific GMPEs by reclassifying some of the random, aleatory uncertainty as epistemic. This study focuses on regions of Southern California, using data from the Anza network and the Southern California Seismic Network to create a catalog of events of magnitude 2.5 and larger from 1998 to 2016. The catalog encompasses regions of varying geology and therefore varying path and site attenuation. Within this catalog of events, we investigate several collections of event-to-station pairs, each of which shares similar origin locations and stations so that all events have similar paths. Compared with a simple regional GMPE, these paths consistently have high or low residuals. By working with events that have the same path, we can isolate source and site effects and treat the remaining residual as path effects. We decompose the recordings into source and site spectra for each unique event and site in our greater Southern California regional database using the inversion method of Andrews (1986). This model represents each natural-log record spectrum as the sum of its natural-log event and site spectra, while constraining each record to a reference site or Brune source spectrum. We estimate a regional, path-specific anelastic attenuation (Q) and site attenuation (t*) from the inversion site spectra, and corner frequency from the inversion event spectra. We then compute the residuals between the observed record data and the inversion model prediction (event spectrum times site spectrum). This residual is representative of path effects, likely anelastic attenuation along the path that varies from the regional median attenuation.
We examine the residuals for our different sets independently to see how path terms differ between event-to-station collections. The path-specific information gained from this can inform development of terms for regional GMPEs, through understanding of these seismological phenomena.
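The Andrews (1986)-style decomposition described above, in which each log record spectrum is modeled as a log event term plus a log site term subject to a reference-site constraint, can be sketched at a single frequency as a least-squares problem. The synthetic numbers below are purely illustrative, not the authors' data or code.

```python
# Single-frequency sketch of an Andrews-type spectral decomposition:
# d[i, j] = event_term[i] + site_term[j], with the reference site's term
# pinned to zero via a heavily weighted constraint row.
import numpy as np

n_events, n_sites = 4, 3
rng = np.random.default_rng(0)
true_event = rng.normal(0.0, 1.0, n_events)
true_site = np.array([0.0, 0.4, -0.7])           # site 0 is the reference
d = true_event[:, None] + true_site[None, :]     # noise-free log record spectra

# design matrix: one row per record; unknowns = event terms then site terms
rows, rhs = [], []
for i in range(n_events):
    for j in range(n_sites):
        row = np.zeros(n_events + n_sites)
        row[i] = 1.0
        row[n_events + j] = 1.0
        rows.append(row)
        rhs.append(d[i, j])
# constraint row: site term of the reference site is zero
c = np.zeros(n_events + n_sites)
c[n_events] = 1.0
rows.append(c * 1e3)          # heavy weight enforces the constraint
rhs.append(0.0)

m, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
print(np.allclose(m[n_events:], true_site, atol=1e-4))  # -> True
```

The constraint is what removes the trade-off between event and site terms (a constant can otherwise be shifted freely between them); in the study, the constraint is instead a reference site or a Brune source spectrum.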
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohay, Alan C.; Clayton, Ray E.; Sweeney, Mark D.
The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. During FY 2010, the Hanford Seismic Network recorded 873 triggers on the seismometer system, which included 259 seismic events in the southeast Washington area and an additional 324 regional and teleseismic events. There were 210 events determined to be local earthquakes relevant to the Hanford Site. One hundred and fifty-five earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this fiscal year were a continuation of the swarm events observed during fiscal year 2009 and reported in previous quarterly and annual reports (Rohay et al. 2009a, 2009b, 2009c, 2010a, 2010b, and 2010c). Most events were considered minor (coda-length magnitude [Mc] less than 1.0), with the largest event recorded on February 4, 2010 (3.0 Mc). The estimated depths of the Wooded Island events are shallow (averaging approximately 1.5 km deep), placing the swarm within the Columbia River Basalt Group.
Based upon the data from the last two quarters (Q3 and Q4), activity at the Wooded Island swarm area has largely subsided. Pacific Northwest National Laboratory will continue to monitor for activity at this location. The highest-magnitude events (3.0 Mc) were recorded on February 4, 2010 within the Wooded Island swarm (depth 2.4 km) and on May 8, 2010 on or near the Saddle Mountain anticline (depth 3.0 km). This latter event is not considered unusual in that earthquakes have been previously recorded at this location, for example, in October 2006 (Rohay et al. 2007). With regard to the depth distribution, 173 earthquakes were located at shallow depths (less than 4 km, most likely in the Columbia River basalts), 18 earthquakes were located at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and 19 earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, 178 earthquakes were located in known swarm areas, 4 earthquakes occurred on or near a geologic structure (the Saddle Mountain anticline), and 28 earthquakes were classified as random events. The Hanford Strong Motion Accelerometer (SMA) network was triggered several times by the Wooded Island swarm events and the events located on or near the Saddle Mountain anticline. The maximum acceleration value recorded by the SMA network during fiscal year 2010 occurred on February 4, 2010 (Wooded Island swarm event) and was approximately half the reportable action level for Hanford facilities (2% g), so no action was required.
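The depth bins used in the report map directly onto the Hanford stratigraphy; as a small helper (the function name and return strings are ours, the bin edges and geologic units are from the report):

```python
# Depth classification used in the Hanford report: shallow events sit in the
# Columbia River basalts, intermediate in the pre-basalt sediments, deep in
# the crystalline basement.

def depth_class(depth_km):
    if depth_km < 4:
        return "shallow (Columbia River basalts)"
    elif depth_km <= 9:
        return "intermediate (pre-basalt sediments)"
    return "deep (crystalline basement)"

print(depth_class(1.5))    # average Wooded Island swarm depth -> shallow
print(depth_class(10.2))   # -> deep (crystalline basement)
```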
Use of court records for supplementing occupational disease surveillance.
Schwartz, E; Landrigan, P
1987-01-01
To conduct surveillance of occupationally related health events, the New Hampshire Division of Public Health Services analyzes death certificates and workers' compensation claims. In an effort to bolster these limited data sources, a previously unrecognized dataset comprising court records was explored. Court records obtained from the Federal District Court proved to be a readily accessible and detailed source of information for identifying suspected cases of asbestos-related disease and potential sources of asbestos exposure. PMID:2959164
Thorn, Joanna C; Turner, Emma L; Hounsome, Luke; Walsh, Eleanor; Down, Liz; Verne, Julia; Donovan, Jenny L; Neal, David E; Hamdy, Freddie C; Martin, Richard M; Noble, Sian M
2016-04-29
To evaluate the accuracy of routine data for costing inpatient resource use in a large clinical trial and to investigate costing methodologies. Final-year inpatient cost profiles were derived using (1) data extracted from medical records mapped to the National Health Service (NHS) reference costs via service codes and (2) Hospital Episode Statistics (HES) data using NHS reference costs. Trust finance departments were consulted to obtain costs for comparison purposes. 7 UK secondary care centres. A subsample of 292 men identified as having died at least a year after being diagnosed with prostate cancer in Cluster randomised triAl of PSA testing for Prostate cancer (CAP), a long-running trial to evaluate the effectiveness and cost-effectiveness of prostate-specific antigen (PSA) testing. Both inpatient cost profiles showed a rise in costs in the months leading up to death, and were broadly similar. The difference in mean inpatient costs was £899, with HES data yielding ∼8% lower costs than medical record data (differences compatible with chance, p=0.3). Events were missing from both data sets. 11 men (3.8%) had events identified in HES that were all missing from medical record review, while 7 men (2.4%) had events identified in medical record review that were all missing from HES. The response from finance departments to requests for cost data was poor: only 3 of 7 departments returned adequate data sets within 6 months. Using HES routine data coupled with NHS reference costs resulted in mean annual inpatient costs that were very similar to those derived via medical record review; therefore, routinely available data can be used as the primary method of costing resource use in large clinical trials. Neither HES nor medical record review represent gold standards of data collection. Requesting cost data from finance departments is impractical for large clinical trials. ISRCTN92187251; Pre-results. Published by the BMJ Publishing Group Limited. 
NASA Astrophysics Data System (ADS)
Hara, T.
2012-12-01
Hara (2007, EPS, 59, 227-231) developed a method to determine earthquake magnitudes using durations of high-frequency energy radiation and displacement amplitudes of tele-seismic events, and showed that it was applicable to huge events such as the 2004 Sumatra earthquake (Mw 9.0 according to the Global CMT catalog; the moment magnitudes quoted below are from that catalog). Since Hara (2007) developed this method, we have been applying it to large shallow events and have confirmed its effectiveness. The results for several events are available at the web site of our institute (http://iisee.kenken.go.jp/quakes.htm). Also, Hara (2011, EPS, 63, 525-528) applied this method to the 2011 Off the Pacific Coast of Tohoku Earthquake (Mw 9.1) and showed that it worked well. In these applications, we used only waveform data recorded in the tele-seismic distance range (30-85 degrees). In order to obtain a magnitude estimate faster, it is necessary to analyze regional-distance data. In this study, we applied the method of Hara (2007) to waveform data recorded in the regional distance range (8-30 degrees) to investigate its applicability. We slightly modified the method by changing the durations of the time series used for analysis, considering the arrivals of high-amplitude Rayleigh waves. We selected the six recent huge earthquakes (moment magnitude equal to or greater than 8.5): the December 26, 2004 Sumatra (Mw 9.0), the March 28, 2005 Northern Sumatra (Mw 8.6), the September 12, 2007 Southern Sumatra (Mw 8.5), the February 27, 2010 Chile (Mw 8.8), the March 11, 2011 off the Pacific Coast of Tohoku (Mw 9.1), and the April 11, 2012 off West Coast of Northern Sumatra (Mw 8.6) earthquakes. We retrieved BHZ channel waveform data from the IRIS DMC. For the 2004 Sumatra and 2010 Chile earthquakes, only a few waveform records were available. The estimated magnitudes are 9.16, 8.66, 8.53, 8.83, 9.15, and 8.70, respectively.
Also, the estimated high frequency energy radiation durations are consistent with the centroid time shifts of the Global CMT catalog. These preliminary results suggest that the method of Hara (2007) is applicable to waveform data recorded in the regional distance range. We plan to apply this method to smaller events to investigate a possible systematic deviation from analyses of tele-seismic records.
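The abstract does not reproduce the Hara (2007) formula itself, so the sketch below only illustrates its general form: a linear model in log duration and log displacement amplitude. The calibration events and the fitted coefficients are invented for illustration, not the published values.

```python
# Generic magnitude model of the form Mw = a*log10(duration) +
# b*log10(amplitude) + c, fitted by least squares to an invented
# calibration set (NOT the Hara 2007 coefficients).
import numpy as np

# invented calibration set: (duration s, peak displacement m, catalog Mw)
dur = np.array([60.0, 120.0, 300.0, 500.0])
amp = np.array([1e-4, 4e-4, 2e-3, 6e-3])
mw = np.array([7.0, 7.6, 8.3, 8.8])

A = np.column_stack([np.log10(dur), np.log10(amp), np.ones_like(dur)])
coef, *_ = np.linalg.lstsq(A, mw, rcond=None)

def estimate_mw(duration_s, amplitude_m):
    """Magnitude estimate from energy-radiation duration and amplitude."""
    return coef @ [np.log10(duration_s), np.log10(amplitude_m), 1.0]

print(round(estimate_mw(300.0, 2e-3), 1))
```

For the published coefficients and the exact measurement definitions (how duration and amplitude are picked from the records), the Hara (2007) paper itself is the reference.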
Casperson, Shanon L; Sieling, Jared; Moon, Jon; Johnson, LuAnn; Roemmich, James N; Whigham, Leah
2015-03-13
Mobile technologies are emerging as valuable tools to collect and assess dietary intake. Adolescents readily accept and adopt new technologies; thus, a food record app (FRapp) may be a useful tool to better understand adolescents' dietary intake and eating patterns. We sought to determine the amenability of adolescents, in a free-living environment with minimal parental input, to use the FRapp to record their dietary intake. Eighteen community-dwelling adolescents (11-14 years) received detailed instructions to record their dietary intake for 3-7 days using the FRapp. Participants were instructed to capture before and after images of all foods and beverages consumed and to include a fiducial marker in the image. Participants were also asked to provide text descriptors including the amount and type of all foods and beverages consumed. Eight of 18 participants were able to follow all instructions: they included pre- and post-meal images, a fiducial marker, and a text descriptor, and collected diet records on 2 weekdays and 1 weekend day. Dietary intake was recorded for an average of 3.2 days (SD 1.3; 68% weekdays and 32% weekend days), with an average of 2.2 (SD 1.1) eating events per day per participant. A total of 143 eating events were recorded, of which 109 had at least one associated image and 34 were recorded with text only. Of the 109 eating events with images, 66 included all foods, beverages and a fiducial marker, and 44 included both a pre- and post-meal image. Text was included with 78 of the captured images. Of the meals recorded, 36, 33, 35, and 39 were breakfasts, lunches, dinners, and snacks, respectively. These data suggest that mobile devices equipped with an app to record dietary intake will be used by adolescents in a free-living environment; however, a minority of participants followed all directions. User-friendly mobile food record apps may increase participant amenability, improving our understanding of adolescent dietary intake and eating patterns.
To improve data collection, the FRapp should deliver prompts for tasks, such as capturing images before and after each eating event, including the fiducial marker in the image, providing complete and accurate text information, and ensuring all eating events are recorded and should be customizable to individuals and to different situations. Clinicaltrials.gov NCT01803997. http://clinicaltrials.gov/ct2/show/NCT01803997 (Archived at: http://www.webcitation.org/6WiV1vxoR).
Merging of rain gauge and radar data for urban hydrological modelling
NASA Astrophysics Data System (ADS)
Berndt, Christian; Haberlandt, Uwe
2015-04-01
Urban hydrological processes are generally characterised by short response times and therefore rainfall data with a high resolution in space and time are required for their modelling. In many smaller towns, no recordings of rainfall data exist within the urban catchment. Precipitation radar helps to provide extensive rainfall data with a temporal resolution of five minutes, but the rainfall amounts can be highly biased and hence the data should not be used directly as a model input. However, scientists proposed several methods for adjusting radar data to station measurements. This work tries to evaluate rainfall inputs for a hydrological model regarding the following two different applications: Dimensioning of urban drainage systems and analysis of single event flow. The input data used for this analysis can be divided into two groups: Methods, which rely on station data only (Nearest Neighbour Interpolation, Ordinary Kriging), and methods, which incorporate station as well as radar information (Conditional Merging, Bias correction of radar data based on quantile mapping with rain gauge recordings). Additionally, rainfall intensities that were directly obtained from radar reflectivities are used. A model of the urban catchment of the city of Brunswick (Lower Saxony, Germany) is utilised for the evaluation. First results show that radar data cannot help with the dimensioning task of sewer systems since rainfall amounts of convective events are often overestimated. Gauges in catchment proximity can provide more reliable rainfall extremes. Whether radar data can be helpful to simulate single event flow depends strongly on the data quality and thus on the selected event. Ordinary Kriging is often not suitable for the interpolation of rainfall data in urban hydrology. This technique induces a strong smoothing of rainfall fields and therefore a severe underestimation of rainfall intensities for convective events.
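Of the merging methods listed above, quantile mapping is the most self-contained to sketch: each radar intensity is replaced by the gauge intensity at the same empirical quantile, which removes the systematic overestimation while preserving rank order. The gamma-distributed series below are synthetic stand-ins, not the Brunswick data.

```python
# Sketch of quantile-mapping bias correction of radar rainfall to gauge
# recordings: map each radar value through the radar ECDF onto the gauge
# quantile function.
import numpy as np

rng = np.random.default_rng(3)
gauge = rng.gamma(shape=0.8, scale=4.0, size=2000)        # gauge intensities
radar = 1.6 * rng.gamma(shape=0.8, scale=4.0, size=2000)  # biased-high radar

g_sorted = np.sort(gauge)
r_sorted = np.sort(radar)

def quantile_map(radar_values):
    """Map radar intensities onto the gauge distribution."""
    # empirical quantile of each radar value, then gauge value at that quantile
    q = np.searchsorted(r_sorted, radar_values) / len(r_sorted)
    return np.quantile(g_sorted, np.clip(q, 0.0, 1.0))

corrected = quantile_map(radar)
print(round(radar.mean() / gauge.mean(), 2), round(corrected.mean() / gauge.mean(), 2))
```

Conditional merging goes one step further by also restoring the spatial pattern of the radar field around the corrected values; that step needs a gridded interpolation and is not sketched here.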
Improved rapid magnitude estimation for a community-based, low-cost MEMS accelerometer network
Chung, Angela I.; Cochran, Elizabeth S.; Kaiser, Anna E.; Christensen, Carl M.; Yildirim, Battalgazi; Lawrence, Jesse F.
2015-01-01
Immediately following the Mw 7.2 Darfield, New Zealand, earthquake, over 180 Quake‐Catcher Network (QCN) low‐cost micro‐electro‐mechanical systems accelerometers were deployed in the Canterbury region. Using data recorded by this dense network from 2010 to 2013, we significantly improved the QCN rapid magnitude estimation relationship. The previous scaling relationship (Lawrence et al., 2014) did not accurately estimate the magnitudes of nearby (<35 km) events. The new scaling relationship estimates earthquake magnitudes within 1 magnitude unit of the GNS Science GeoNet earthquake catalog magnitudes for 99% of the events tested, within 0.5 magnitude units for 90% of the events, and within 0.25 magnitude units for 57% of the events. These magnitudes are reliably estimated within 3 s of the initial trigger recorded on at least seven stations. In this report, we present the methods used to calculate a new scaling relationship and demonstrate the accuracy of the revised magnitude estimates using a program that is able to retrospectively estimate event magnitudes using archived data.
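The accuracy figures quoted above (within 1.0, 0.5 and 0.25 magnitude units of the GeoNet catalog) are straightforward threshold fractions; a sketch with invented catalog and QCN estimates, not the Canterbury data:

```python
# Fraction of magnitude estimates within given tolerances of catalog values.
import numpy as np

catalog = np.array([4.1, 5.0, 3.6, 4.8, 5.5, 3.9, 4.4, 6.0])  # invented
qcn_est = np.array([4.3, 4.7, 3.5, 5.1, 5.4, 4.5, 4.4, 5.8])  # invented

err = np.abs(qcn_est - catalog)
for tol in (1.0, 0.5, 0.25):
    frac = (err <= tol).mean()
    print(f"within {tol}: {frac:.0%}")
```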
NASA Astrophysics Data System (ADS)
Retrum, J. B.; Gonzalez, L. A.; Edwards, R.; Cheng, H.; Tincher, S. M.; Urbani, F.
2013-12-01
The dearth of studies and data in the tropics hinders our understanding of atmospheric and oceanic interactions between the low latitudes and the rest of the globe. To better understand these interactions, specifically between the Caribbean and the North Atlantic, three stalagmites were collected from Cueva Zarraga in the Falcón Mountains of northwestern Venezuela and analyzed to determine the local paleoclimatic history. Stalagmite ages were determined by U/Th disequilibrium dating and show a nearly complete Holocene record. The stalagmites have an average temporal resolution of 10.8 years/mm, with a range of 2.1 to 62.7 years. Both the carbon and oxygen isotope records preserve quasi-millennial oscillations and show a major depletion shift from the last glacial period into the Holocene, suggesting warmer and wetter conditions during the Holocene. The preservation of quasi-millennial oscillations and of high-frequency multi-decadal changes by the δ13C record indicates that the soil-vegetation-stalagmite system acts as an amplifier of the signal produced by climatic events and changes. In the early Holocene, the δ18O record shows a depletion trend from ~11,000 to 8,000 cal yr BP before reaching the Holocene Thermal Maximum. A prominent δ18O enrichment event recorded in all the stalagmites corresponds to the 8.2 ka event, which is represented by a double peak with a duration of ~180 years. Other short-term δ18O enrichment events likely correspond to Bond events 1, 2, 5, and 6. The late Holocene record, like other Caribbean records, indicates that the climate system diverges from insolation forcing and may represent an atmospheric rearrangement that resulted in increased ENSO instability or in reduced seasonal movement of the Inter-Tropical Convergence Zone (ITCZ). Today, Cueva Zarraga is at the northern extent of the ITCZ and has two rainy seasons. 
The δ18O enrichment events during the Holocene suggest drier conditions and a southern displacement of the ITCZ, as also suggested by Brazilian speleothem records, which show trends that anti-correlate with Cueva Zarraga. The Cariaco Basin and Cueva Zarraga records show similar trends; the close proximity of Cueva Zarraga to the Cariaco Basin may allow a high-resolution comparison of tropical terrestrial and oceanic climatic responses.
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and on fusion algorithms to associate individual signal detections into event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground-truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g., Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach succeeded in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: (1) it minimizes the use of individual seismic phase detections, since in traditional techniques errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation; (2) it provides a formalized framework that uses information from non-detecting stations; (3) it provides a formalized framework that uses source information, in particular the spectral characteristics of events of interest; (4) it is entirely model-based, i.e., it does not rely on a priori information, which is particularly important for nuclear monitoring; and (5) it does not rely on individualized signal detection thresholds: it is the network solution that matters.
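Under a conditional-independence assumption, combining per-station conditional probabilities with a Bayesian formulation can be sketched as follows. This naive-Bayes illustration is not the actual ProbDet implementation; the prior and station probabilities are hypothetical:

```python
import numpy as np

def network_event_probability(station_probs, prior=0.01):
    """Combine per-station probabilities P(event | data_i) into a
    network-wide posterior, assuming conditional independence between
    stations (naive Bayes in log-odds form). Non-detecting stations
    still contribute, via probabilities below 0.5."""
    p = np.asarray(station_probs, dtype=float)
    # per-station log likelihood ratios, summed across the network
    log_lr = np.sum(np.log(p) - np.log(1.0 - p))
    log_odds = np.log(prior) - np.log(1.0 - prior) + log_lr
    return 1.0 / (1.0 + np.exp(-log_odds))

# Seven stations with strong detections overwhelm a low prior
print(network_event_probability([0.9] * 7))
# Uninformative stations (p = 0.5) leave the posterior at the prior
print(network_event_probability([0.5] * 7))
```

The log-odds form makes explicit how non-detecting stations pull the network posterior down rather than being ignored.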
Attentional Mechanisms in Sports via Brain-Electrical Event-Related Potentials
ERIC Educational Resources Information Center
Hack, Johannes; Memmert, Daniel; Rup, Andre
2009-01-01
In this study, we examined attention processes in complex, sport-specific decision-making tasks without interdependencies from anticipation. Psychophysiological and performance data recorded from advanced and intermediate level basketball referees were compared. Event-related potentials obtained while judging game situations in foul recognition…
New Perspectives on the Frequency and Importance of Tambora-like Events
NASA Astrophysics Data System (ADS)
Verosub, K. L.
2011-12-01
The 1815 Tambora eruption is generally accepted as having had a significant impact on global climate. What is not clear is whether any earlier volcanic eruptions of about the same Volcanic Explosivity Index had similar impacts. The tree-ring record suggests that the 1600 eruption of Huaynaputina volcano in Peru may have been one such event. Although the instrumental record for this eruption is minimal, historical sources provide a wealth of data about its climatic impacts: famines in Russia and Estonia, late harvests in central Europe, and the early onset of winter conditions for lakes in Japan and harbors on the Baltic Sea document that 1601 was indeed a particularly cold and fairly wet year. Extensive Spanish and Jesuit archives covering the Americas and parts of Asia, plus records of the imperial court in China and the shogunate in Japan, should make it possible to obtain a global record of the human and social impacts of the 1600 eruption. The historical record can also be used to determine whether volcanic eruptions produced anomalously cold conditions in 1258 and 1453. Taken together, these four events imply that the return period for Tambora-like events is actually on the order of 200 years, a figure in agreement with estimates from ice core records. Since 196 years have passed since the last such event, it is worth considering what the impacts would be if another event were to occur within the next ten years. In particular, the current global agricultural economy may be less resilient to a very cold year than more regionally based agriculture was in 1816.
NASA Astrophysics Data System (ADS)
Dix, Katherine; Greenhalgh, Stewart
2014-05-01
Macroseismic data in the form of felt reports of earthquake shaking is vital to seismic hazard assessment, especially in view of the relatively short period of instrumental recording in many countries. During the early 1990s, we conducted a very detailed examination of historical earthquake records held in the State Government archives and the Public Library (newspaper accounts) of South Australia. This work resulted in the compilation of a list of just over 460 earthquakes in the period prior to seismic network recording, which commenced in 1963. A single Milne (and later Milne-Shaw) seismograph had been operated in Adelaide from 1908 to 1948 to record worldwide events but it was not suitable for studying local seismic activity. The majority of the historical events uncovered had escaped mention in any previous publications on South Australian seismicity and seismic risk. This historical earthquake research, including the production of a large number of isoseismal maps to enable quantification in terms of magnitude and location, appears to have been the only study of its kind in South Australia performed so comprehensively, and resulted in the most extensive list available. After 20 years, it still stands as the definitive list of historical earthquake events in the State. The incorporation of these additional historical events into the South Australian Earthquake Catalogue maintained by the SA Department of Primary Industries and Resources had the potential to raise the previous listing of just 49 pre-instrumental events to 511 earthquakes, and to extend the record back another 46 years to 1837, the date the colony of South Australia was proclaimed. Some of the major events have been formally included in the South Australian Earthquake Catalogue. However, for many events there was insufficient information and/or time to finalize the source parameters due to the onerous task of manually trawling through historical records and newspapers for felt reports. 
With the advent of the information age, researching historical newspapers and records has become a feasible undertaking, although such accounts are biased by the population distribution and the history of newspaper operations in the emerging colony. To provide an example of what is possible, we recovered reports of an additional 110 previously unrecognized earthquakes during the first 50 years of European settlement of South Australia, from digitized SA newspapers recently made available on the National Library of Australia's TROVE website. This was done in a relatively short period of time, and the South Australian Historical Earthquake List incorporating these events now comprises some 679 entries. This research builds upon and consolidates the work commenced 20 years ago, and proposes the establishment of flexible, convenient computerized processes to maintain an increasingly accurate and complete record of historical earthquakes in South Australia well into the future. This work may also provide a model for the ongoing development of historical earthquake records in other states and territories of Australia.
NASA Astrophysics Data System (ADS)
Ziegler, A.; Balch, R. S.; van Wijk, J.
2015-12-01
Farnsworth Oil Field in North Texas hosts an ongoing carbon capture, utilization, and storage project. This study is focused on passive seismic monitoring at the carbon injection site to measure, locate, and catalog any induced seismic events. A Geometrics Geode system is being utilized for continuous recording of the passive seismic downhole borehole array in a monitoring well. The array consists of 3-component dual Geospace OMNI-2400 15 Hz geophones with a vertical spacing of 30.5 m. Downhole temperature and pressure are also monitored. Seismic data are recorded continuously at a rate of over 900 GB per month and must be archived and reviewed. A Short Term Average/Long Term Average (STA/LTA) algorithm was evaluated for its ability to search for events, including identification and quantification of any false positives. It was determined that the algorithm was not appropriate for event detection given the background noise level at the field site and the recording equipment as configured; alternatives are being investigated. The intended outcome of the passive seismic monitoring is to mine the continuous database, develop a catalog of microseismic events and locations, and determine whether there is any relationship to CO2 injection in the field. Locating any microseismic events will allow correlation with carbon injection locations and previously characterized geological and structural features such as faults and paleoslopes. Additionally, the borehole array has recorded over 1200 active sources, with three sweeps at each source location, acquired during a nearby 3D VSP. These data were evaluated for their usability and location within an effective radius of the array, stacked to improve the signal-to-noise ratio, and used to calibrate a full-field velocity model to enhance event location accuracy. Funding for this project is provided by the U.S. Department of Energy under Award No. DE-FC26-05NT42591.
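For reference, the classic STA/LTA trigger evaluated above compares a short-term average of signal power against a long-term average; a minimal NumPy sketch follows. The window lengths and synthetic trace are illustrative, not the field configuration:

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Classic STA/LTA: ratio of a short-term average to a long-term
    average of signal power (squared amplitude), computed with
    cumulative sums. Returns the ratio where both windows are full."""
    power = np.asarray(trace, dtype=float) ** 2
    csum = np.cumsum(np.insert(power, 0, 0.0))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    n = min(len(sta), len(lta))          # align both windows at the
    return sta[-n:] / lta[-n:]           # same ending sample

# Synthetic trace: Gaussian noise with a transient burst ("event")
rng = np.random.default_rng(1)
trace = rng.normal(0, 1, 2000)
trace[1200:1300] += rng.normal(0, 8, 100)
ratio = sta_lta(trace, nsta=20, nlta=1000)  # ratio spikes at the burst onset
```

A detection would be declared wherever the ratio exceeds a chosen threshold; as the abstract notes, a noisy site can make any fixed threshold produce false positives.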
A New Patient Record System Using the Laser Card
Brown, J.H.U.; Vallbona, Carlos
1988-01-01
A method of handling medical data in the form of patient records, including physical findings such as x-rays, has been devised using a laser card coupled to a PC for data input and output. A satisfactory software system, which encompasses a formalized medical record system organized by events rather than by chronological order of entry, has been devised and is now under test in a community health clinic. Future directions of the card research are discussed and expanded upon.
An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data
Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
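The first step described, converting a time series into time-interval sequences of temporal abstractions, can be illustrated with a minimal value-abstraction sketch. The thresholds and measurements below are hypothetical, not from the paper's clinical data:

```python
def abstract_series(times, values, thresholds=(90, 110)):
    """Convert a numeric time series into maximal intervals labelled
    Low / Normal / High (a simple value abstraction).

    Returns a list of (label, start_time, end_time) tuples; adjacent
    samples with the same label are merged into one interval."""
    low, high = thresholds

    def label(v):
        return "Low" if v < low else ("High" if v > high else "Normal")

    intervals = []
    for t, v in zip(times, values):
        lab = label(v)
        if intervals and intervals[-1][0] == lab:
            intervals[-1] = (lab, intervals[-1][1], t)  # extend interval
        else:
            intervals.append((lab, t, t))               # open new interval
    return intervals

# Hypothetical lab measurements at times 0..5
times = [0, 1, 2, 3, 4, 5]
values = [85, 88, 100, 120, 125, 95]
print(abstract_series(times, values))
# → [('Low', 0, 1), ('Normal', 2, 2), ('High', 3, 4), ('Normal', 5, 5)]
```

Temporal operators (e.g., "Low before High") would then combine such intervals into the candidate patterns the mining algorithm searches.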
NASA Astrophysics Data System (ADS)
Kuzma, H. A.; Arehart, E.; Louie, J. N.; Witzleben, J. L.
2012-04-01
Listening to the waveforms generated by earthquakes is not new. The recordings of seismometers have been sped up and played to generations of introductory seismology students, published on educational websites and even included in the occasional symphony. The modern twist on earthquakes as music is an interest in using state-of-the-art computer algorithms for seismic data processing and evaluation. Algorithms such as Hidden Markov Models, Bayesian Network models and Support Vector Machines have been highly developed for applications in speech recognition, and might also be adapted for automatic seismic data analysis. Over the last three years, the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has supported an effort to apply computer learning and data mining algorithms to IDC data processing, particularly to the problem of weeding through automatically generated event bulletins to find events that are non-physical and would otherwise have to be eliminated by hand by highly trained human analysts. Analysts are able to evaluate events, distinguish between phases, pick new phases and build new events by looking at waveforms displayed on a computer screen. Human ears, however, are much better suited to waveform processing than the eyes. Our hypothesis is that combining an auditory representation of seismic events with visual waveforms would reduce the time it takes to train an analyst and the time they need to evaluate an event. Since it takes almost two years for a person of extraordinary diligence to become a professional analyst and IDC contracts are limited to seven years by Treaty, faster training would significantly improve IDC operations. Furthermore, once a person learns to distinguish between true and false events by ear, various forms of audio compression can be applied to the data. 
The compression scheme that yields the smallest data set in which relevant signals can still be heard is likely an excellent candidate from which to draw features that can be fed into machine learning algorithms, since it contains a compact numerical representation of the information that humans need to evaluate events. The challenge in this work is that, although it is relatively easy to pick out earthquake arrivals in waveform data from a single station, when stations are combined the addition of background noise tends to confuse and overwhelm the listener. To solve this problem, we rely on techniques, such as slowing down recordings without altering the pitch, that are used by ethnomusicologists to understand highly complex rhythms and sounds. We work with professional musicians and recording engineers to mix the data from different seismic stations in a way that reduces noise and preserves the uniqueness of each station.
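Slowing a recording without altering its pitch is a time-stretching operation; a toy overlap-add (OLA) sketch is shown below. Production tools use phase vocoders or synchronized overlap-add, and nothing here reflects the authors' actual mixing workflow:

```python
import numpy as np

def ola_stretch(signal, rate, frame=1024, hop_out=256):
    """Toy overlap-add time stretch: read windowed analysis frames at
    hop rate*hop_out, write them at hop hop_out. rate < 1 slows the
    signal down without resampling, so the pitch is not shifted."""
    window = np.hanning(frame)
    hop_in = rate * hop_out
    n_frames = int((len(signal) - frame) / hop_in)
    out = np.zeros(n_frames * hop_out + frame)
    norm = np.zeros_like(out)
    for i in range(n_frames):
        start = int(i * hop_in)
        grain = signal[start:start + frame] * window
        out[i * hop_out:i * hop_out + frame] += grain
        norm[i * hop_out:i * hop_out + frame] += window
    return out / np.maximum(norm, 1e-8)  # compensate window overlap

# A 1 s, 440 Hz tone stretched to roughly twice its length
tone = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100.0)
slow = ola_stretch(tone, rate=0.5)
```

OLA introduces audible artifacts at grain boundaries; phase vocoders avoid them by aligning phases across frames, which is why they are preferred in practice.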
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiogama, Hideo; Imada, Yukiko; Mori, Masato
Here, we describe two unprecedentedly large (100-member), long-term (61-year) ensembles based on MRI-AGCM3.2, which were driven by historical and non-warming climate forcing. These ensembles comprise the "Database for Policy Decision making for Future climate change (d4PDF)". We compare these ensembles to large ensembles based on another climate model, as well as to observed data, to investigate the influence of anthropogenic activities on historical changes in the numbers of record-breaking events, including the annual coldest daily minimum temperature (TNn), the annual warmest daily maximum temperature (TXx) and the annual most intense daily precipitation event (Rx1day). These two climate model ensembles indicate that human activity has already had statistically significant impacts on the number of record-breaking extreme events worldwide, mainly over Northern Hemisphere land. Specifically, human activities have altered the likelihood that a wider area globally would suffer record-breaking TNn, TXx and Rx1day events than that observed over the 2001-2010 period by factors of at least 0.6, 5.4 and 1.3, respectively. However, we also find that the estimated spatial patterns and amplitudes of anthropogenic impacts on the probabilities of record-breaking events are sensitive to the climate model and/or natural-world boundary conditions used in the attribution studies.
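A record-breaking event in this sense is a running extremum of an annual series; counting such records is a one-pass operation. The sketch below uses synthetic series with an illustrative warming trend, not d4PDF output:

```python
import random

def count_records(series):
    """Count record-breaking (running-maximum) events in a series."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records, best = records + 1, x
    return records

# For a stationary iid series of n years, the expected record count is
# the harmonic number H_n = 1 + 1/2 + ... + 1/n; a warming trend
# inflates the count for warm extremes such as TXx.
random.seed(42)
n = 61  # one 61-year ensemble member
stationary = [random.gauss(0, 1) for _ in range(n)]
trended = [random.gauss(0.05 * t, 1) for t in range(n)]
print(count_records(stationary), count_records(trended))
```

Comparing counts between forced and non-warming ensemble members is the essence of the attribution comparison the abstract describes.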
Crooks, Colin John; Card, Timothy Richard; West, Joe
2012-11-13
Primary care records from the UK have frequently been used to identify episodes of upper gastrointestinal bleeding in studies of drug toxicity because of their comprehensive population coverage and longitudinal recording of prescriptions and diagnoses. Recent linkage within England of primary and secondary care data has augmented these data, but the timing and coding of concurrent events, and how the definition of events in linked data affects estimated occurrence and 28-day mortality, are not known. We used the recently linked English Hospital Episode Statistics and General Practice Research Database, 1997-2010, to define events by: a specific upper gastrointestinal bleed code in either dataset, a specific bleed code in both datasets, or a less specific but plausible code from the linked dataset. With this approach, 81% of secondary-care-defined bleeds had a corresponding plausible code within 2 months in primary care, but only 62% of primary-care-defined bleeds had a corresponding plausible HES admission within 2 months. The more restrictive and specific case definitions excluded severe events and almost halved the 28-day case fatality compared with broader, more sensitive definitions. Restrictive definitions of gastrointestinal bleeding in linked datasets fail to capture the full heterogeneity in coding possible following complex clinical events; conversely, too broad a definition in primary care introduces events not severe enough to warrant hospital admission. Ignoring these issues may unwittingly introduce selection bias into a study's results.
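The 2-month linkage check described above reduces to asking, for each event in one dataset, whether the other dataset contains a plausible code within roughly ±61 days. A standard-library sketch with hypothetical dates, not the study's data:

```python
from datetime import date

def has_match(event_date, other_dates, window_days=61):
    """True if any event in the other dataset falls within
    +/- window_days of event_date (a ~2-month window)."""
    return any(abs((event_date - d).days) <= window_days
               for d in other_dates)

# Hypothetical bleed events coded in secondary (HES) and primary care
hes = [date(2005, 3, 10), date(2007, 8, 2)]
primary = [date(2005, 3, 25), date(2009, 1, 15)]

linked = [e for e in hes if has_match(e, primary)]
print(linked)  # only the 2005 admission has a nearby primary care code
```

Repeating the comprehension in both directions yields the asymmetric match percentages (81% vs. 62%) the abstract reports.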
PS1-41: Just Add Data: Implementing an Event-Based Data Model for Clinical Trial Tracking
Fuller, Sharon; Carrell, David; Pardee, Roy
2012-01-01
Background/Aims Clinical research trials often have similar fundamental tracking needs, despite being quite variable in their specific logic and activities. A model tracking database that can be quickly adapted by a variety of studies has the potential to achieve significant efficiencies in database development and maintenance. Methods Over the course of several different clinical trials, we have developed a database model that is highly adaptable to a variety of projects. Rather than hard-coding each specific event that might occur in a trial, along with its logical consequences, this model considers each event and its parameters to be a data record in its own right. Each event may have related variables (metadata) describing its prerequisites, subsequent events due, associated mailings, or events that it overrides. The metadata for each event is stored in the same record with the event name. When changes are made to the study protocol, no structural changes to the database are needed. One has only to add or edit events and their metadata. Changes in the event metadata automatically determine any related logic changes. In addition to streamlining application code, this model simplifies communication between the programmer and other team members. Database requirements can be phrased as changes to the underlying data, rather than to the application code. The project team can review a single report of events and metadata and easily see where changes might be needed. In addition to benefitting from streamlined code, the front end database application can also implement useful standard features such as automated mail merges and to do lists. Results The event-based data model has proven itself to be robust, adaptable and user-friendly in a variety of study contexts. We have chosen to implement it as a SQL Server back end and distributed Access front end. Interested readers may request a copy of the Access front end and scripts for creating the back end database. 
Discussion An event-based database with a consistent, robust set of features has the potential to significantly reduce development time and maintenance expense for clinical trial tracking databases.
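The events-as-data idea can be sketched in a few lines: each event is a record whose metadata (prerequisites, subsequent events due, associated mailing) drives the logic, so a protocol change is a data edit rather than a schema or code change. The event names and fields below are illustrative, not the authors' schema:

```python
# Each event is a data record; its metadata determines study logic.
EVENTS = {
    "consent": {
        "prerequisites": [], "next_due": ["baseline_survey"],
        "mailing": "welcome_letter"},
    "baseline_survey": {
        "prerequisites": ["consent"], "next_due": ["followup_6mo"],
        "mailing": None},
    "followup_6mo": {
        "prerequisites": ["baseline_survey"], "next_due": [],
        "mailing": "reminder_letter"},
}

def record_event(history, name):
    """Record an event for a participant if its prerequisites are met;
    return the list of subsequent events now due."""
    meta = EVENTS[name]
    missing = [p for p in meta["prerequisites"] if p not in history]
    if missing:
        raise ValueError(f"cannot record {name}; missing {missing}")
    history.append(name)
    return meta["next_due"]

history = []
record_event(history, "consent")                 # returns ['baseline_survey']
print(record_event(history, "baseline_survey"))  # → ['followup_6mo']
```

Adding a new protocol step means adding one entry to `EVENTS`; `record_event` itself never changes, which mirrors the maintenance benefit the abstract claims.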
Tales from the South (and West) Pacific in the Common Era: A Climate Proxy Perspective (Invited)
NASA Astrophysics Data System (ADS)
Quinn, T. M.; Taylor, F. W.; Partin, J. W.; Maupin, C. R.; Hereid, K. A.; Gorman, M. K.
2010-12-01
The southwest Pacific is a major source of tropical climate variability through heat and moisture exchanges associated with the Western Pacific Warm Pool (WPWP) and the South Pacific Convergence Zone (SPCZ). These variations are especially significant at the annual, interannual (El Niño-Southern Oscillation, ENSO), and multi-decadal timescales. Gridded SST data products are available in the pre-satellite era in this region for the past ~130 years, although data density is a significant issue for the older half of these records. Time series of salinity (SSS) and rainfall from this region are exceedingly rare. Thus, climate proxy records must be used to reconstruct SST, SSS, and rainfall variations in the Common Era (CE) in the tropical Pacific. The analytical laboratory for paleoclimate studies at UT has focused its research efforts on producing climate proxy time series from the southwest tropical Pacific using modern and fossil corals, and speleothems. Our most recent results are summarized in this presentation, although much of this work is still in progress. Coral climate records have been generated from Sabine Bank, Vanuatu (16°S, 166°E) and Misima Island, Papua New Guinea (10.6°S, 152.8°E). The Vanuatu coral record of monthly resolved Sr/Ca variations extends back to the late 18th century. All strong ENSO warm phase events of the 20th century observed in the instrumental record are also observed in the coral record. We note that several ENSO warm phase events in the 19th century portion of the coral record are comparable in size to those recorded in response to the 1982/1983 and 1997/1998 events. The Misima coral record of monthly resolved δ18O and Sr/Ca variations spans the interval ~1414-1645 CE, the heart of the Little Ice Age. Amplitude modulation of interannual variability is observed in this LIA record, much like what is observed during the relatively quiescent period of 1920-1950 in the 20th century instrumental and proxy records of ENSO. 
However, the amplitude of individual ENSO warm phase events in the LIA record is reduced, relative to that of the 1941/1942 ENSO warm phase events observed in a near modern coral record from Misima. Speleothem climate records have been generated from Espirito Santo, Vanuatu (15.5°S, 167°E) and Guadalcanal, Solomon Islands (~9°S, 160°E). The Vanuatu record of δ18O variations is from a fast-growing speleothem (~1-3 mm/year), which yields a record of rainfall variability spanning ~1670-2005 CE, as dated by U-Th disequilibrium techniques. Interannual changes in speleothem δ18O appear to capture ENSO events and subsequent reorganizations of the SPCZ. The Vanuatu speleothem δ18O record also exhibits concentrations of variance on the decadal scale. The Guadalcanal record of δ18O variations is also from a fast-growing speleothem (~1-4 mm/year), which yields a record of rainfall variability spanning ~1650-2010 CE, as dated by U-Th disequilibrium techniques. The δ18O records from both of these stalagmites provide evidence for changes in convection in the equatorial WPWP region of the SPCZ: the rising limb of the Pacific Walker Circulation.
77 FR 47552 - Event Data Recorders
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-09
... uncertainties in multiple event crashes; Revised certain sensor ranges and accuracies to reflect current state... resolution specification of 5 degrees. In its petition the Alliance stated that steering wheel angle sensors... angle sensors. Both Nissan and GAM submitted comments in support of the Alliance and Honda petitions to...
Precise dating of Dansgaard-Oeschger climate oscillations in western Europe from stalagmite data.
Genty, D; Blamart, D; Ouahdi, R; Gilmour, M; Baker, A; Jouzel, J; Van-Exter, Sandra
2003-02-20
The signature of Dansgaard-Oeschger events (millennial-scale abrupt climate oscillations during the last glacial period) is well established in ice cores and marine records. But the effects of such events in continental settings are not as clear, and their absolute chronology is uncertain beyond the limit of 14C dating and annual layer counting for marine records and ice cores, respectively. Here we present carbon and oxygen isotope records from a stalagmite collected in southwest France which have been precisely dated using 234U/230Th ratios. We find rapid climate oscillations coincident with the established Dansgaard-Oeschger events between 83,000 and 32,000 years ago in both isotope records. The oxygen isotope signature is similar to a record from Soreq cave, Israel, and deep-sea records, indicating the large spatial scale of the climate oscillations. The signal in the carbon isotopes gives evidence of drastic and rapid vegetation changes in western Europe, an important site in human cultural evolution. We also find evidence for a long phase of extremely cold climate in southwest France between 61.2 ± 0.6 and 67.4 ± 0.9 kyr ago.
Enriching Great Britain's National Landslide Database by searching newspaper archives
NASA Astrophysics Data System (ADS)
Taylor, Faith E.; Malamud, Bruce D.; Freeborough, Katy; Demeritt, David
2015-11-01
Our understanding of where landslide hazard and impact will be greatest is largely based on our knowledge of past events. Here, we present a method to supplement existing records of landslides in Great Britain by searching an electronic archive of regional newspapers. In Great Britain, the British Geological Survey (BGS) is responsible for updating and maintaining records of landslide events and their impacts in the National Landslide Database (NLD). The NLD contains records of more than 16,500 landslide events in Great Britain. Data sources for the NLD include field surveys, academic articles, grey literature, news, public reports and, since 2012, social media. We aim to supplement the richness of the NLD by (i) identifying additional landslide events, (ii) acting as an additional source of confirmation of events existing in the NLD and (iii) adding more detail to existing database entries. This is done by systematically searching the Nexis UK digital archive of 568 regional newspapers published in the UK. In this paper, we construct a robust Boolean search criterion by experimenting with landslide terminology for four training periods. We then apply this search to all articles published in 2006 and 2012. This resulted in the addition of 111 records of landslide events to the NLD over the 2 years investigated (2006 and 2012). We also find that we were able to obtain information about landslide impact for 60-90% of landslide events identified from newspaper articles. Spatial and temporal patterns of additional landslides identified from newspaper articles are broadly in line with those existing in the NLD, confirming that the NLD is a representative sample of landsliding in Great Britain. This method could now be applied to more time periods and/or other hazards to add richness to databases and thus improve our ability to forecast future events based on records of past events.
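A Boolean search criterion of the kind constructed here can be sketched with regular expressions; the terms below are illustrative and are not the Nexis UK query the authors derived:

```python
import re

# Illustrative criterion: require at least one landslide term, and
# exclude metaphorical uses such as "landslide victory".
LANDSLIDE = re.compile(r"\b(landslide|landslip|mudslide|rockfall)\b",
                       re.IGNORECASE)
METAPHOR = re.compile(r"\blandslide\s+(victory|win|defeat|majority)\b",
                      re.IGNORECASE)

def matches(article_text):
    """True if the article plausibly reports a landslide event."""
    return bool(LANDSLIDE.search(article_text)) \
        and not METAPHOR.search(article_text)

articles = [
    "A landslip closed the A59 near Kex Gill after heavy rain.",
    "The party won a landslide victory in the by-election.",
    "Storms batter the coast; ferry services cancelled.",
]
print([matches(a) for a in articles])  # → [True, False, False]
```

In practice such a criterion is tuned on training periods, exactly as the paper describes, to balance missed events against false matches.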
NASA Astrophysics Data System (ADS)
Mencin, D.; Hodgkinson, K. M.; Mattioli, G. S.; Johnson, W.; Gottlieb, M. H.; Meertens, C. M.
2016-12-01
Three-component strainmeter data from numerous borehole strainmeters (BSM) along the San Andreas Fault (SAF), including those installed and maintained as part of the EarthScope Plate Boundary Observatory (PBO), demonstrate that the characteristics of creep propagation events with sub-cm slip amplitudes can be quantified for slip events at 10 km source-to-sensor distances. The strainmeters are installed at depths of approximately 100-250 m and record data at a rate of 100 samples per second. Noise levels at periods of less than a few minutes are 10⁻¹¹ strain, and for periods in the bandwidth of hours to weeks, the periods of interest in the search for slow slip events, noise levels are of the order of 10⁻⁸ to 10⁻¹⁰ strain. Strainmeters, creepmeters, and tiltmeters have been operated along the San Andreas Fault, observing creep events for decades. BSM data proximal to the SAF cover a significant temporal portion of the inferred earthquake cycle along this portion of the fault. A single instrument is capable of providing broad-scale constraints on creep event asperity size, location, and depth, and moreover can capture slow slip, coseismic rupture, and afterslip. The synthesis of these BSM data presents a unique opportunity to constrain the partitioning between aseismic and seismic slip on the central SAF. We show that the creepmeters confirm that creep events imaged by the strainmeters, previously catalogued by the authors, are indeed occurring on the SAF and are simultaneously being recorded on local creepmeters. We further show that simple models allow us to loosely constrain the location and depth of a creep event on the fault, even with a single instrument, and to image the accumulation and behavior of surface and crustal creep with time.
The features of radiation dose variations onboard ISS and Mir space station: comparative study.
Tverskaya, L V; Panasyuk, M I; Reizman, S Ya; Sosnovets, E N; Teltsov, M V; Tsetlin, V V
2004-01-01
The dynamics of ISS-measured radiation dose variations since August 2000 is studied, using data obtained with the R-16 instrument, which consists of two ionization chambers behind different shielding thicknesses. The doses recorded during solar energetic particle (SEP) events are compared with data obtained, also by R-16, on the Mir space station. The SEP events in the solar maximum of the current cycle make a much smaller contribution to the radiation dose than the October 1989 event recorded on the Mir space station, in which the proton intensity peaked during a strong magnetic storm. The storm-time effect of decreased solar proton geomagnetic cutoffs on dose variations is estimated. Dose variations on the Mir space station due to the formation of a new radiation belt of high-energy protons and electrons during the sudden commencement of the March 24, 1991 storm are also studied. For the first time in the ISS and Mir dose measurement period, the counting rates recorded by both R-16 channels on the ISS in 2001-2002 were nearly the same during some time intervals; this effect may arise from decreases of relativistic electron fluxes in the outer radiation belt.
Influenza Seasonal Summary: Departments of the Navy and Defense 2015-2016
2016-08-01
System (CHCS) (laboratory, pharmacy, and radiology data), inpatient admission records, ambulatory medical encounter records, and vaccination records...had a medical event report (MER). Active Duty and Recruits Influenza activity among AD and recruit Sailors and Marines was similar to overall DON...considered immune). Geographic Distribution Influenza activity among DON beneficiaries was greatest at Naval Medical Center (NMC) San Diego; other
Rapid Offline-Online Post-Disaster Landslide Mapping Tool: A case study from Nepal
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya
2016-04-01
One of the crucial components of post-disaster management is the efficient mapping of impacted areas. Here we present a tool designed to map landslides and affected objects after the 2015 earthquakes in Nepal, as well as the impacts of intense rainfall. Because internet access is not available in many rural areas of Nepal, we developed an offline-online prototype based on open-source WebGIS technologies to make data on hazard impacts, including damaged infrastructure, landslides, or flooding events, available to authorities and the general public. This mobile application was designed as a low-cost, rapid, and participatory method for recording impacts from hazard events. Events can be recorded offline and uploaded to a server wherever an internet connection is available. The application supports user authentication, image capture, and the collection of information such as geolocation, event description, and interactive mapping, storing all the data on the server for further analysis and visualisation. It can be accessed from an Android mobile phone or tablet as a hybrid application supporting both offline and online use. The offline version includes an interactive offline map function that allows users to upload satellite images to improve ground-truth interpretation. After geolocation, the user can start mapping and save the recorded data into Geojson-TXT files that can be easily uploaded to the server whenever internet is available. This prototype was tested for a rapid assessment of landslides and relevant land-use characteristics such as roads, forest areas, and rivers in the Phewa Lake watershed near Pokhara, Nepal, where a large number of landslides were activated or reactivated after the 2015 monsoon season. More than 60 landslides were recorded during a two-day field trip. The application can also be used for other kinds of hazard events, such as floods or avalanches.
Keywords: Offline, Online, Open source, WebGIS, Android, Post-Disaster, Landslide mapping
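The offline recording workflow described above (capture geolocation and a description, store locally, upload when a connection exists) can be sketched in Python as a plain GeoJSON Feature builder. This is an illustrative sketch only; the property names and file layout are assumptions, not the tool's actual Geojson-TXT schema:

```python
import json
import time

def make_event_feature(lat, lon, description, event_type="landslide"):
    """Package one field observation as a GeoJSON Feature (hypothetical schema)."""
    return {
        "type": "Feature",
        # GeoJSON stores coordinates as [longitude, latitude]
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {
            "event_type": event_type,
            "description": description,
            "recorded_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
        },
    }

def to_offline_payload(features):
    """Serialize a FeatureCollection to a JSON string for local storage
    and later upload to the server."""
    return json.dumps({"type": "FeatureCollection", "features": features})

# Example: one record captured in the field near Pokhara (coordinates illustrative)
feature = make_event_feature(28.22, 83.95, "Reactivated landslide blocking road")
payload = to_offline_payload([feature])
```

Storing each record as a self-describing Feature keeps the offline file directly loadable by standard WebGIS clients once it reaches the server.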
NASA Astrophysics Data System (ADS)
Roland, E. C.; Walton, M. A. L.; Ruppert, N. A.; Gulick, S. P. S.; Christeson, G. L.; Haeussler, P. J.
2014-12-01
In January 2013, a Mw 7.5 earthquake ruptured a segment of the Queen Charlotte Fault offshore of the town of Craig in southeast Alaska. The region of the fault that slipped during the Craig earthquake is adjacent to, and possibly overlapping with, the northern extent of the 1949 M 8.1 Queen Charlotte earthquake rupture (Canada's largest recorded earthquake), and is just south of the rupture area of the 1972 M 7.6 earthquake near Sitka, Alaska. Here we present aftershock locations and focal mechanisms for events that occurred in the four months following the mainshock, using data recorded on an Ocean Bottom Seismometer (OBS) array that was deployed offshore of Prince of Wales Island. This array consisted of 9 short-period instruments surrounding the fault segment and recorded hundreds of aftershocks during April and May 2013. In addition to highlighting the primary mainshock rupture plane, aftershocks also appear to be occurring along secondary fault structures adjacent to the main fault trace, illuminating complicated structure, particularly toward the northern extent of the Craig rupture. Focal mechanisms for the larger events recorded during the OBS deployment show both near-vertical strike-slip motion consistent with the mainshock mechanism and events with varying strike and a component of normal faulting. Although fault structure along this northern segment of the QCF appears to be considerably simpler than to the south, where a higher degree of oblique convergence leads to sub-parallel compressional deformation structures, secondary faulting structures apparent in legacy seismic reflection data near the Craig rupture may be consistent with the observed seismicity patterns. In combination, these data may help to characterize structural heterogeneity along the northern segment of the Queen Charlotte Fault that contributes to rupture segmentation during large strike-slip events.
Self-Consistency of Rain Event Definitions
NASA Astrophysics Data System (ADS)
Teves, J. B.; Larsen, M.
2014-12-01
A dense optical rain disdrometer array was constructed to study rain variability on spatial scales of less than 100 meters with a temporal resolution of 1 minute. Approximately two months of data were classified into rain events using methods common in the literature. These methods were unable to produce an array-wide consensus on the total number of rain events; instruments as little as 2 meters apart with similar data records sometimes identified different rain event totals. Physical considerations suggest that these differing event totals are likely due to instrument sampling fluctuations that are typically not accounted for in rain event studies. Detection of varying numbers of rain events impacts many commonly used storm statistics, including storm duration distributions and mean rain rate. A summary of these results and their implications is presented.
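A common event-definition criterion of the kind the abstract alludes to segments a gauge's rain-rate series into events separated by a minimum inter-event dry time. A minimal sketch, assuming 1-minute records and a 30-minute dry-time threshold (the threshold is an illustrative parameter, not one reported by the study):

```python
def segment_events(rain_rates, min_dry_minutes=30):
    """Split a 1-minute rain-rate series into events: an event ends once
    at least `min_dry_minutes` consecutive dry (zero-rate) minutes pass.
    Returns a list of events, each a list of wet-minute indices."""
    events, current, dry_run = [], [], 0
    for i, rate in enumerate(rain_rates):
        if rate > 0:
            current.append(i)
            dry_run = 0
        elif current:
            dry_run += 1
            if dry_run >= min_dry_minutes:
                events.append(current)   # dry spell long enough: close event
                current, dry_run = [], 0
    if current:
        events.append(current)           # close any trailing event
    return events
```

Because each instrument samples slightly different drop populations, two gauges a few meters apart can cross the dry-time threshold differently and so report different event totals, which is precisely the inconsistency the study quantifies.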
NASA Astrophysics Data System (ADS)
Kiefer, J.; Karamperidou, C.
2017-12-01
Clastic sediment flux into high-elevation Andean lakes is controlled by glacial processes and soil erosion caused by high precipitation events, making these lakes suitable archives of past climate. Accordingly, sediment records from Laguna Pallcacocha in Ecuador have been interpreted as proxies of ENSO variability, owing to increased precipitation in the greater region during El Niño events. However, the location of the lake's watershed, the presence of glaciers, and the different impacts of ENSO on precipitation in the eastern vs. western Andes have challenged the suitability of the Pallcacocha record as an ENSO proxy. Here, we employ WRF, a high-resolution regional mesoscale weather prediction model, to investigate the circulation dynamics, sources of moisture, and resulting precipitation response in the L. Pallcacocha region during different flavors of El Niño and La Niña events, and in the presence or absence of ice caps. In particular, we investigate Eastern Pacific (EP), Central Pacific (CP), coastal El Niño, and La Niña events. We validate the model simulations against spatially interpolated station measurements and reanalysis data. We find that during EP events moisture is primarily advected from the Pacific, whereas during CP events moisture primarily originates from the Atlantic. More moisture is available during EP events, which implies higher precipitation rates. Furthermore, we find that precipitation during EP events is mostly non-convective, in contrast to the primarily convective precipitation during CP events. Finally, a synthesis of the sedimentary record and the EP:CP ratio of accumulated precipitation and specific humidity in the L. Pallcacocha region allows us to assess whether past changes in the relative frequency of the two ENSO flavors may have been recorded in paleoclimate archives in this region.
Methods for automatic trigger threshold adjustment
Welch, Benjamin J; Partridge, Michael E
2014-03-18
Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
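The scheme described (thresholds expressed as offsets from a periodically re-measured quiescent level, plus a qualification-width counter that requires the criterion to hold for several consecutive samples) can be sketched as follows. All parameter values and the quiescent-level averaging rule are illustrative assumptions, not the patented method:

```python
class DriftCompensatedTrigger:
    """Sketch of a drift-compensated trigger: the effective threshold is
    (quiescent level + offset), the quiescent level is periodically
    re-estimated, and `qual_width` consecutive over-threshold samples
    are required before a data recording event is initiated."""

    def __init__(self, offset, qual_width, remeasure_every=1000):
        self.offset = offset                  # trigger offset above quiescent level
        self.qual_width = qual_width          # consecutive samples required
        self.remeasure_every = remeasure_every  # counter-based re-computation criterion
        self.quiescent = 0.0
        self.count = 0
        self.samples_seen = 0
        self.recent = []

    def process(self, sample):
        """Feed one sample; return True when a qualified trigger occurs."""
        self.samples_seen += 1
        self.recent.append(sample)
        # Periodically re-estimate the quiescent level from recent samples
        # (a crude mean; a real system would exclude triggered intervals).
        if self.samples_seen % self.remeasure_every == 0:
            self.quiescent = sum(self.recent) / len(self.recent)
            self.recent = []
        if sample > self.quiescent + self.offset:
            self.count += 1
            if self.count >= self.qual_width:
                return True
        else:
            self.count = 0                    # qualification must be consecutive
        return False
```

A single noise spike no longer fires the trigger, and slow baseline drift raises or lowers the effective threshold automatically instead of creating false triggering conditions.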
Contribution of Infrasound to IDC Reviewed Event Bulletin
NASA Astrophysics Data System (ADS)
Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif Mohamed; Medinskaya, Tatiana; Mialle, Pierrick
2016-04-01
Until 2003, two waveform technologies, seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003, but infrasound detections could not be used by the Global Association (GA) software due to the unmanageably high number of spurious associations. Offline improvements of the automatic processing took place to reduce the number of false detections to a reasonable level. In February 2010, the infrasound technology was reintroduced to IDC operations and has since contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Examples of REB events detected by the International Monitoring System (IMS) infrasound network include fireballs (e.g., the Bangkok fireball, 2015), volcanic eruptions (e.g., Calbuco, Chile, 2015), and large surface explosions (e.g., Tianjin, China, 2015). Quarry blasts and large earthquakes belong to events primarily recorded at seismic stations of the IMS network but are often detected at the infrasound stations. The presence of an infrasound detection associated with an event from a mining area indicates a surface explosion. Satellite imaging and a database of active mines can be used to confirm the origin of such events. This presentation will summarize the contribution of 6 years of infrasound data to IDC bulletins and provide examples of events recorded at the IMS infrasound network. Results of this study may help to improve the location of small events with observations from infrasound stations.
NASA Astrophysics Data System (ADS)
Caley, Thibaut; Malaizé, Bruno; Bassinot, Franck; Clemens, Steven C.; Caillon, Nicolas; Rossignol, Linda; Charlier, Karine; Rebaubier, Helene
2011-09-01
Previous studies have suggested that Marine Isotope Stage (MIS) 13, recognized as atypical in many paleoclimate records, is marked by the development of anomalously strong summer monsoons in the northern tropical areas. To test this hypothesis, we performed a multi-proxy study on three marine records from the tropical Indian Ocean in order to reconstruct and analyse changes in summer Indian monsoon winds and precipitation during MIS 13. Our data confirm the existence of a low-salinity event during MIS 13 in the equatorial Indian Ocean, but we argue that this event should not be considered "atypical". Taking into account only the smaller precession is not sufficient to explain such a precipitation episode. However, when the larger obliquity is also considered in a more complete orbitally driven monsoon "model", this event can be successfully explained. In addition, our data suggest that intense summer monsoon winds, although not atypical in strength, prevailed during MIS 13 in the western Arabian Sea. These strong monsoon winds, transporting substantial moisture, together with the effects of insolation and the Eurasian ice sheet, are likely among the factors responsible for the intense monsoon precipitation signal recorded in Chinese loess, as suggested by model simulations.
A system and method for online high-resolution mapping of gastric slow-wave activity.
Bull, Simon H; O'Grady, Gregory; Du, Peng; Cheng, Leo K
2014-11-01
High-resolution (HR) mapping employs multielectrode arrays to achieve spatially detailed analyses of propagating bioelectrical events. A major limitation is that spatial analyses must currently be performed "off-line" (after experiments), compromising timely recording feedback and restricting experimental interventions. These problems motivated the development of a system and method for "online" HR mapping. HR gastric recordings were acquired and streamed to a novel software client. Algorithms were devised to filter data, identify slow-wave events, eliminate corrupt channels, and cluster activation events. A graphical user interface animated the data and plotted electrograms and maps. Results were compared against off-line methods. The online system analyzed 256-channel serosal recordings with no unexpected system terminations and a mean delay of 18 s. Activation time marking sensitivity was 0.92; positive predictive value was 0.93. Abnormal slow-wave patterns including conduction blocks, ectopic pacemaking, and colliding wave fronts were reliably identified. Compared to traditional analysis methods, online mapping produced comparable results, with equivalent coverage of 90% of electrodes, average RMS errors of less than 1 s, and correlation coefficients of activation maps of 0.99. Accurate slow-wave mapping was achieved in near real-time, enabling monitoring of recording quality and experimental interventions targeted to dysrhythmic onset. This work also advances the translation of HR mapping toward real-time clinical application.
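One of the pipeline steps named above, clustering activation events into individual slow waves, can be illustrated with a simple temporal-gap rule. This is a hypothetical simplification; the actual system's clustering would also exploit spatial adjacency across the electrode array:

```python
def cluster_activations(events, max_gap=2.0):
    """Group per-channel activation events into slow-wave clusters.

    `events` is a list of (time_s, channel) tuples. An activation within
    `max_gap` seconds of a cluster's most recent activation is assumed to
    belong to the same propagating wave; a larger gap starts a new wave.
    """
    clusters = []
    for t, channel in sorted(events):
        if clusters and t - clusters[-1][-1][0] <= max_gap:
            clusters[-1].append((t, channel))   # continue the current wave
        else:
            clusters.append([(t, channel)])     # start a new wave
    return clusters
```

Grouping in time order keeps the method cheap enough to run while data are streaming, which is what allows maps to be drawn with only seconds of delay.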
RHIC Abort Kicker Prefire Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Y.; Perlstein, S.
2014-07-07
In an attempt to discover any pattern to prefire events, abort prefire kicker data from 2007 to the present day have been recorded. With the 2014 operations concluding, this comprises 8 years of prefire data. Any activities that the Pulsed Power Group did to decrease prefire occurrences were recorded as well, but some information may be missing. The following information is a compilation of the research to date.
NASA Technical Reports Server (NTRS)
Watson, Gregg W.
2000-01-01
The Sea-Viewing Wide Field-of-view Sensor (SeaWiFS) has acquired 2.5 years of routine global chlorophyll observations from space. The mission was launched into a record El Nino event, which eventually gave way to one of the most intensive and longest-lasting La Nina events ever recorded. The SeaWiFS chlorophyll record captured the response of ocean phytoplankton to these significant events in the tropical Indo-Pacific basins, but also indicated significant interannual variability unrelated to the El Nino/La Nina events. This included large variability in the North Atlantic and Pacific basins, in the North Central and equatorial Atlantic, and milder patterns in the North Central Pacific. This SeaWiFS record was tracked with a coupled physical/biogeochemical/radiative model of the global oceans using near-real-time forcing data such as wind stresses, sea surface temperatures, and sea ice. This provided an opportunity to offer physically and biogeochemically meaningful explanations of the variability observed in the SeaWiFS data set, since the causal mechanisms and interrelationships of the model are completely understood. The coupled model was able to represent the seasonal distributions of chlorophyll during the SeaWiFS era, and was capable of differentiating among the widely different processes and dynamics occurring in the global oceans. The model was also reasonably successful in representing the interannual signal, especially when it was large, such as the El Nino and La Nina events in the tropical Pacific and Indian Oceans. The model provided different phytoplankton group responses for the different events in these regions: diatoms were predominant in the tropical Pacific during the La Nina, but other groups were predominant during El Nino. The opposite occurred in the tropical Indian Ocean. Both situations were due to the different responses of the basins to El Nino.
The interannual variability in the North Atlantic, which appeared in the SeaWiFS data as a decline in the spring/summer bloom in 1999 relative to 1998, was attributed in the model to a more slowly shoaling mixed layer, which allowed herbivore populations to keep pace with increasing phytoplankton populations. However, several aspects of the interannual cycle were not well represented by the model. Explanations ranged from inherent model deficiencies, to monthly averaging of forcing fields, to biases in SeaWiFS atmospheric correction procedures.
The use of administrative and other records for the analysis of internal migration.
1983-01-01
There are 5 main types of administrative records that are of potential use in the analysis of internal migration in Africa: 1) population registers, 2) electoral rolls, 3) school records, 4) labor or employment records, and 5) social security records. The population register provides legal identification for the individual and records his movements from 1 civil subdivision to another. The process of establishing a population register is not a simple one. All 5 of these records are incomplete, defective, and in most cases decentralized; yet, in spite of these limitations, administrative records are potential sources of migration data. Because of their incompleteness, major biases are likely to arise in their use. The 1st step is for National Statistical Services to assist in improving the coverage of events expected to be registered in any of these records. The 2nd step is to try to use the data through some form of ratio or regression estimation. If use is not made of the records for migration data, it is unlikely that the quality of the migration data in the records will ever improve.
How much do disasters cost? A comparison of disaster cost estimates in Australia
NASA Astrophysics Data System (ADS)
Ladds, Monique; Keating, Adriana; Handmer, John; Magee, Liam
2017-04-01
Extreme weather events in Australia are common, and a large proportion of the population is exposed to them. There is therefore great interest in how these events will impact Australia's society and economy, which requires understanding the current and historical impact of disasters. Despite global efforts to record and cost disaster impacts, no standardised method of collecting and recording data retrospectively yet exists. The lack of comparability in turn produces quite different analyses of economic impacts. This paper examines five examples of aggregate cost and relative impacts of natural disasters in Australia, and comparisons between them reveal significant data shortcomings. The reliability of data sources, and the methodology employed to analyse them, can have significant impacts on conclusions regarding the overall cost of disasters, the relative costs of different disaster types, and the distribution of costs across Australian states. We highlight difficulties with time series comparisons, further complicated by the interdependencies of the databases. We reiterate the need for consistent and comparable data collection and analysis in order to respond to the increasing frequency and severity of disasters in Australia.
Meteorite falls in China and some related human casualty events
NASA Technical Reports Server (NTRS)
Yau, Kevin; Weissman, Paul; Yeomans, Donald
1994-01-01
Statistics of witnessed and recovered meteorite falls found in Chinese historical texts for the period from 700 B.C. to A.D. 1920 are presented. Several notable features can be seen in the binned distribution as a function of time. An apparent decrease in the number of meteorite reports in the 18th century is observed. An excess of observed meteorite falls in the period from 1840 to 1880 seems to correspond to a similar excess in European data. A chi-square probability test suggests that the association between the two data sets is real. Records of human casualties and structural damage resulting from meteorite falls are also given. A calculation based on the number of casualty events in the Chinese meteorite records suggests that the probability of a meteorite striking a human is far greater than previous estimates. However, it is difficult to verify the accuracy of the reported casualty events.
Neural network pattern recognition of lingual-palatal pressure for automated detection of swallow.
Hadley, Aaron J; Krival, Kate R; Ridgel, Angela L; Hahn, Elizabeth C; Tyler, Dustin J
2015-04-01
We describe a novel device and method for real-time measurement of lingual-palatal pressure and automatic identification of the oral transfer phase of deglutition. Clinical measurement of the oral transport phase of swallowing is a complicated process requiring either placement of obstructive sensors or sitting within a fluoroscope or articulograph for recording. Existing detection algorithms distinguish oral events with EMG, sound, and pressure signals from the head and neck, but are imprecise and frequently result in false detection. We placed seven pressure sensors on a molded mouthpiece fitting over the upper teeth and hard palate and recorded pressure during a variety of swallow and non-swallow activities. Pressure measures and swallow times from 12 healthy and 7 Parkinson's subjects provided training data for a time-delay artificial neural network to categorize the recordings as swallow or non-swallow events. User-specific neural networks properly categorized 96 % of swallow and non-swallow events, while a generalized population-trained network was able to properly categorize 93 % of swallow and non-swallow events across all recordings. Lingual-palatal pressure signals are sufficient to selectively and specifically recognize the initiation of swallowing in healthy and dysphagic patients.
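As an illustration of the time-delay input format such a network consumes, the sketch below flattens the current and several previous frames of seven-sensor pressure readings into one feature vector per time step. The frame count and delay length are assumed values for illustration, not the parameters used in the study:

```python
def time_delay_windows(pressure, delay=5):
    """Build time-delay input vectors for a classifier.

    `pressure` is a list of frames, each a list of per-sensor readings
    (e.g., 7 lingual-palatal sensors). Each output vector concatenates
    the current frame and the `delay - 1` preceding frames, giving the
    network a short sliding history of the pressure pattern.
    """
    windows = []
    for t in range(delay - 1, len(pressure)):
        frame = []
        for k in range(delay):
            frame.extend(pressure[t - k])   # newest frame first
        windows.append(frame)
    return windows
```

A downstream classifier then labels each vector swallow or non-swallow; because every vector carries its own recent history, the network can key on the temporal pressure sequence of the oral transfer phase rather than a single instantaneous reading.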
Linking slope stability and climate change: the Nordfjord region, western Norway, case study
NASA Astrophysics Data System (ADS)
Vasskog, K.; Waldmann, N.; Ariztegui, D.; Simpson, G.; Støren, E.; Chapron, E.; Nesje, A.
2009-12-01
Valleys, lakes, and fjords are spectacular features of the Norwegian landscape, and their sedimentary records recall past climatic, environmental, and glacio-isostatic changes since the late glacial. A high-resolution multi-proxy study is being performed on three lakes in western Norway, combining different geophysical methods and sediment coring, with the aim of reconstructing paleoclimate and investigating how the frequency of hazardous events in this area has changed through time. Very high resolution seismic reflection profiling revealed a series of mass-wasting deposits. These events, which have also been studied in radiocarbon-dated cores, suggest a changing impact of slope instability on lake sedimentation since the late glacial. A specially tailored, physically based mathematical model allowed numerical simulation of one of these mass-wasting events and the related tsunami, which occurred during a devastating rock avalanche in 1936 that killed 74 people. The outcome has been validated against historical, marine, and terrestrial information, providing a model that can be applied to comparable basins at various temporal and geographical scales. Detailed sedimentological and geochemical studies of selected cores allow the sedimentary record to be characterized and each mass-wasting event to be disentangled. This combination of seismic, sedimentary, and geophysical data makes it possible to extend the record of mass-wasting events beyond historical times. The geophysical and coring data retrieved from these lakes are a unique trace of paleo-slope stability generated by isostatic rebound and climate change, thus providing a continuous archive of slope stability beyond the historical record. The results of this study provide valuable information about the impact of climate change on slope stability and source-to-sink processes.
IMPACTS OF CLIMATE-INDUCED CHANGES IN EXTREME EVENTS ON OZONE AND PARTICULATE MATTER AIR QUALITY
Historical data records of air pollution meteorology from multiple datasets will be compiled and analyzed to identify possible trends in extreme events. Changes in climate and air quality between 2010 and 2050 will be simulated with a suite of models. The consequential effe...
Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon
2016-01-01
This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing, while the previous annual records are used for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.
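The binary target (does next year's maximum magnitude exceed the region's median yearly maximum?) and a moving-average seismicity indicator of the kind described can be sketched as follows. This is a simplified illustration; the window length and exact indicator definitions in the paper may differ:

```python
def yearly_targets(max_mags_by_year):
    """Label each year 1 if its maximum magnitude exceeds the median of
    the region's yearly maxima, else 0 (the binary target described)."""
    mags = sorted(max_mags_by_year)
    n = len(mags)
    median = mags[n // 2] if n % 2 else (mags[n // 2 - 1] + mags[n // 2]) / 2
    return [1 if m > median else 0 for m in max_mags_by_year]

def moving_average_counts(counts, window=3):
    """Trailing moving average of yearly earthquake counts, one possible
    seismicity indicator (window length is an assumed parameter)."""
    out = []
    for i in range(len(counts)):
        chunk = counts[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

In the study's setup, features like these computed from past years feed a classifier, with the final five annual records per region held out for testing.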
NASA Technical Reports Server (NTRS)
Simpson, Amy A.; Wilson, Jennifer G.; Brown, Robert G.
2015-01-01
Data from multiple sources are needed to investigate lightning characteristics over differing terrain (on-shore vs. off-shore) by comparing differences in natural cloud-to-ground lightning behavior depending on the characteristics of the attachment medium. The KSC Lightning Research Database (KLRD) was created to reduce manual data entry time and aid research by combining information from various data sources into a single record for each unique lightning event of interest. The KLRD uses automatic data handling functions to import data from a lightning detection network and identify and record lightning events of interest. Additional automatic functions import data from NASA Buoy 41009 (located approximately 20 miles off the coast) and the KSC Electric Field Mill network, then match these electric field mill values to the corresponding lightning events. The KLRD calculates distances between each lightning event and the various electric field mills, aids in identifying the location type for each stroke (i.e., on-shore vs. off-shore, etc.), provides statistics on the number of strokes per flash, and produces customizable reports for quick retrieval and logical display of data. Data from February 2014 to date cover 48 unique storm dates with 2295 flashes containing 5700 strokes, of which 2612 are off-shore and 1003 are on-shore. The number of strokes per flash ranges from 1 to 22. The ratio of single-stroke flashes to flashes with subsequent strokes is 1.29 for off-shore strokes and 2.19 for on-shore strokes.
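The strokes-per-flash statistic implies a rule for grouping individual strokes into flashes. A minimal time-only sketch is below; the KLRD's actual criteria are not specified in the abstract (real networks also use spatial proximity and detection-network flash identifiers), so the 0.5 s inter-stroke window is an assumption:

```python
def group_strokes_into_flashes(stroke_times, max_interstroke=0.5):
    """Group stroke times (seconds) into flashes: a stroke arriving
    within `max_interstroke` s of the previous stroke joins the same
    flash; a longer gap starts a new flash."""
    flashes = []
    for t in sorted(stroke_times):
        if flashes and t - flashes[-1][-1] <= max_interstroke:
            flashes[-1].append(t)
        else:
            flashes.append([t])
    return flashes

def single_to_multi_ratio(flashes):
    """Ratio of single-stroke flashes to flashes with subsequent strokes,
    the statistic quoted above (1.29 off-shore vs. 2.19 on-shore)."""
    single = sum(1 for f in flashes if len(f) == 1)
    multi = sum(1 for f in flashes if len(f) > 1)
    return single / multi if multi else float("inf")
```

Applied per location type, a grouping like this yields both the strokes-per-flash distribution and the single-to-multi ratio reported in the abstract.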
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winans, J.
The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done through four new record types. These records are customized and are supported only by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with vanilla EPICS events and the associated event records; they are very different.
NASA Astrophysics Data System (ADS)
Vernon, F.; Tytell, J.; Hedlin, M. A. H.; Walker, K.; Busby, R.; Woodward, R.
2012-04-01
EarthScope's USArray Transportable Array (TA) network serves as a real-time monitoring and recording platform for both seismic and weather phenomena. To date, most of the approximately 500 TA stations have been retrofitted with VTI SCP1000 MEMS barometric pressure gauges capable of recording data at 1 sample per second (sps). Additionally, over 300 of the TA stations have also been retrofitted with Setra 278 barometric gauges and NCPA infrasound sensors capable of recording data at 1 and 40 sps. While individual seismic events have been successfully researched via the TA network, observations of powerful weather events by the TA network have yet to be embraced by the scientific community. This presentation will focus on case studies involving severe weather passage across portions of the TA network throughout 2011 in order to highlight its viability as a platform for real-time weather monitoring and research. It will also highlight the coupling of atmospheric signals into the seismic observations. Examples of gust-front passages and pressure couplets from severe thunderstorms will be presented, as will observations of multiple tornadoes that occurred in the spring of 2011. These data will demonstrate the overall viability of the TA network for monitoring severe weather events in real time.
Lithium in Brachiopods - proxy for seawater evolution?
NASA Astrophysics Data System (ADS)
Gaspers, Natalie; Magna, Tomas; Tomasovych, Adam; Henkel, Daniela
2017-04-01
Marine biogenic carbonates have the potential to serve as a proxy for the evolution of seawater chemistry. In order to compile a record of past and recent δ7Li in the oceans, foraminifera shells, scleractinian corals, and belemnites have been used. However, only the foraminifera-based record appears to accurately reflect the Li isotope composition of ocean water. At present, this record is available for the Cenozoic, with implications for major events during this period, including the K/T event [1]. A record for the entire Phanerozoic has not yet been obtained. In order to extend this record into the more distant past, the Li elemental and isotope systematics of brachiopods were investigated, because these marine animals were already present in Early Cambrian oceans and because their shell mineralogy (low-Mg calcite) makes them less sensitive to diagenesis-induced modifications. The preliminary data indicate species-, temperature-, and salinity-independent behavior of Li isotopes in brachiopod shells. Also, no vital effects have been observed for different shell parts. The consistent offset of -4‰ relative to modern seawater is in accordance with experimental data [2]. Further data are now being collected for Cenozoic specimens to more rigorously test brachiopods as possible archives of past seawater chemistry in comparison to the existing foraminiferal records. [1] Misra & Froelich (2012) Science 335, 818-823 [2] Marriott et al. (2004) Chem Geol 212, 5-15
Yasukawa, Kazutaka; Nakamura, Kentaro; Fujinaga, Koichiro; Ikehara, Minoru; Kato, Yasuhiro
2017-09-12
Multiple transient global warming events occurred during the early Palaeogene. Although these events, called hyperthermals, have been reported from around the globe, geologic records for the Indian Ocean are limited. In addition, the recovery processes from relatively modest hyperthermals are less well constrained than those from the most severe and best-studied hothouse event, the Palaeocene-Eocene Thermal Maximum. In this study, we constructed a new, high-resolution geochemical dataset of deep-sea sediments clearly recording multiple Eocene hyperthermals in the Indian Ocean. We then statistically analysed the high-dimensional data matrix and extracted independent components corresponding to the biogeochemical responses to the hyperthermals. A productivity feedback commonly operates in the recovery phases of the hyperthermals, efficiently sequestering the excess carbon via an enhanced biological pump, regardless of the magnitude of the events. Meanwhile, this negative feedback is independent of the nannoplankton assemblage changes generally recognised in relatively large environmental perturbations.
Mazzali, Cristina; Paganoni, Anna Maria; Ieva, Francesca; Masella, Cristina; Maistrello, Mauro; Agostoni, Ornella; Scalvini, Simonetta; Frigerio, Maria
2016-07-08
Administrative data are increasingly used in healthcare research. However, in order to avoid biases, their use requires careful study planning. This paper describes the methodological principles and criteria used in a study on the epidemiology, outcomes and process of care of patients hospitalized for heart failure (HF) in the largest Italian region, from 2000 to 2012. Data were extracted from the administrative data warehouse of the healthcare system of Lombardy, Italy. Hospital discharge forms with HF-related diagnosis codes were the basis for identifying HF hospitalizations as clinical events, or episodes. In patients experiencing at least one HF event, hospitalizations for any cause, outpatient services utilization, and drug prescriptions were also analyzed. A total of 701,701 heart failure events involving 371,766 patients were recorded from 2000 to 2012. Once all the healthcare services provided to these patients after the first HF event had been joined together, the study database totalled about 91 million records. The principles, criteria and practical tips used to minimize errors and characterize relevant subgroups are described. The methodology of this study could represent the basis for future research and could be applied in similar studies concerning epidemiology, trend analysis, and healthcare resource utilization.
Tools for educational access to seismic data
NASA Astrophysics Data System (ADS)
Taber, J. J.; Welti, R.; Bravo, T. K.; Hubenthal, M.; Frechette, K.
2017-12-01
Student engagement can be increased both by providing easy access to real data and by addressing newsworthy events such as recent large earthquakes. IRIS EPO has a suite of access and visualization tools that can be used for such engagement, including a set of three tools that allow students to explore global seismicity, use seismic data to determine Earth structure, and view and analyze near-real-time ground motion data in the classroom. These tools are linked to online lessons that are designed for use in middle school through introductory undergraduate classes. The IRIS Earthquake Browser (IEB) allows discovery of key aspects of plate tectonics, earthquake locations (in pseudo 3D), and seismicity rates and patterns. IEB quickly displays up to 20,000 seismic events over up to 30 years, making it one of the most responsive, practical ways to visualize historical seismicity in a browser. Maps are bookmarkable and preserve state, meaning IEB map links can be shared or worked into a lesson plan. The Global Seismogram Plotter automatically creates visually clear, tablet-friendly seismic record sections from selected large earthquakes; these can also be printed for use in a classroom without computers. The plots are designed to be appropriate for use with no parameters to set, but users can also modify the plots, such as by including a recording station near a chosen location. A guided exercise is provided in which students use the record section to discover the diameter of Earth's outer core. Students can pick and compare phase arrival times onscreen, which is key to performing the exercise. A companion station map shows station locations and further information and is linked to the record section. jAmaSeis displays seismic data in real time from either a local instrument and/or from remote seismic stations that stream data using standard seismic data protocols, and can be used in the classroom or as a public display. 
Users can filter data, fit a seismogram to travel time curves, triangulate event epicenters on a globe, estimate event magnitudes, and generate images showing seismograms and corresponding calculations. All three tools access seismic databases curated by IRIS Data Services; jAmaSeis can also access data from non-IRIS sources.
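The epicenter triangulation these lessons walk students through rests on the classic S-minus-P calculation: the delay between the P and S arrivals at one station fixes the epicentral distance, and distance circles from three stations intersect at the epicenter. A minimal sketch, using illustrative average crustal velocities (not values taken from the IRIS tools):

```python
# Hypothetical illustration of the S-minus-P distance estimate behind
# single-station triangulation exercises. VP and VS are assumed,
# illustrative crustal averages, not parameters of any IRIS tool.

VP = 6.0  # assumed P-wave speed, km/s
VS = 3.5  # assumed S-wave speed, km/s

def sp_distance(t_p: float, t_s: float, vp: float = VP, vs: float = VS) -> float:
    """Epicentral distance (km) from P and S arrival times (s) at one station.

    d/vs - d/vp = t_s - t_p  =>  d = (t_s - t_p) * vp * vs / (vp - vs)
    """
    if t_s <= t_p:
        raise ValueError("S must arrive after P")
    return (t_s - t_p) * vp * vs / (vp - vs)

# With these velocities, a 20 s S-P delay maps to 168 km.
print(sp_distance(0.0, 20.0))  # 168.0
```

Distances computed this way at two more stations give two more circles, and their common intersection is the epicenter, which is what the globe-based triangulation in jAmaSeis automates.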
Second Quarter Hanford Seismic Report for Fiscal Year 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.
2010-06-30
The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 90 local earthquakes during the second quarter of FY 2010. Eighty-one of these earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this quarter were a continuation of the swarm events observed during the 2009 and 2010 fiscal years and reported in previous quarterly and annual reports (Rohay et al., 2009a, 2009b, 2009c, and 2010). Most of the events were considered minor (coda-length magnitude [Mc] less than 1.0), with only one event in the 2.0-3.0 range; the maximum magnitude event (3.0 Mc) occurred on February 4, 2010, at a depth of 2.4 km. The average depth of the Wooded Island events during the quarter was 1.6 km, with a maximum depth estimated at 3.5 km. This placed the Wooded Island events within the Columbia River Basalt Group (CRBG). The low magnitude of the Wooded Island events has made them undetectable to all but local area residents. 
The Hanford Strong Motion Accelerometer (SMA) network was triggered several times by these events, and the SMA recordings are discussed in Section 6.0. During the last year, some Hanford employees working within a few miles of the swarm area and individuals living directly across the Columbia River from the swarm center have reported feeling many of the larger magnitude events. Similar earthquake swarms were recorded near this same location in 1970, 1975 and 1988, but without SMA readings or satellite imagery. Prior to the 1970s, earthquake swarms may have occurred at this location or elsewhere in the Columbia Basin, but equipment was not in place to record those events. The Wooded Island swarm, due to its location and the limited magnitude of the events, does not appear to pose any significant risk to Hanford waste storage facilities. Since swarms of the past did not intensify in magnitude, seismologists do not expect that these events will persist or increase in intensity. However, Pacific Northwest National Laboratory (PNNL) will continue to monitor the activity. Outside of the Wooded Island swarm, nine earthquakes were recorded: seven minor events plus two events with magnitudes less than 2.0 Mc. Two earthquakes were located at shallow depths (less than 4 km), three earthquakes at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and four earthquakes at depths greater than 9 km, within the basement. Geographically, six earthquakes were located in known swarm areas and three earthquakes were classified as random events.
49 CFR 563.7 - Data elements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 6 2011-10-01 2011-10-01 false Data elements. 563.7 Section 563.7 Transportation..., DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.7 Data elements. Link to an amendment published at 76 FR 47486, Aug. 5, 2011. (a) Data elements required for all vehicles. Each vehicle equipped with an...
49 CFR 563.12 - Data retrieval tools.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 6 2011-10-01 2011-10-01 false Data retrieval tools. 563.12 Section 563.12... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.12 Data retrieval tools. Each... tool(s) is commercially available that is capable of accessing and retrieving the data stored in the...
NASA Astrophysics Data System (ADS)
Urban, F. E.; Reynolds, R. L.; Neff, J. C.; Fernandez, D. P.; Reheis, M. C.; Goldstein, H.; Grote, E.; Landry, C.
2012-12-01
Improved measurement and observation of dust emission and deposition in the American West would advance understanding of (1) landscape conditions that promote or suppress dust emission, (2) dynamics of dryland and montane ecosystems, (3) premature melting of snow cover that provides critical water supplies, and (4) possible effects of dust on human health. Such understanding can be applied to issues of land management and water-resource management, as well as to the safety and well-being of urban and rural inhabitants. We have recently expanded the scope of particulate measurement in the Upper Colorado River basin through the establishment of total-suspended-particulate (TSP) measurement stations located in Utah and Colorado with bi-weekly data (filter) collection, along with protocols for characterizing dust-on-snow (DOS) layers in the Colorado mountains. A sub-network of high-resolution digital cameras has been co-located with several of the TSP stations, as well as at other strategic locations. These real-time regional dust-event detection cameras are internet-based and collect digital imagery every 6-15 minutes. Measurements of meteorological conditions to support these collections and observations are provided partly by CLIM-MET stations, four of which were deployed in 1998 in the Canyonlands (Utah) region. These stations provide continuous, near real-time records of the complex interaction of wind, precipitation, and vegetation, as well as dust emission and deposition, in different land-use settings. The complementary datasets of dust measurement and observation enable tracking of individual regional dust events. As an example, the first DOS event of water year 2012 (Nov 5, 2011), as documented at Senator Beck Basin, near Silverton, Colorado, was also recorded by the camera at Island-in-the-Sky (200 km to the northwest), as well as in aeolian activity and wind data from the Dugout Ranch CLIM-MET station (170 km to the west-northwest). 
At these sites, strong winds and the presence of dense dust preceded precipitation. Similar conditions and results were recorded in many subsequent water year 2012 DOS events, with complementary quantification in TSP dust-flux records. Spring 2012 included several intense dry (no associated precipitation) regional dust events that occurred after snowmelt. These events during May 25-26, especially, are clearly evident in the imagery, TSP, and local meteorological data.
Integrated database for rapid mass movements in Norway
NASA Astrophysics Data System (ADS)
Jaedicke, C.; Lied, K.; Kronholm, K.
2009-03-01
Rapid gravitational slope mass movements include all kinds of short-term relocation of geological material, snow or ice. Traditionally, information about such events is collected separately in different databases covering selected geographical regions and types of movement. In Norway the terrain is susceptible to all types of rapid gravitational slope mass movements, ranging from single rocks hitting roads and houses to large snow avalanches and rock slides where entire mountainsides collapse into fjords, creating flood waves and endangering large areas. In addition, quick clay slides occur in desalinated marine sediments in South Eastern and Mid Norway. For the authorities and inhabitants of endangered areas, the type of threat is of minor importance, and mitigation measures have to consider several types of rapid mass movements simultaneously. An integrated national database for all types of rapid mass movements, built around individual events, has been established. Only three data entries are mandatory: time, location and type of movement. The remaining optional parameters enable recording of detailed information about the terrain, materials involved and damage caused. Pictures, movies and other documentation can be uploaded into the database. A web-based graphical user interface has been developed allowing new events to be entered, as well as editing and querying of all events. Integration of the database into a GIS system is currently under development. Datasets from various national sources, such as the road authorities and the Geological Survey of Norway, were imported into the database. Today, the database contains 33,000 rapid mass movement events from the last five hundred years covering the entire country. A first analysis of the data shows that the most frequent types of recorded rapid mass movements are rock slides and snow avalanches, followed by debris slides. 
Most events are recorded in the steep fjord terrain of the Norwegian west coast, but major events are recorded all over the country. Snow avalanches account for most fatalities, while large rock slides causing flood waves and huge quick clay slides are the most damaging individual events in terms of damage to infrastructure and property and in causing multiple fatalities. The quality of the data is strongly influenced by the personal engagement of local observers and by varying observation routines. This database is a unique source for statistical analyses, including risk analysis and studies of the relation between rapid mass movements and climate. The database of rapid mass movement events will also facilitate validation of national hazard and risk maps.
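The data model described above (three mandatory fields, everything else optional) can be sketched as a simple record type. The field names below are invented for illustration and are not the Norwegian database's actual schema:

```python
# Hypothetical sketch of an event record with the abstract's three
# mandatory entries (time, location, type of movement) and optional
# detail fields; all names are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class MassMovementEvent:
    time: datetime                      # mandatory
    location: tuple[float, float]       # mandatory: (lat, lon)
    movement_type: str                  # mandatory: e.g. "snow avalanche"
    volume_m3: Optional[float] = None   # optional terrain/material detail
    fatalities: Optional[int] = None    # optional damage detail
    damage_notes: Optional[str] = None
    attachments: list[str] = field(default_factory=list)  # pictures, movies

# Only the three mandatory fields are needed to register an event.
event = MassMovementEvent(
    time=datetime(1996, 2, 1),
    location=(62.1, 7.2),
    movement_type="rock slide",
)
print(event.movement_type)  # rock slide
```

Keeping the mandatory core this small is what allows historical events, for which only sparse information survives, to coexist in one database with richly documented modern ones.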
International seismological data center: Preparation of an experimental data base
NASA Astrophysics Data System (ADS)
Israelson, H.; Jeppsson, I.; Barkeby, G.
1980-11-01
An experimental data base compiled for a temporary international seismological data center is presented. The data include recordings and measurements from 60 globally distributed seismological stations for a one-week period. Data for the definition, location and magnitude estimation of seismic events are examined. Original digital records from 11 seismological research observatories around the world are also analyzed to provide additional identification data. It is shown that the routine measurement and reporting of data at seismological stations, as proposed by the Seismic Experts Group of the UN Committee on Disarmament, is an onerous task that goes far beyond current seismological practice.
Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko
2016-07-01
A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically from the ratio of its stacked envelope peak with respect to that of its parent. Since only a small number of parent events require interactive analysis, such as picking P- and S-wave arrivals, the MFA approach offers the potential for a significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. 
Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation of the induced microseismicity and novel insights into dynamic rupture processes based on the average temporal (foreshock-aftershock) relationship of child events to parents.
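The detection core described above, cross-correlating continuous data against a well-located parent template, can be illustrated with a single-channel toy version. The real MFA stacks normalized cross-correlations over many channels and components; this sketch and its threshold are purely illustrative:

```python
# Toy single-channel sketch of template matching: slide a "parent"
# waveform over continuous data and flag windows whose normalized
# cross-correlation exceeds a threshold. Illustrative only; not the
# authors' multichannel, stacked implementation.
import math

def normxcorr(template, trace):
    """Normalized cross-correlation of template against each window of trace."""
    n = len(template)
    t_mean = sum(template) / n
    t0 = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t0))
    out = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        w_mean = sum(w) / n
        w0 = [x - w_mean for x in w]
        w_norm = math.sqrt(sum(x * x for x in w0))
        num = sum(a * b for a, b in zip(t0, w0))
        out.append(num / (t_norm * w_norm) if t_norm and w_norm else 0.0)
    return out

def detect(template, trace, threshold=0.8):
    """Sample indices where a 'child' event matches the parent template."""
    cc = normxcorr(template, trace)
    return [i for i, c in enumerate(cc) if c >= threshold]

parent = [0.0, 1.0, -1.0, 0.5]
trace = [0.0] * 10 + [0.0, 2.0, -2.0, 1.0] + [0.0] * 10  # scaled copy at i=10
print(detect(parent, trace))  # [10]
```

Because the correlation is amplitude-normalized, the scaled-down copy is detected with a coefficient of 1.0; in the MFA the amplitude ratio between child and parent envelopes is then what yields the relative magnitude.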
Third Quarter Hanford Seismic Report for Fiscal Year 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.
2009-09-30
The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 771 local earthquakes during the third quarter of FY 2009. Nearly all of these earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this quarter are a continuation of the swarm events observed during the January-March 2009 time period and reported in the previous quarterly report (Rohay et al., 2009). The frequency of Wooded Island events has subsided, with 16 events recorded during June 2009. Most of the events were considered minor (coda-length magnitude [Mc] less than 1.0), with 25 events in the 2.0-3.0 range. The estimated depths of the Wooded Island events are shallow (averaging less than 1.0 km deep), with a maximum depth estimated at 2.2 km. This places the Wooded Island events within the Columbia River Basalt Group (CRBG). The low magnitude of the Wooded Island events has made them undetectable to all but local area residents. 
However, some Hanford employees working within a few miles of the area of highest activity and individuals living in homes directly across the Columbia River from the swarm center have reported feeling many of the larger magnitude events. The Hanford Strong Motion Accelerometer (SMA) network was triggered numerous times by the Wooded Island swarm events. The maximum acceleration value recorded by the SMA network was approximately one-third of the reportable action level for Hanford facilities (2% g), and no action was required. The swarming is likely due to pressure that has built up, cracking the brittle basalt layers within the Columbia River Basalt Group (CRBG). Similar earthquake swarms have been recorded near this same location in 1970, 1975 and 1988. Prior to the 1970s, swarming may have occurred, but equipment was not in place to record those events. Quakes of this limited magnitude do not pose a risk to Hanford cleanup efforts or waste storage facilities. Since swarms of the past did not intensify in magnitude, seismologists do not expect that these events will increase in intensity. However, Pacific Northwest National Laboratory (PNNL) will continue to monitor the activity.
Heinrich events and sea level changes: records from uplifted coral terraces and marginal seas
NASA Astrophysics Data System (ADS)
Yokoyama, Y.; Esat, T. M.; Suga, H.; Obrochta, S.; Ohkouchi, N.
2017-12-01
Repeated major ice discharge events spaced every ca. 7,000 years during the last ice age were first detected in deep-sea sediments from the North Atlantic. Characterized as lithic layers, these Heinrich events (Heinrich, 1988 QR) correspond to rapid climate changes attributed to weakened ocean circulation (e.g., Broecker, 1994 Nature; Alley, 1998 Nature), as shown by a number of different proxies. A better understanding of the overall picture of Heinrich events would benefit from determining the total amount of ice involved in each event, which is still under debate. Sea level records are the most direct means for that purpose, and uranium-series dated corals can constrain the timing precisely. However, averaged global sea level during the time of interest was around -70 m, hindering study from tectonically stable regions. Using uplifted coral terraces that extend 80 km along the Huon Peninsula, Papua New Guinea, the magnitude of sea level change during Heinrich events was successfully reconstructed (Yokoyama et al., 2001 EPSL; Chappell et al., 1996 EPSL; Cutler et al., 2003). The H3 and H5 events are also well correlated with continuous sea level reconstructions using Red Sea oxygen isotope records (Siddall et al., 2003 Nature; Yokoyama and Esat, 2011 Oceanography). Global ice sheet growth after 30 ka complicates interpretation of the Huon Peninsula record. However, oxygen isotope data from the Japan Sea, a restricted marginal sea with a shallow sill depth similar to that of the Red Sea, clearly capture the episode of H2 sea level change. The timing of these sea level excursions correlates well with the DSDP Site 609 detrital layers that are anchored in the latest Greenland ice core chronology (Obrochta et al., 2012 QSR). In the presentation, Antarctic ice sheet behavior during the H2 event will also be discussed using oxygen isotope records from marginal seas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dighe, Kalpak Arvind
Several 40+ hour data records obtained in Oct 2010 from the Los Alamos Portable Pulser Facility (LAPP) operational clocks show variations of ~27 ns. Several 16+ hour data records obtained in Aug 2010 from non-operational clocks, like those used operationally from 2005 to the present, show variations of ~35 ns. SLRE variability is xxx +/- yyy sec (std dev). SLRE occasionally show unusual events such as those discussed by Pongratz. We will continue to study and monitor.
Demanuele, Charmaine; James, Christopher J; Sonuga-Barke, Edmund Js
2007-12-10
It has been acknowledged that the frequency spectrum of measured electromagnetic (EM) brain signals shows a decrease in power with increasing frequency. This spectral behaviour may lead to difficulty in distinguishing event-related peaks from ongoing brain activity in the electro- and magnetoencephalographic (EEG and MEG) signal spectra. This can become an issue especially in the analysis of low frequency oscillations (LFOs) - below 0.5 Hz - which are currently being observed in signal recordings linked with specific pathologies such as epileptic seizures or attention deficit hyperactivity disorder (ADHD), as well as in sleep studies. In this work we propose a simple method that can be used to compensate for this 1/f trend, thereby achieving spectral normalisation. This method involves filtering the raw measured EM signal through a differentiator prior to further data analysis. Applying the proposed method to various exemplary datasets, including very low frequency EEG recordings, epileptic seizure recordings, MEG data and evoked response data, showed that this compensating procedure provides a flat spectral base onto which event-related peaks can be clearly observed. The findings suggest that the proposed filter is a useful tool for the analysis of physiological data, especially in revealing very low frequency peaks which may otherwise be obscured by the 1/f spectral activity inherent in EEG/MEG recordings.
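The pre-processing step described above is just a differentiator applied to the raw signal. A first-difference filter is the simplest discrete sketch: its gain grows with frequency, counteracting the falling spectral trend. This toy version on plain Python lists is illustrative, not the authors' implementation:

```python
# Minimal sketch of a discrete differentiator: the first difference
# attenuates slow trends and emphasizes fast changes, flattening a
# falling spectrum. Illustrative only.
def differentiate(x):
    """First-difference filter: y[n] = x[n] - x[n-1]."""
    return [b - a for a, b in zip(x, x[1:])]

# A slow linear drift is reduced to a constant by the filter, while
# higher-frequency wiggles pass through with relatively boosted amplitude.
drift = [2 * n for n in range(6)]   # 0, 2, 4, 6, 8, 10
print(differentiate(drift))         # [2, 2, 2, 2, 2]
```

In practice the flattening effect, and any phase consequences of the filter, would need checking against the recording's sampling rate and the band of interest.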
[Assessing the economic impact of adverse events in Spanish hospitals by using administrative data].
Allué, Natalia; Chiarello, Pietro; Bernal Delgado, Enrique; Castells, Xavier; Giraldo, Priscila; Martínez, Natalia; Sarsanedas, Eugenia; Cots, Francesc
2014-01-01
To evaluate the incidence and costs of adverse events registered in an administrative dataset in Spanish hospitals from 2008 to 2010, a retrospective study was carried out that estimated the incremental cost per episode depending on the presence of adverse events. Costs were obtained from the database of the Spanish Network of Hospital Costs. This database contains data from 12 hospitals that maintain cost-per-patient records based on activity data and clinical records. Adverse events were identified through the Patient Safety Indicators (validated in the Spanish Health System) created by the Agency for Healthcare Research and Quality, together with indicators from the EuroDRG European project. This study included 245,320 episodes with a total cost of €1,308,791,871. Approximately 17,000 patients (6.8%) experienced an adverse event, representing 16.2% of the total cost. Adverse events, adjusted by diagnosis-related groups, added a mean incremental cost of between €5,260 and €11,905. Six of the 10 adverse events with the highest incremental cost were related to surgical interventions. The total incremental cost of adverse events was €88,268,906, amounting to an additional 6.7% of total health expenditure. Assessment of the impact of adverse events revealed that these episodes represent significant costs that could be reduced by improving the quality and safety of the Spanish Health System. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.
NASA Astrophysics Data System (ADS)
Vafadari, A.; Philip, G.; Jennings, R.
2017-08-01
In recent decades, in response to an increased focus on disastrous events ranging from armed conflict to natural hazards that impact cultural heritage, a need has emerged for methodologies and approaches to better manage the effects of disasters on cultural heritage. This paper presents the approaches used in the development of a Historic Environment Record (HER) for Syria. It describes the requirements and methodologies used for systematic emergency recording and assessment of cultural heritage. It also presents the type of information that needs to be recorded in the aftermath of a disaster to assess the scale of damage and destruction. Started as a project at Durham University, the database is now being developed as part of the EAMENA (Endangered Archaeology in the Middle East and North Africa) project. The core dataset incorporates information and data from archaeological surveys undertaken in Syria by research projects in recent decades and began life as a development of the Shirīn initiative. The focus of this project is to provide a tool not only for the recording and inventory of sites and monuments, but also to record damage and threats, identify their causes, and assess their magnitude. It will also record and measure significance in order to prioritize emergency and preservation responses. The database aims to set procedures for carrying out systematic rapid condition assessment (to record damage) and risk assessment (to record threat and level of risk) of heritage places, on the basis of both on-the-ground and remote assessment. Given the large number of heritage properties damaged by conflict, the implementation of rapid assessment methods to quickly identify and record level of damage and condition is essential, as it will provide the evidence to support effective prioritization of efforts and resources, and decisions on the appropriate levels of intervention and methods of treatment. 
The predefined data entry categories, use of a data standard, and systematic methods of assessment will ensure that different users choose from the same prefixed data entry and measurement inputs in order to allow for consistent and comparable assessments across different sites and regions. Given the general lack of appropriate emergency response and assessment databases, this system could also be applied in other locations facing similar threats and damage from conflict or natural disasters.
The Role of Sea Ice for Vascular Plant Dispersal in the Arctic
NASA Astrophysics Data System (ADS)
Geirsdottir, A.; Alsos, I. G.; Seidenkrantz, M. S.; Bennike, O.; Kirchhefer, A.; Ehrich, D.
2015-12-01
Plant species adapted to arctic environments are expected to go extinct at their southern margins due to climate warming whereas they may find suitable habitats on arctic islands if they are able to disperse there. Analyses of species distribution and phylogenetic data indicate both that the frequency of dispersal events is higher in the arctic than in other regions, and that the dispersal routes often follow the routes of sea surface currents. Thus, it has been hypothesised that sea ice has played a central role in Holocene colonisation of arctic islands. Here we compile data on the first Holocene occurrence of species in East Greenland, Iceland, the Faroe Islands, and Svalbard. We then combine these records with interpretations of dispersal routes inferred from genetic data and data on geographical distributions, reconstructions of Holocene sea ice extent, and records of driftwood to evaluate the potential role sea ice has played in past colonisation events.
Muon and neutron observations in connection with the corotating interaction regions
NASA Astrophysics Data System (ADS)
da Silva, M. R.; Dal Lago, A.; Echer, E.; de Lucas, A.; Gonzalez, W. D.; Schuch, N. J.; Munakata, K.; Vieira, L. E. A.; Guarnieri, F. L.
Ground cosmic ray observations are used for studying several kinds of interplanetary structures. The cosmic ray data have different responses to each kind of interplanetary structure. The objective of this article is to study cosmic ray muon and neutron signatures due to the passage of corotating interaction regions (CIRs) in the interplanetary medium, and to identify the signatures in the cosmic ray data due to these events. The cosmic ray muon data used in this work were recorded by the multidirectional muon detector installed at INPE's Observatório Espacial do Sul (OES/CRSPE/INPE-MCT), in São Martinho da Serra, RS (Brazil), and the neutron data were recorded by the neutron monitor installed in Newark (USA). The CIR events were selected in the period from 2001 to 2004. CIRs clearly affect the cosmic ray density in the interplanetary medium in the Earth's vicinity, where the magnetic field plays an important role.
NASA Astrophysics Data System (ADS)
Panning, M. P.; Banerdt, W. B.; Beucler, E.; Blanchette-Guertin, J. F.; Boese, M.; Clinton, J. F.; Drilleau, M.; James, S. R.; Kawamura, T.; Khan, A.; Lognonne, P. H.; Mocquet, A.; van Driel, M.
2015-12-01
An important challenge for the upcoming InSight mission to Mars, which will deliver a broadband seismic station to Mars along with other geophysical instruments in 2016, is to accurately determine event locations with the use of a single station. Locations are critical for the primary objective of the mission, determining the internal structure of Mars, as well as a secondary objective of measuring the activity and distribution of seismic events. As part of the mission planning process, a variety of techniques have been explored for the location of marsquakes and the inversion of structure, and preliminary procedures and software are already under development as part of the InSight Mars Quake and Mars Structure Services. One proposed method, involving the use of recordings of multiple-orbit surface waves, has already been tested with synthetic data and Earth recordings. This method has the strength of not requiring an a priori velocity model of Mars for quake location, but it will only be practical for larger events. For smaller events, where only first-orbit surface waves and body waves are observable, other methods are required. In this study, we implement a transdimensional Bayesian inversion approach to simultaneously invert for basic velocity structure and location parameters (epicentral distance and origin time) using only measurements of body wave arrival times and the dispersion of first-orbit surface waves. The method is tested with synthetic data with expected Mars noise, and with Earth data, for single events and groups of events, and evaluated for errors in both location and structural determination, as well as tradeoffs between resolvable parameters and the effect of 3D crustal variations.
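The simplest single-station location idea underlying such body-wave approaches can be sketched with the classic S-P differential time: for an assumed uniform-velocity medium, the S-P delay scales with epicentral distance, and the origin time follows from the P arrival. The velocities below are illustrative placeholders, not an InSight velocity model, and the real method inverts for structure jointly rather than assuming it:

```python
def locate_from_sp(t_p, t_s, vp=7.0, vs=4.0):
    """Epicentral distance (km) and origin time (s) from a single-station
    S-P differential time, assuming straight rays and uniform,
    illustrative velocities vp and vs in km/s."""
    dt = t_s - t_p
    distance_km = dt * vp * vs / (vp - vs)
    origin_time = t_p - distance_km / vp
    return distance_km, origin_time

# A 30 s S-P delay with these velocities implies ~280 km distance
dist, t0 = locate_from_sp(t_p=100.0, t_s=130.0)
```

The transdimensional Bayesian approach in the abstract generalizes this by treating the velocity model itself as unknown and sampling over both structure and location.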
A Marine Record of Holocene Climate Events in Tropical South America
NASA Astrophysics Data System (ADS)
Haug, G. H.; Günther, D.; Hughen, K. A.; Peterson, L. C.; Röhl, U.
2002-12-01
Metal concentration data (Ti, Fe) from the anoxic Cariaco Basin off the Venezuelan coast record, with subdecadal to seasonal resolution, variations in the hydrological cycle over tropical South America during the last 14 ka. Following a dry Younger Dryas, a period of increased precipitation and riverine discharge occurred during the Holocene 'thermal maximum'. Since ~5.4 ka, a trend towards drier conditions is evident from the data, with high-amplitude fluctuations and precipitation minima during the time interval 3.8 to 2.8 ka and during the 'Little Ice Age'. A pronounced increase in precipitation coincides with the phase sometimes referred to as the 'Medieval Warm Period'. These regional changes in precipitation are best explained by shifts in the mean latitude of the Atlantic Intertropical Convergence Zone (ITCZ), potentially driven by Pacific-based climate variability. The variations recorded in Cariaco Basin sediments coincide with events in societal evolution that have previously been suggested to be motivated by environmental change. Regionally, the Cariaco record supports the notion that the collapse of the Maya civilization between 800 and 1000 AD coincided with an extended period of drier conditions, implying that the rapid growth of Mayan culture from 600 to 800 AD may have resulted in a population operating at the fringes of the environment's carrying capacity. The Cariaco Basin record also hints at tropical climate events similar in timing to high-latitude changes in the North Atlantic often invoked as pivotal to societal developments in Europe.
[Adverse events in general surgery. A prospective analysis of 13,950 consecutive patients].
Rebasa, Pere; Mora, Laura; Vallverdú, Helena; Luna, Alexis; Montmany, Sandra; Romaguera, Andreu; Navarro, Salvador
2011-11-01
Adverse event (AE) rates in General Surgery vary, according to different authors and recording methods, between 2% and 30%. Six years ago we designed a prospective AE recording system to change the patient safety culture in our Department. We present the results of this work after a 6-year follow-up. The AEs, sequelae and health care errors in a University Hospital surgery department were recorded. An analysis of each recorded incident was performed by a reviewer. The data were entered into a database for rapid access and consultation. The results were routinely presented in Departmental morbidity-mortality sessions. The 13,950 patients suffered a total of 11,254 AEs, which affected 5142 of them (36.9% of admissions). A total of 920 patients were subjected to at least one health care error (6.6% of admissions), meaning that 6.6% of our patients suffered an avoidable AE. The overall mortality at 5 years in our department was 2.72% (380 deaths). An adverse event was implicated in the death of the patient in 180 cases (1.29% of admissions). In 49 cases (0.35% of admissions), mortality could be attributed to an avoidable AE. After 6 years there is a trend towards an increasingly lower incidence of errors. The exhaustive and prospective recording of AEs leads to changes in the patient safety culture of a Surgery Department and helps decrease the incidence of health care errors. Copyright © 2011 AEC. Published by Elsevier Espana. All rights reserved.
2012-01-01
Background Primary care records from the UK have frequently been used to identify episodes of upper gastrointestinal bleeding in studies of drug toxicity because of their comprehensive population coverage and longitudinal recording of prescriptions and diagnoses. Recent linkage within England of primary and secondary care data has augmented these data, but the timing and coding of concurrent events, and how the definition of events in linked data affects occurrence and 28-day mortality, are not known. Methods We used the recently linked English Hospital Episodes Statistics (HES) and General Practice Research Database, 1997–2010, to define events by: a specific upper gastrointestinal bleed code in either dataset, a specific bleed code in both datasets, or a less specific but plausible code from the linked dataset. Results This approach resulted in 81% of secondary care defined bleeds having a corresponding plausible code within 2 months in primary care. However, only 62% of primary care defined bleeds had a corresponding plausible HES admission within 2 months. The more restrictive and specific case definitions excluded severe events and almost halved the 28-day case fatality when compared to broader and more sensitive definitions. Conclusions Restrictive definitions of gastrointestinal bleeding in linked datasets fail to capture the full heterogeneity in coding possible following complex clinical events. Conversely, too broad a definition in primary care introduces events not severe enough to warrant hospital admission. Ignoring these issues may unwittingly introduce selection bias into a study's results. PMID:23148590
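The linkage rule at the heart of this abstract — checking whether an event coded in one dataset has a plausible counterpart within about two months in the other — can be sketched as a simple date-window match. The 61-day window below is an illustrative approximation of "2 months", not the study's exact criterion:

```python
from datetime import date

def match_events(primary_dates, secondary_dates, window_days=61):
    """For each primary-care event date, report whether any
    secondary-care event falls within +/- window_days of it
    (a rough two-month linkage window, for illustration)."""
    matched = []
    for p in primary_dates:
        has_match = any(abs((p - s).days) <= window_days
                        for s in secondary_dates)
        matched.append(has_match)
    return matched

# Toy example: one GP-coded bleed has a HES admission 19 days later,
# the other has no hospital counterpart at all
gp = [date(2005, 3, 1), date(2007, 8, 15)]
hes = [date(2005, 3, 20)]
flags = match_events(gp, hes)
```

The fraction of `True` flags over many records corresponds to the 62%/81% concordance figures the abstract reports.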
NASA Astrophysics Data System (ADS)
Brugman, K. K.; Till, C. B.
2017-12-01
The goal of our research is to quantify the time period between events in the magma chamber and eruption for the Scaup Lake rhyolite lava, as it erupted after a period of quiescence similar to what Yellowstone is experiencing today. The overarching goal of studies such as this that focus on past eruptions is to provide context and statistics that will ultimately improve volcano monitoring at different types of active volcanoes. The Scaup Lake flow contains zoned minerals (e.g., feldspar, zircon, clinopyroxene) that record multiple magma injection events shortly before they were erupted. Our previous work using nano-scale elemental concentration profiles from zoned clinopyroxene (cpx) as a diffusion dating tool reinforced our hypothesis that different minerals may not record the same series of pre-eruptive events, and that cpx crystal rims record older events in the Scaup Lake flow (on the order of 100s of years prior to eruption [Brugman et al., AGU OSPA talk, 2016]) than do feldspar rims (< 10 months and 10-40 years prior to eruption [Till et al., Geology, 2015]). In light of new temperature data, we have updated our diffusion dating results to better quantify pre-eruption timescales at Yellowstone.
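Diffusion dating of the kind described above rests on the scaling t ~ x²/D: the width of a diffusively relaxed compositional zone constrains the time elapsed at a given diffusivity. A minimal sketch follows; the rim width and diffusivity are illustrative placeholders, not the Scaup Lake calibrations:

```python
def diffusion_timescale_years(x_m, d_m2_s):
    """Characteristic diffusion time t ~ x^2 / D, converted to years.
    x_m is the width (m) of the zoned crystal rim; d_m2_s is an
    assumed cation diffusivity (m^2/s) at the magmatic temperature."""
    seconds_per_year = 365.25 * 24 * 3600
    return x_m ** 2 / d_m2_s / seconds_per_year

# Illustrative: a 5-micron rim with D = 1e-21 m^2/s gives
# a timescale on the order of centuries
t = diffusion_timescale_years(5e-6, 1e-21)
```

Because D depends exponentially on temperature, the "new temperature data" mentioned in the abstract can shift such timescales substantially, which is why the authors updated their results.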
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-08-01
SplitLab is a powerful and widely used tool for analysing seismological shear wave splitting from single-event measurements. However, in many cases, especially for temporary station deployments close to the noisy seaside, on the ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the user the most flexible processing of individual multi-event splitting measurements for a single recording station. Besides the provided functions of the plugin, no other external program is needed for the multi-event analyses, since StackSplit runs within the available SplitLab structure, which is based on MATLAB. The effectiveness and use of this plugin is demonstrated with data examples from a long-running seismological recording station in Finland.
ARTS. Accountability Reporting and Tracking System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, J.F.; Faccio, R.M.
ARTS is a micro-based prototype of the data elements, screens, and information processing rules that apply to the Accountability Reporting Program. The system focuses on the Accountability Event, an occurrence of incurring avoidable costs. The system must be able to CRUD (Create, Retrieve, Update, Delete) instances of the Accountability Event. Additionally, the system must provide for a review committee to update the event record with findings and determination information. Lastly, the system must provide for financial representatives to perform a cost reporting process.
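The Create/Retrieve/Update/Delete operations named above can be sketched as a minimal in-memory store. The field names (description, avoidable cost, committee findings) are illustrative stand-ins for the actual ARTS data elements:

```python
class AccountabilityEvents:
    """Minimal in-memory CRUD store for accountability-event records,
    sketching the operations the abstract describes."""

    def __init__(self):
        self._records = {}
        self._next_id = 1

    def create(self, description, avoidable_cost):
        event_id = self._next_id
        self._next_id += 1
        self._records[event_id] = {
            "description": description,
            "avoidable_cost": avoidable_cost,
            "findings": None,  # filled in later by the review committee
        }
        return event_id

    def retrieve(self, event_id):
        return self._records.get(event_id)

    def update(self, event_id, **fields):
        self._records[event_id].update(fields)

    def delete(self, event_id):
        del self._records[event_id]

store = AccountabilityEvents()
eid = store.create("duplicate shipment", avoidable_cost=1200.0)
store.update(eid, findings="review committee: confirmed avoidable")
record = store.retrieve(eid)
```

A production system would add persistence and the cost-reporting queries the abstract mentions, but the event lifecycle is the same.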
NASA Astrophysics Data System (ADS)
Wu, J.; Zhou, J.; Shen, B.; Zeng, H.
2017-12-01
Global climate change has the potential to accelerate the hydrological cycle, which may further enhance the temporal frequency of regional extreme floods. Climatic models predict that intra-annual rainfall variability will intensify, shifting current rainfall regimes towards more extreme systems with lower precipitation frequencies, longer dry periods, and larger individual precipitation events worldwide. Understanding the temporal variations of extreme floods that occur in response to climate change is essential to anticipate the trends in flood magnitude and frequency in the context of global warming. However, currently available instrumental records are not long enough to capture the most extreme events, so the acquisition of long-duration datasets of historical floods that extend beyond the available instrumental records is clearly an important step in discerning trends in flood frequency and magnitude with respect to climate change. In this study, a reconstruction of paleofloods over the past 300 years was conducted through an analysis of grain sizes from the sediments of Kanas Lake in the Altay Mountains of northwestern China. Grain parameters and frequency distributions both demonstrate that two abrupt environmental changes exist within the lake sedimentary sequence. Based on canonical discriminant analysis (CDA) and C-M pattern analysis, two flood events, corresponding to ca. 1760 AD and ca. 1890 AD, were identified, both of which occurred during warmer and wetter climate conditions according to tree-ring records. These two flood events are also evidenced by lake sedimentary records in the Altay and Tianshan areas. Furthermore, through a comparison with other records, the flood event in ca. 1760 AD seems to have occurred in both arid central Asia and the Alps in Europe, and thus may have been associated with changes in the North Atlantic Oscillation (NAO) index.
Oliva, Elizabeth M; Bowe, Thomas; Tavakoli, Sara; Martins, Susana; Lewis, Eleanor T; Paik, Meenah; Wiechers, Ilse; Henderson, Patricia; Harvey, Michael; Avoundjian, Tigran; Medhanie, Amanuel; Trafton, Jodie A
2017-02-01
Concerns about opioid-related adverse events, including overdose, prompted the Veterans Health Administration (VHA) to launch an Opioid Safety Initiative and Overdose Education and Naloxone Distribution program. To mitigate risks associated with opioid prescribing, a holistic approach that takes into consideration both risk factors (e.g., dose, substance use disorders) and risk mitigation interventions (e.g., urine drug screening, psychosocial treatment) is needed. This article describes the Stratification Tool for Opioid Risk Mitigation (STORM), a tool developed in VHA that reflects this holistic approach and facilitates patient identification and monitoring. STORM prioritizes patients for review and intervention according to their modeled risk for overdose/suicide-related events and displays risk factors and risk mitigation interventions obtained from VHA electronic medical record (EMR) data extracts. Patients' estimated risk is based on a predictive risk model developed using fiscal year 2010 (FY2010: 10/1/2009-9/30/2010) EMR data extracts and mortality data among 1,135,601 VHA patients prescribed opioid analgesics to predict risk for an overdose/suicide-related event in FY2011 (2.1% experienced an event). Cross-validation was used to validate the model, with receiver operating characteristic curves for the training and test data sets performing well (>.80 area under the curve). The predictive risk model distinguished patients based on risk for overdose/suicide-related adverse events, allowing for identification of high-risk patients and enrichment of target populations of patients with greater safety concerns for proactive monitoring and application of risk mitigation interventions. Results suggest that clinical informatics can leverage EMR-extracted data to identify patients at risk for overdose/suicide-related events and provide clinicians with actionable information to mitigate risk. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
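The validation metric reported above, area under the ROC curve, has a simple pairwise interpretation: the probability that a randomly chosen patient who had an event is scored higher than one who did not. A minimal sketch on toy risk scores (not STORM's actual model or data):

```python
def roc_auc(scores_pos, scores_neg):
    """ROC AUC computed as the fraction of (positive, negative)
    score pairs ranked correctly, counting ties as half a win."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy risk scores for patients with and without an observed event
auc = roc_auc([0.9, 0.8, 0.6], [0.1, 0.4, 0.6])
```

An AUC above 0.80, as reported for STORM's training and test sets, means the model orders most such pairs correctly.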
NASA Astrophysics Data System (ADS)
Nevalainen, Jouni; Kozlovskaya, Elena
2016-04-01
We present results of seismic travel-time tomography applied to microseismic data from the Pyhäsalmi mine, Finland. Data on microseismic events in the mine have been recorded since 2002, when the passive microseismic monitoring network was installed in the mine; since then, over 130,000 microseismic events have been observed. The first target of our study was to test whether passive microseismic monitoring data can be used for travel-time tomography. In this data set the source-receiver geometry is based on the uneven distribution of natural and mine-induced events inside and in the vicinity of the mine, and is hence non-ideal for travel-time tomography. The tomographic inversion procedure was tested with synthetic data and the real source-receiver geometry from the Pyhäsalmi mine, and with the real travel-time data of the first arrivals of P-waves from the microseismic events. The results showed that seismic tomography is capable of revealing differences in seismic velocities in the mine area corresponding to different rock types. For example, the velocity contrast between the ore body and the surrounding rock is detectable. The recovered velocity model agrees well with the known geological structures in the mine area. The second target of the study was to apply travel-time tomography to microseismic monitoring data recorded during different time periods in order to track temporal changes in seismic velocities within the mining area as the excavation proceeds. The results show that such time-lapse travel-time tomography can recover these changes. In order to obtain good ray coverage and good resolution, the time interval for a single tomography round needs to be selected taking into account the number of events and their spatial distribution.
The third target was to compare and analyze mine-induced event locations, seismic tomography results, and mining technological data (for example, mine excavation plans) in order to understand the influence of mining technology on mining-induced seismicity. Acknowledgements: This study has been supported by the ERDF SEISLAB project and Pyhäsalmi Mine Ltd.
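Travel-time tomography of this kind relates observed first-arrival times to the slowness (1/velocity) of the cells a ray crosses. The forward problem for a single straight ray can be sketched in a few lines; real tomography inverts many such equations for the cell slownesses, and the velocities below are illustrative, not the Pyhäsalmi values:

```python
def travel_time(slowness_per_cell, cell_width_m):
    """First-arrival travel time for a straight ray that crosses a
    row of equal-width cells once each: t = sum(length_i * slowness_i).
    A toy forward model for one ray of a tomographic system."""
    return sum(cell_width_m * s for s in slowness_per_cell)

# A ray crossing host rock, ore body, host rock (velocities in m/s
# chosen only to illustrate a detectable contrast)
slownesses = [1 / 5500.0, 1 / 6500.0, 1 / 5500.0]
t = travel_time(slownesses, cell_width_m=100.0)
```

The velocity contrast between ore and host rock shows up as a travel-time difference relative to a uniform model, which is what the inversion exploits.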
Pesicek, Jeremy; Cieślik, Konrad; Lambert, Marc-André; Carrillo, Pedro; Birkelo, Brad
2016-01-01
We have determined source mechanisms for nine high-quality microseismic events induced during hydraulic fracturing of the Montney Shale in Canada. Seismic data were recorded using a dense regularly spaced grid of sensors at the surface. The design and geometry of the survey are such that the recorded P-wave amplitudes essentially map the upper focal hemisphere, allowing the source mechanism to be interpreted directly from the data. Given the inherent difficulties of computing reliable moment tensors (MTs) from high-frequency microseismic data, the surface amplitude and polarity maps provide important additional confirmation of the source mechanisms. This is especially critical when interpreting non-shear source processes, which are notoriously susceptible to artifacts due to incomplete or inaccurate source modeling. We have found that most of the nine events contain significant non-double-couple (DC) components, as evident in the surface amplitude data and the resulting MT models. Furthermore, we found that source models that are constrained to be purely shear do not explain the data for most events. Thus, even though non-DC components of MTs can often be attributed to modeling artifacts, we argue that they are required by the data in some cases, and can be reliably computed and confidently interpreted under favorable conditions.
Lightning Step Leader and Return Stroke Spectra at 100,000 fps
NASA Astrophysics Data System (ADS)
Harley, J.; McHarg, M.; Stenbaek-Nielsen, H. C.; Haaland, R. K.; Sonnenfeld, R.; Edens, H. E.; Cummer, S.; Lapierre, J. L.; Maddocks, S.
2017-12-01
A fundamental understanding of lightning can be inferred from the spectral emissions resulting from the leader and return stroke channels. We examine events recorded at 00:58:07 on 19 July 2015 and 06:44:24 on 23 July 2017, both at Langmuir Laboratory. Analysis of both events is supplemented by data from the Lightning Mapping Array at Langmuir. The 00:58:07 event spectrum was recorded using a 100 line-per-mm grating in front of a Phantom V2010 camera with an 85 mm (9° FOV) Nikon lens recording at 100,000 frames per second. Coarse-resolution spectra (approximately 5 nm resolution) are produced from approximately 400 nm to 800 nm for each frame. We analyze several nitrogen and oxygen lines to understand step leader temperature behavior between cloud and ground. The 06:44:24 event spectrum was recorded using a 300 line-per-mm grating (approximately 1.5 nm resolution) in front of a Phantom V2010 camera with a 50 mm (32° FOV) Nikon lens, also recording at 100,000 frames per second. Two ionized atomic nitrogen lines at 502 nm and 569 nm appear upon attachment and disappear as the return stroke travels from ground to cloud in approximately 5 frames. We analyze these lines to understand initial return stroke temperature and species behavior.
Learning the Cardiac Cycle: Simultaneous Observations of Electrical and Mechanical Events.
ERIC Educational Resources Information Center
Kenney, Richard Alec; Frey, Mary Anne Bassett
1980-01-01
Described is a method for integrating electrical and mechanical events of the cardiac cycle by measuring systolic time intervals, which involves simultaneous recording of the ECG, a phonocardiogram, and the contour of the carotid pulse. Both resting and stress change data are provided as bases for class discussion. (CS)
NASA Astrophysics Data System (ADS)
Knapmeyer-Endrun, B.; Hammer, C.
2014-12-01
The seismometers that the Apollo astronauts deployed on the Moon provide the only recordings of seismic events from any extra-terrestrial body so far. These lunar events are significantly different from ones recorded on Earth, in terms of both signal shape and source processes. Thus they are a valuable test case for any experiment in planetary seismology. In this study, we analyze Apollo 16 data with a single-station event detection and classification algorithm in view of NASA's upcoming InSight mission to Mars. InSight, scheduled for launch in early 2016, has the goal of investigating Mars' internal structure by deploying a seismometer on its surface. As the mission does not feature any orbiter, continuous data will be relayed to Earth at a reduced rate. Full-range data will only be available by requesting specific time windows within a few days after the receipt of the original transmission. We apply a recently introduced algorithm based on hidden Markov models that requires only a single example waveform of each event class for training appropriate models. After constructing the prototypes, we detect and classify impacts and deep and shallow moonquakes. Initial results for 1972 (the year of station installation, with 8 months of data) indicate a high detection rate of over 95% for impacts, of which more than 80% are classified correctly. Deep moonquakes, which occur in large numbers but often show only very weak signals, are detected with less certainty (~70%). As only one weak shallow moonquake is covered, results for this event class are not statistically significant. Daily adjustments of the background noise model help to reduce false alarms, which are mainly erroneous deep moonquake detections, by about 25%.
The algorithm enables us to classify events that were previously listed in the catalog without classification, and, through the combined use of long period and short period data, identify some unlisted local impacts as well as at least two yet unreported deep moonquakes.
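The core idea of detecting events from a single example waveform can be illustrated with a much simpler stand-in than the hidden Markov models the study actually uses: sliding a template along the trace and flagging windows whose normalized correlation exceeds a threshold. The template, trace, and threshold below are synthetic:

```python
def normalized_correlation(template, window):
    """Zero-lag normalized cross-correlation of two equal-length
    signals; 1.0 for identical shapes, 0.0 for a flat window."""
    n = len(template)
    mt = sum(template) / n
    mw = sum(window) / n
    num = sum((a - mt) * (b - mw) for a, b in zip(template, window))
    dt = sum((a - mt) ** 2 for a in template) ** 0.5
    dw = sum((b - mw) ** 2 for b in window) ** 0.5
    if dt == 0.0 or dw == 0.0:
        return 0.0
    return num / (dt * dw)

def detect(template, trace, threshold=0.8):
    """Slide the single example waveform along the trace and flag
    sample offsets whose correlation exceeds the threshold."""
    hits = []
    for i in range(len(trace) - len(template) + 1):
        if normalized_correlation(template, trace[i:i + len(template)]) > threshold:
            hits.append(i)
    return hits

# A scaled copy of the template buried in an otherwise quiet trace
template = [0.0, 1.0, 2.0, 1.0, 0.0]
trace = [0.0] * 10 + [0.0, 0.5, 1.0, 0.5, 0.0] + [0.0] * 10
hits = detect(template, trace)
```

Hidden Markov models go beyond this by modeling the statistical evolution of the signal through time, which makes them robust to the amplitude and duration variability that plain template matching handles poorly.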
The Incidence, Nature and Consequences of Adverse Events in Iranian Hospitals.
Akbari Sari, Ali; Doshmangir, Leila; Torabi, Fereshteh; Rashidian, Arash; Sedaghat, Mojtaba; Ghomi, Robabeh; Prasopa-Plaizier, Nittita
2015-12-01
Adverse events are relatively common in healthcare, leading to extensive harm to patients and a significant drain on healthcare resources. Identifying the extent, nature and consequences of adverse events is an important step in preventing adverse events and their consequences, and is the subject of this study. This is a retrospective review of medical records randomly selected from patients admitted to 4 general hospitals, staying more than 24 hours and discharged between April and September 2012. We randomly selected 1200 records and completed the record review for 1162 of these records. Standard forms (RF1 and RF2) were used to review medical records in two stages, by nurses and medical doctors. Eighty-five (7.3%) of the 1162 records had an adverse event during the admission, and in 43 (3.7%) of the 1162 records the patient was admitted to the hospital due to an adverse event that occurred before the admission. Therefore, a total of 128 (11.0%) adverse events occurred in 126 (10.9%) records, as two patients had more than one adverse event. Forty-four (34.3%) of these 128 adverse events were considered preventable. This study confirms that adverse events, particularly adverse drug reactions, post-operative infections, bedsores and hospital-acquired infections, are common and potentially preventable sources of harm to patients in Iranian hospitals.
The Southern Oscillation recorded in the δ18O of corals from Tarawa Atoll
NASA Astrophysics Data System (ADS)
Cole, Julia E.; Fairbanks, Richard G.
1990-10-01
In the western equatorial Pacific, the El Niño/Southern Oscillation (ENSO) phenomenon is characterized by precipitation variability associated with the migration of the Indonesian low pressure cell to the region of the date line and the equator. During ENSO events, Tarawa Atoll (1°N, 172°E) experiences heavy rainfall which has an estimated δ18O of about -8 to -10‰ (SMOW). At Tarawa, sufficient precipitation of this composition falls during ENSO events to alter the δ18O and the salinity of the surface waters. Oxygen isotope records from two corals collected off the reef crest of Tarawa reflect rainfall variations associated with both weak and strong ENSO conditions, with approximately monthly resolution. Coral skeletal δ18O variations due to small sea surface temperature (SST) changes are secondary. These records demonstrate the remarkable ability of this technique to reconstruct variations in the position of the Indonesian Low from coral δ18O records in the western equatorial Pacific, a region which has few paleoclimatic records. The coral isotopic data correctly resolve the relative magnitudes of recent variations in the Southern Oscillation Index. Combining the Tarawa record with an oxygen isotopic history from a Galápagos Islands coral demonstrates the ability to distinguish the meteorologic (precipitation) and oceanographic (SST) anomalies that characterize ENSO events across the Pacific Basin over the period of common record (1960-1979). Comparison of the intensity of climatic anomalies at these two sites yields insight into the spatial variability of ENSO events. Isotope records from older corals can provide high-resolution, Pacific-wide reconstructions of ENSO behavior during periods of different climate boundary conditions.
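The mechanism described above, isotopically light ENSO rainfall diluting surface seawater, is at heart a two-endmember mass balance. A minimal sketch follows; the endmember values are illustrative (rain near -9‰, seawater near +0.3‰) rather than the paper's calibrations:

```python
def mixed_delta18o(f_rain, d18o_rain=-9.0, d18o_sea=0.3):
    """Two-endmember mixing: the delta-18O (permil) of surface water
    when a rainfall fraction f_rain mixes with seawater. Endmember
    values are illustrative placeholders."""
    return f_rain * d18o_rain + (1.0 - f_rain) * d18o_sea

# Even a 10% rainwater admixture shifts surface-water delta-18O by
# nearly -1 permil, a signal large enough to register in coral skeletons
shift = mixed_delta18o(0.10) - mixed_delta18o(0.0)
```

Because the rainfall endmember is so negative, modest ENSO-driven admixtures produce isotopic shifts that dominate the secondary SST effect mentioned in the abstract.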
NASA Astrophysics Data System (ADS)
Trubilowicz, J. W.; Moore, D.
2015-12-01
Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.
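The role of antecedent snowpack state described above can be reduced to a toy bucket model: incoming energy first satisfies the snowpack's cold content, and only the surplus produces melt to add to rainfall as water available for runoff (WAR). The numbers and the single-bucket structure are illustrative, not the study's point-scale snow model:

```python
def water_available_for_runoff(rain_mm, energy_mj_m2, cold_content_mj_m2):
    """WAR during a rain-on-snow event in a toy bucket model:
    energy first warms the snowpack (cold content); the surplus
    melts snow, and melt plus rain is the water available for runoff."""
    lf = 0.334  # latent heat of fusion, MJ per kg (~ per mm over 1 m^2)
    surplus = max(0.0, energy_mj_m2 - cold_content_mj_m2)
    melt_mm = surplus / lf
    return rain_mm + melt_mm

# A cold snowpack absorbs all the energy: WAR equals rainfall only
war_cold = water_available_for_runoff(rain_mm=20.0, energy_mj_m2=1.0,
                                      cold_content_mj_m2=2.0)
# A ripe snowpack (no cold content) adds melt to the same rainfall
war_ripe = water_available_for_runoff(rain_mm=20.0, energy_mj_m2=1.0,
                                      cold_content_mj_m2=0.0)
```

This contrast mirrors the study's finding for small events, where antecedent conditions either damp or enhance WAR generation; during large events, the rain and turbulent-flux terms dominate either way.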
Identifying Episodes of Earth Science Phenomena Using a Big-Data Technology
NASA Technical Reports Server (NTRS)
Kuo, Kwo-Sen; Oloso, Amidu; Rushing, John; Lin, Amy; Fekete, Gyorgy; Ramachandran, Rahul; Clune, Thomas; Dunny, Daniel
2014-01-01
A significant portion of Earth Science investigations is phenomenon- (or event-) based, such as the studies of Rossby waves, volcano eruptions, tsunamis, mesoscale convective systems, and tropical cyclones. However, except for a few high-impact phenomena, e.g. tropical cyclones, comprehensive records are absent for the occurrences or events of these phenomena. Phenomenon-based studies therefore often focus on a few prominent cases while the lesser ones are overlooked. Without an automated means to gather the events, comprehensive investigation of a phenomenon is at least time-consuming if not impossible. We have constructed a prototype Automated Event Service (AES) system that is used to methodically mine custom-defined events in the reanalysis data sets of atmospheric general circulation models. Our AES will enable researchers to specify their custom, numeric event criteria using a user-friendly web interface to search the reanalysis data sets. Moreover, we have included a social component to enable dynamic formation of collaboration groups for researchers to cooperate on event definitions of common interest and for the analysis of these events. An Earth Science event (ES event) is defined here as an episode of an Earth Science phenomenon (ES phenomenon). A cumulus cloud, a thunderstorm shower, a rogue wave, a tornado, an earthquake, a tsunami, a hurricane, or an El Nino, is each an episode of a named ES phenomenon, and, from the small and insignificant to the large and potent, all are examples of ES events. An ES event has a duration (often finite) and an associated geo-location as a function of time; it's therefore an entity embedded in four-dimensional (4D) spatiotemporal space. Earth Science phenomena with the potential to cause massive economic disruption or loss of life often rivet the attention of researchers. But, broader scientific curiosity also drives the study of phenomena that pose no immediate danger, such as land/sea breezes. 
Due to the Earth System's intricate dynamics, we are continuously discovering novel ES phenomena. We generally gain understanding of a given phenomenon by observing and studying individual events. This process usually begins by identifying the occurrences of these events. Once representative events are identified or found, we must locate associated observed or simulated data prior to commencing analysis and concerted studies of the phenomenon. Knowledge concerning the phenomenon can accumulate only after analysis has started. However, as mentioned previously, comprehensive records exist only for a very limited set of high-impact phenomena; aside from these, finding events and locating associated data currently may take a prohibitive amount of time and effort on the part of an individual investigator. The lack of comprehensive records for most ES phenomena is mainly due to the perception that they do not pose an immediate and/or severe threat to life and property. Thus they are not consistently tracked, monitored, and catalogued. Many phenomena even lack precise and/or commonly accepted criteria for their definition. Moreover, various Earth Science observations and data have accumulated to a previously unfathomable volume; the NASA Earth Observing System Data Information System (EOSDIS) alone archives several petabytes (PB) of satellite remote sensing data, and this volume is steadily increasing. All of these factors contribute to the difficulty of methodically identifying events corresponding to a given phenomenon and significantly impede systematic investigations. We have envisioned AES not only as an environment for identifying custom-defined events but also as an interactive environment with quick turnaround time for revisions of query criteria and results, as well as a collaborative environment where geographically distributed experts may work together on the same phenomena.
Realizing such a system thus requires Big Data technology. In the following, we first introduce the technology selected for AES, and we then demonstrate the utility of AES with a blizzard use case before concluding.
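The core of such an event query, applying a user-supplied numeric criterion across a gridded data set and returning the spatiotemporal indices that satisfy it, can be sketched as follows. This is a minimal NumPy illustration on toy data; the AES web interface, data model, and function names here are not from the source and are illustrative assumptions.

```python
import numpy as np

def find_events(field, threshold):
    """Return the (time, lat, lon) indices at which a gridded field
    exceeds a user-defined numeric criterion -- the core of an event query."""
    return np.argwhere(field > threshold)

# Toy search: 3 time steps on a 2x2 grid, with one planted "event".
field = np.zeros((3, 2, 2))
field[1, 0, 1] = 5.0
hits = find_events(field, threshold=1.0)
print(hits.tolist())  # [[1, 0, 1]]
```

In a real reanalysis setting the same pattern would run over multi-terabyte gridded archives, which is what motivates the Big Data back end discussed above.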
Towards Quantification of Glacier Dynamic Ice Loss through Passive Seismic Monitoring
NASA Astrophysics Data System (ADS)
Köhler, A.; Nuth, C.; Weidle, C.; Schweitzer, J.; Kohler, J.; Buscaino, G.
2015-12-01
Global glaciers and ice caps lose mass through calving, yet existing models are not equipped to realistically predict dynamic ice loss. This is mainly because the long-term continuous calving records that would help us better understand fine-scale processes and the key climatic-dynamic feedbacks between calving, climate, terminus evolution, and marine conditions do not exist. Combined passive seismic/acoustic strategies are the only technique able to capture rapid calving events continuously, independent of daylight or meteorological conditions. We have produced such a continuous calving record for Kronebreen, a tidewater glacier in Svalbard, using data from permanent seismic stations between 2001 and 2014. However, no method has yet been established in cryo-seismology to quantify calving ice loss directly from seismic data. Independent calibration data are required to derive 1) a realistic estimate of the dynamic ice loss unobserved due to seismic noise and 2) a robust scaling of seismic calving signals to ice volumes. Here, we analyze the seismic calving record at Kronebreen together with independent calving data in a first attempt to quantify ice loss directly from seismic records. We make use of a) calving flux data with weekly to monthly resolution obtained from satellite remote sensing and GPS data between 2007 and 2013, and b) direct, visual calving observations made during two weeks in 2009 and 2010. Furthermore, the magnitude-scaling property of seismic calving events is analyzed. We derive and discuss an empirical relation between seismic calving events and calving flux which, for the first time, allows a time series of calving volumes to be estimated more than a decade back in time. Improving our model requires incorporating more precise, high-resolution calibration data.
A new field campaign will combine innovative, multi-disciplinary monitoring techniques to measure calving ice volumes and dynamic ice-ocean interactions simultaneously with terrestrial laser scanning and a temporary seismic/underwater-acoustic network.
Improved phase arrival estimate and location for local earthquakes in South Korea
NASA Astrophysics Data System (ADS)
Morton, E. A.; Rowe, C. A.; Begnaud, M. L.
2012-12-01
The Korea Institute of Geoscience and Mineral Resources (KIGAM) and the Korea Meteorological Administration (KMA) regularly report local (distance < ~1200 km) seismicity recorded with their networks; we obtain preliminary event location estimates as well as waveform data, but no phase arrivals are reported, so the data are not immediately useful for earthquake location. Our goal is to identify seismic events that are sufficiently well located to provide accurate seismic travel-time information for events within the KIGAM and KMA networks that are also recorded by some regional stations. Toward that end, we are using a combination of manual phase identification and arrival-time picking with waveform cross-correlation to cluster events that have occurred in close proximity to one another, which allows for improved phase identification by comparing the highly correlating waveforms. We cross-correlate the known events with one another on 5 seismic stations and cluster events that correlate above a correlation coefficient threshold of 0.7, which reveals only a few clusters, each containing a few events. The small number of repeating events suggests that the online catalogs have had mining and quarry blasts removed before publication, as these can contribute significantly to repeating seismic sources in relatively aseismic regions such as South Korea. The dispersed source locations in our catalog, however, are ideal for seismic velocity modeling, providing superior sampling through the dense seismic station arrangement, which produces favorable event-to-station ray path coverage. Following careful manual phase picking on 104 events chosen to provide adequate ray coverage, we re-locate the events to obtain improved source coordinates. The re-located events are used with Thurber's Simul2000 pseudo-bending local tomography code to estimate the crustal structure of the Korean Peninsula, an important contribution to ongoing calibration for events of interest in the region.
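The correlate-and-cluster step described above can be illustrated with a minimal sketch. The 0.7 threshold follows the abstract; the synthetic waveforms and the normalization details are illustrative assumptions, not the authors' processing chain.

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Maximum normalized cross-correlation of two equal-length
    waveforms over all lags (1.0 = identical up to a time shift)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.correlate(a, b, mode="full").max())

t = np.linspace(0.0, 1.0, 200)
ev1 = np.sin(2 * np.pi * 5 * t)                  # "event" waveform
ev2 = np.sin(2 * np.pi * 5 * (t - 0.02))         # same source, small time shift
ev3 = np.random.default_rng(1).normal(size=200)  # unrelated noise

# Cluster events whose correlation exceeds the 0.7 threshold.
print(max_norm_xcorr(ev1, ev2) >= 0.7)  # True: same cluster
print(max_norm_xcorr(ev1, ev3) >= 0.7)  # False: separate
```

Highly correlating pairs are then grouped, and phase picks can be compared across the aligned waveforms of each cluster.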
In-situ trainable intrusion detection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Symons, Christopher T.; Beaver, Justin M.; Gillen, Rob
A computer-implemented method detects intrusions by using a computer to analyze network traffic. The method includes a semi-supervised learning module connected to a network node. The learning module uses labeled and unlabeled data to train a semi-supervised machine learning sensor. The method records events that include a feature set made up of unauthorized intrusions and benign computer requests. The method identifies at least some of the benign computer requests that occur during the recording of the events while treating the remainder of the data as unlabeled. The method trains the semi-supervised learning module at the network node in-situ, such that the semi-supervised learning module may identify malicious traffic without relying on specific rules, signatures, or anomaly detection.
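As a generic illustration of the semi-supervised setting described above, a few labeled events amid mostly unlabeled traffic, a self-training classifier can be sketched with scikit-learn. This is not the patented method; the toy features, class structure, and label counts are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
# Toy "network event" features: benign traffic clusters near 0,
# intrusions cluster near 3 (purely illustrative).
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Label only a handful of events; mark the rest unlabeled (-1),
# mirroring the in-situ setting described in the patent abstract.
y_train = np.full(100, -1)
y_train[:5] = 0          # a few known-benign requests
y_train[50:53] = 1       # a few known intrusions

clf = SelfTrainingClassifier(LogisticRegression()).fit(X, y_train)
print((clf.predict(X) == y).mean())  # high accuracy from very few labels
```

Self-training iteratively labels the most confidently classified unlabeled events, which is one common way to exploit abundant unlabeled traffic.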
NASA Astrophysics Data System (ADS)
Baranowski, Z.; Canali, L.; Toebbicke, R.; Hrivnac, J.; Barberis, D.
2017-10-01
This paper reports on activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each consisting of ∼100 bytes, all having the same probability of being searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of Hadoop-based applications; this work reports on production usage and on tests of several data formats, including MapFiles, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. We further report on the use of HBase for the EventIndex, focusing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface and its optimizations for data warehouse workloads and reports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, T.A.; Baker, D.F.; Edwards, C.L.
1993-10-01
Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25-, and 100-g accelerometers, force-balanced accelerometers, and, for some events, golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset-corrected, and deglitched, but are otherwise unfiltered and unprocessed from their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.
NASA Astrophysics Data System (ADS)
Michaud, J. R.; Cullen, J. L.; McManus, J. F.; Oppo, D. W.
2004-05-01
Successful efforts to recover quality high-sedimentation-rate deep-sea sediment sections from the North Atlantic over the last decade have produced a number of studies demonstrating that climate instability at sub-orbital and even millennial time scales is a pervasive component of Late Pleistocene North Atlantic climate. This is particularly true during Marine Isotope Stages (MIS) 4-2, i.e., the last glacial interval. One such high-sedimentation-rate section was recovered at ODP Site 980, Northeast Atlantic Ocean, where sedimentation rates during MIS 4-2 exceed 15 cm/kyr. Recently, we have begun to generate more detailed records from MIS 4-2 at Site 980 by reducing our sampling interval from 20 cm to around 2.5 cm, improving the resolution of our records by an order of magnitude, from 1200-1300 to 100-200 years. Three hundred samples were used to generate high-resolution records of changes in the input of ice-rafted detritus (IRD), along with limited data documenting changes in the relative abundance of N. pachyderma (left coiling), which can be evaluated within the context of our previously generated lower-resolution planktic and benthic oxygen isotope records used to construct our age model for this interval. Our previously published low-resolution IRD record enabled us to identify Heinrich events 1-6 within the sediment interval deposited during the last glacial. Each event is characterized by IRD concentrations ranging from 500 to over 2500 lithic grains >150 microns per gram sediment. Superimposing our new high-resolution IRD record reveals that Heinrich events 3, 2, and 1, occurring at approximately 32, 23, and 17 kya, respectively, are each composed of a series of separate, abrupt, rapid increases in IRD concentrations approaching 1,000 grains per gram. An additional comparable event occurring at approximately 20 kya has also been identified.
In the early part of the last glacial, H6, H5, and H4, occurring at approximately 66, 47, and 38 kya, respectively, are recorded as much more abrupt and rapid increases in IRD concentrations, to 2,000 or more lithic grains per gram, than were observed in our previous record. Two 5-kyr intervals between H6 and H5 contain little or no IRD. An additional abrupt IRD event is recorded at approximately 34 kya. Thus, our new record captures a series of additional episodic increases in IRD concentration comparable in intensity to the identified Heinrich events, suggesting that ODP Site 980 sediments record more closely spaced episodic IRD increases that can be directly related to the Dansgaard/Oeschger events recorded in Greenland ice cores. Comparison of our preliminary high-resolution record of changes in the relative abundance of the polar species N. pachyderma (left coiling) to our IRD record suggests that the input of iceberg-bearing waters precedes the increases in the relative abundance of N. pachyderma (left coiling) for the early glacial IRD events, whereas the abrupt increases in N. pachyderma (left coiling) seem to occur during the later glacial IRD events. Thus, in the early glacial the influx of icebergs seems to occur before the invasion of cooler surface waters, rather than at the same time, as later in the glacial.
The Sensitivity of Adverse Event Cost Estimates to Diagnostic Coding Error
Wardle, Gavin; Wodchis, Walter P; Laporte, Audrey; Anderson, Geoffrey M; Baker, Ross G
2012-01-01
Objective: To examine the impact of diagnostic coding error on estimates of hospital costs attributable to adverse events. Data Sources: Original and reabstracted medical records of 9,670 complex medical and surgical admissions at 11 hospital corporations in Ontario from 2002 to 2004. Patient-specific costs, not including physician payments, were retrieved from the Ontario Case Costing Initiative database. Study Design: Adverse events were identified among the original and reabstracted records using ICD10-CA (Canadian adaptation of ICD10) codes flagged as postadmission complications. Propensity score matching and multivariate regression analysis were used to estimate the cost of the adverse events and to determine the sensitivity of cost estimates to diagnostic coding error. Principal Findings: Estimates of the cost of the adverse events ranged from $16,008 (metabolic derangement) to $30,176 (upper gastrointestinal bleeding). Coding errors caused the total cost attributable to the adverse events to be underestimated by 16 percent. The impact of coding error on adverse event cost estimates was highly variable at the organizational level. Conclusions: Estimates of adverse event costs are highly sensitive to coding error. Adverse event costs may be significantly underestimated if the likelihood of error is ignored. PMID:22091908
A, Golkari; A, Sabokseir; D, Blane; A, Sheiham; RG, Watt
2017-01-01
Statement of Problem: Early childhood is a crucial period of life, as it affects one's future health. However, precise data on adverse events during this period are usually hard to access or collect, especially in developing countries. Objectives: This paper first reviews the existing methods for retrospective data collection in the health and social sciences, and then introduces a new method/tool for retrospectively obtaining more accurate general and oral health related information from early childhood. Materials and Methods: The Early Childhood Events Life-Grid (ECEL) was developed to collect information on the type and timing of health-related adverse events during the early years of life by questioning the parents. The validity of ECEL and the accuracy of the information obtained by this method were assessed in a pilot study and in a main study of 30 parents of 8- to 11-year-old children from Shiraz, Iran. Responses obtained from parents using the final ECEL were compared with recorded health insurance documents. Results: There was almost perfect agreement between the health insurance and ECEL data sets (kappa = 0.95, p < 0.001). Interviewees remembered the important events more accurately (a 100% exact timing match in the case of hospitalization). Conclusions: The Early Childhood Events Life-Grid method proved to be highly accurate when compared with recorded medical documents. PMID:28959773
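Cohen's kappa, the agreement statistic reported above, corrects raw percent agreement between two paired categorical records for the agreement expected by chance. A minimal sketch on toy event records (not the study's data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    labels = sorted(set(a) | set(b))
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance
    return (po - pe) / (1 - pe)

# Toy example: two records of childhood events that disagree once.
rec_a = ["hosp", "none", "none", "injury", "none", "hosp", "none", "none"]
rec_b = ["hosp", "none", "none", "injury", "none", "hosp", "none", "injury"]
print(round(cohens_kappa(rec_a, rec_b), 2))  # 0.79
```

A kappa of 0.95, as reported for ECEL versus insurance records, is conventionally interpreted as almost perfect agreement.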
Big Data solution for CTBT monitoring: CEA-IDC joint global cross correlation project
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Bell, Randy; Brachet, Nicolas; Gaillard, Pierre; Kitov, Ivan; Rozhkov, Mikhail
2014-05-01
Waveform cross-correlation, when applied to historical datasets of seismic records, provides dramatic improvements in the detection, location, and magnitude estimation of natural and manmade seismic events. With correlation techniques, the amplitude threshold of signal detection can be reduced globally by a factor of 2 to 3 relative to the currently standard beamforming and STA/LTA detectors. This gain in sensitivity corresponds to a body-wave magnitude reduction of 0.3 to 0.4 units and doubles the number of events meeting high-quality requirements (e.g., detection by three or more seismic stations of the International Monitoring System (IMS)). This gain is crucial for seismic monitoring under the Comprehensive Nuclear-Test-Ban Treaty. The International Data Centre (IDC) dataset includes more than 450,000 seismic events, tens of millions of raw detections, and continuous seismic data from the primary IMS stations since 2000. This high-quality dataset is a natural candidate for an extensive cross-correlation study and the basis of further enhancements in monitoring capabilities; without this historical dataset recorded by the permanent IMS Seismic Network, such improvements would not be feasible. However, due to the mismatch between the volume of data and the performance of standard Information Technology infrastructure, it is impossible to process all the data within a tolerable elapsed time. To tackle this problem, known as "Big Data", the CEA/DASE is part of the French project "DataScale". One objective is to reanalyze 10 years of waveform data from the IMS network with the cross-correlation technique on a dedicated High-Performance Computing (HPC) infrastructure operated by the Centre de Calcul Recherche et Technologie (CCRT) at the CEA of Bruyères-le-Châtel.
Within 2 years we are planning to enhance detection and phase association algorithms (also using machine learning and automatic classification) and process about 30 terabytes of data provided by the IDC to update the world seismicity map. From the new events and those in the IDC Reviewed Event Bulletin, we will automatically create various sets of master event templates that will be used for the event location globally by the CTBTO and CEA.
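The quoted sensitivity gain follows from the logarithmic amplitude term in body-wave magnitude: lowering the detection amplitude threshold by a factor f lowers the magnitude threshold by log10(f). A quick check of the arithmetic (illustrative only):

```python
import math

# mb contains a log10(A/T) amplitude term, so an amplitude-threshold
# reduction by a factor f maps to a magnitude-threshold reduction of
# log10(f) units.
for f in (2, 3):
    print(f, round(math.log10(f), 2))  # 2 -> 0.3, 3 -> 0.48
```

A factor of 2 gives 0.30 units and a factor of 3 gives 0.48 units, in line with the 0.3-0.4 magnitude-unit figure quoted above.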
NASA Astrophysics Data System (ADS)
Gregory, L. C.; Meert, J. G.; Levashova, N.; Grice, W. C.; Gibsher, A.; Rybanin, A.
2007-12-01
The Neoproterozoic to early Paleozoic Ural-Mongol belt that runs through Central Asia is crucial for deciphering the enigmatic amalgamation of microcontinents that make up the Eurasian subcontinent. Two distinct models have been proposed for the evolution of the Ural-Mongol belt. One involves a complex assemblage of cratonic blocks that collided and rifted apart during the diachronous opening and closing of Neoproterozoic to Devonian aged ocean basins. The opposing model of Sengor and Natal'in proposes a long-standing volcanic arc system that connected the Central Asian blocks with the Baltica continent. The Aktau-Mointy and Dzabkhan microcontinents in Kazakhstan and Central Mongolia make up the central section of the Ural-Mongol belt, and both contain glacial sequences characteristic of the hypothesized snowball earth events. These worldwide glaciations are currently under considerable debate, and paleomagnetic data from these microcontinents are a useful contribution to the snowball controversy. We have sampled volcanic and sedimentary sequences in Central Mongolia, Kazakhstan, and Kyrgyzstan for paleomagnetic and geochronologic study. U-Pb data, δ13C curves, and abundant fossil records place age constraints on sequences that contain glacial deposits of the hypothesized snowball earth events. Carbonates in the Zavkhan Basin in Mongolia are likely remagnetized, but fossil evidence within the sequence suggests revised age control on two glacial events that were previously labeled as Sturtian and Marinoan. U-Pb ages from both the Kazakhstan and Mongolian volcanic sequences imply a similar evolutionary history for the two areas as part of the Ural-Mongol fold belt, and these ages, paired with the paleomagnetic and δ13C records, have important tectonic implications. We will present these data in order to place better constraints on the Precambrian to early Paleozoic tectonic evolution of Central Asia and on the timing of the glacial events recorded in the area.
History of the Adult Education Program of the City of Detroit.
ERIC Educational Resources Information Center
Skavery, Stanley
The development of adult education in the Detroit area during the years 1875-1932 was intimately tied to the social, political, and economic events of that time span. Data gleaned from census records, Board of Education minutes, old maps, street guides, labor legislation, educational legislation, church records, advertisements, alien…
Kaiser, Lee D; Melemed, Allen S; Preston, Alaknanda J; Chaudri Ross, Hilary A; Niedzwiecki, Donna; Fyfe, Gwendolyn A; Gough, Jacqueline M; Bushnell, William D; Stephens, Cynthia L; Mace, M Kelsey; Abrams, Jeffrey S; Schilsky, Richard L
2010-12-01
Although much is known about the safety of an anticancer agent at the time of initial marketing approval, sponsors customarily collect comprehensive safety data for studies that support supplemental indications. This adds significant cost and complexity to the study but may not provide useful new information. The main purpose of this analysis was to assess the amount of safety and concomitant medication data collected to determine a more optimal approach in the collection of these data when used in support of supplemental applications. Following a prospectively developed statistical analysis plan, we reanalyzed safety data from eight previously completed prospective randomized trials. A total of 107,884 adverse events and 136,608 concomitant medication records were reviewed for the analysis. Of these, four grade 1 to 2 and nine grade 3 and higher events were identified as drug effects that were not included in the previously established safety profiles and could potentially have been missed using subsampling. These events were frequently detected in subsamples of 400 patients or larger. Furthermore, none of the concomitant medication records contributed to labeling changes for the supplemental indications. Our study found that applying the optimized methodologic approach, described herein, has a high probability of detecting new drug safety signals. Focusing data collection on signals that cause physicians to modify or discontinue treatment ensures that safety issues of the highest concern for patients and regulators are captured and has significant potential to relieve strain on the clinical trials system.
NASA Astrophysics Data System (ADS)
Steiner, Zvi; Lazar, Boaz; Levi, Shani; Tsroya, Shimon; Pelled, Omer; Bookman, Revital; Erez, Jonathan
2016-12-01
Studies of recent environmental perturbations often rely on data derived from marine sedimentary records. These records are known to imperfectly inscribe the true sequence of events, yet there is large uncertainty regarding the corrections that should be employed to accurately describe the sedimentary history. Here we show, in recent records from the Gulf of Aqaba, Red Sea, how the abrupt disappearance of the planktonic foraminifer Globigerinoides sacculifer and the episodic deposition of the artificial radionuclide 137Cs are significantly altered in the sedimentary record relative to their known past timing. Instead of the abrupt disappearance of the foraminifera, we observe a prolonged decline beginning at a core depth equivalent to ∼30 y prior to the actual disappearance and continuing for decades past the event. We further observe asymmetric smoothing of the radionuclide peak. Using advection-diffusion-reaction models to reconstruct the original fluxes based on the known absolute timing of the events reveals that it is imperative to use a continuous function to describe bioturbation: discretization of bioturbation into mixed and unmixed layers significantly shifts the location of the modeled event. When bioturbation is described as a continuously decreasing function of depth, the peak of a very short-term event smears asymmetrically but remains at the correct depth. When sudden events repeat while the first spike is still being mixed within the upper sediment layer, bioturbation unifies adjacent peaks, and the unified peak appears at an intermediate depth that does not necessarily correlate with the timing of the individual events. In a third case, a long-lasting sedimentary event affected by bioturbation, the resulting peak is weak compared to the actual event and appears deeper in the sediment column than expected from the termination of the event.
The model clearly shows that abrupt changes can only endure in the record if a thick sediment layer settled on the sediment-water interface at once or if bioturbation rates decreased to very low values for a prolonged period of time. In any other case smearing by bioturbation makes an abrupt event appear to have started shortly before the real timing and end long after its true termination.
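The qualitative behavior described above, a continuously depth-decaying mixing function smearing an abrupt spike while conserving mass, can be sketched with a simple explicit diffusion scheme. This is illustrative only: the coefficients, decay scale, and grid are assumptions, and the advection and reaction terms of the full model are omitted.

```python
import numpy as np

def bioturbate(profile, d_surface, steps):
    """Explicitly diffuse a tracer profile using a mixing coefficient
    that decays continuously with depth -- i.e. no discrete
    mixed/unmixed layer boundary."""
    z = np.arange(len(profile) - 1)
    d = d_surface * np.exp(-z / 10.0)   # interface mixing, decaying with depth
    p = profile.astype(float).copy()
    for _ in range(steps):
        flux = d * np.diff(p)           # Fickian exchange between adjacent cells
        p[:-1] += flux
        p[1:] -= flux
    return p

spike = np.zeros(50)
spike[5] = 1.0                          # abrupt, short-lived event
mixed = bioturbate(spike, d_surface=0.2, steps=100)
# Total tracer is conserved while the spike is smeared over many depths.
print(round(float(mixed.sum()), 6), mixed.max() < 1.0)
```

Because the mixing coefficient decays smoothly with depth rather than cutting off at a mixed-layer base, the smearing is asymmetric, the behavior the abstract contrasts with discretized mixed/unmixed-layer models.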
NASA Astrophysics Data System (ADS)
Luciani, Valeria; Giusberti, Luca; Agnini, Claudia; Fornaciari, Eliana; Rio, Domenico
2010-05-01
The early Paleogene is one of the most climatically and evolutionarily dynamic periods in Earth history: it records a pronounced warming trend peaking in the Early Eocene and a subsequent composite transition towards the modern icehouse world. Increasing scientific attention is being devoted to comprehending the timing, nature, and character of the complex, non-linear evolution of Paleogene climate. Several complete and expanded Paleogene successions (Forada, Possagno, Alano, Farra), with sound magneto-biochronostratigraphic and stable isotope records, crop out in the Venetian Southern Alps (Northeast Italy). Recent studies (Giusberti et al., 2007; Luciani et al., 2007; Agnini et al., 2008) and unpublished data document the presence in these sections of the main short-lived Eocene warming events (hyperthermals), namely the Paleocene-Eocene Thermal Maximum (PETM, ca. 55 Ma), the Eocene Layer of Mysterious Origin (ELMO, ca. 53.6 Ma), and the X-event (ca. 52.5 Ma), as well as of the Early Eocene Climatic Optimum (EECO, ca. 50-52 Ma) and the Middle Eocene Climatic Optimum (MECO, ca. 40 Ma; Zachos et al., 2001, 2008). All of these events are typified by marked negative shifts in δ13C curves that correspond to a decrease in carbonate related to a rise of the carbonate compensation depth, in turn induced by a large introduction of CO2 into the ocean-atmosphere system. Common to the warming events are pronounced and complex changes in planktonic foraminiferal assemblages, indicating strong environmental perturbations that closely parallel the variations of the stable isotope curves in all the examined events. These strict correspondences indicate close cause-effect relationships between changes in environmental conditions and modifications of the assemblages. Our analysis shows that the most striking variations are recorded by the PETM and MECO assemblages, which reflect highly perturbed environments.
The ELMO, X-event, and EECO exhibit planktic foraminiferal responses that are similar to, though less intense than, those observed across the PETM and the MECO. In addition, sedimentological and quantitative micropaleontological data from the hyperthermal events of the Venetian Southern Alps suggest that the main responses to the pronounced warmth were increased weathering and runoff as well as sea-surface eutrophication. A pronounced shift from relatively oligotrophic to eutrophic, opportunistic planktonic foraminiferal assemblages was observed at the MECO as well, showing analogies with the hyperthermal events recorded in the same area. The taxa indicating eutrophic conditions at the MECO in the Alano section are, however, different; indeed, the planktonic foraminiferal taxa indicating analogous scenarios can be expected to differ between Eocene time intervals. Remarkably, the PETM and MECO events record a significant occurrence of giant and malformed foraminifera, evidence of transient alteration in ocean chemistry, including possible pH oscillations and an increase in trace metal content. Our data therefore suggest that a major threshold in photic-zone ocean chemistry was passed only during those prominent events. In conclusion, from the biotic response to the hyperthermal events, the EECO, and the MECO, we deduce that the most important effect of pronounced warming, the aspect common to all these events, has been the eutrophication of surface waters as a consequence of modification of the hydrological cycle. The location of the studied Tethyan setting adjacent to land masses evidently facilitated the terrigenous input that was apparently the main driver of the increase in nutrient availability during the cited Paleogene warming events.
Finally, several lines of evidence indicate that the PETM, EECO, and MECO were linked to permanent changes in planktonic foraminiferal evolution beside the transient, ecologically controlled variations. Even though the true mechanisms forcing the evolution of life on Earth remain unexplained, our record of the major Paleogene climatic events suggests a close interaction between global climate and biological evolution. REFERENCES: Agnini et al., 2008. Rend. Soc. Geol. It. 4, 5-12. Giusberti et al., 2007. Geol. Soc. Am. Bull. 119, 391-412. Luciani et al., 2007. Mar. Micropaleont. 64, 189-214. Zachos et al., 2001. Science 292, 686-693. Zachos et al., 2008. Nature 451, 279-283.
A System and Method for Online High-Resolution Mapping of Gastric Slow-Wave Activity
Bull, Simon H.; O’Grady, Gregory; Du, Peng
2015-01-01
High-resolution (HR) mapping employs multielectrode arrays to achieve spatially detailed analyses of propagating bioelectrical events. A major current limitation is that spatial analyses must be performed "off-line" (after experiments), compromising timely recording feedback and restricting experimental interventions. These problems motivated the development of a system and method for "online" HR mapping. HR gastric recordings were acquired and streamed to a novel software client. Algorithms were devised to filter data, identify slow-wave events, eliminate corrupt channels, and cluster activation events. A graphical user interface animated data and plotted electrograms and maps. Results were compared against off-line methods. The online system analyzed 256-channel serosal recordings with no unexpected system terminations and a mean delay of 18 s. Activation-time marking sensitivity was 0.92; positive predictive value was 0.93. Abnormal slow-wave patterns including conduction blocks, ectopic pacemaking, and colliding wave fronts were reliably identified. Compared to traditional analysis methods, online mapping produced comparable results, with equivalent coverage of 90% of electrodes, average RMS errors of less than 1 s, and a correlation coefficient between activation maps of 0.99. Accurate slow-wave mapping was achieved in near real time, enabling monitoring of recording quality and experimental interventions targeted to dysrhythmic onset. This work also advances the translation of HR mapping toward real-time clinical application. PMID:24860024
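The sensitivity and positive predictive value quoted above are standard detection scores: with reference activation marks matched to automatic detections within a tolerance, sensitivity = TP/(TP+FN) and PPV = TP/(TP+FP). A minimal sketch, where the matching tolerance and the toy timings are assumptions:

```python
def sensitivity_ppv(true_marks, detected, tol):
    """Score detected activation times against reference marks: a
    detection within `tol` seconds of an unmatched reference mark is
    a true positive (each mark is matched at most once)."""
    unmatched = list(true_marks)
    tp = 0
    for d in detected:
        hit = next((m for m in unmatched if abs(m - d) <= tol), None)
        if hit is not None:
            unmatched.remove(hit)
            tp += 1
    fn = len(unmatched)            # reference marks never detected
    fp = len(detected) - tp        # spurious detections
    return tp / (tp + fn), tp / (tp + fp)

# Toy check: 4 reference marks, one missed, one spurious detection.
sens, ppv = sensitivity_ppv([1.0, 2.0, 3.0, 4.0], [1.02, 2.01, 3.9, 7.0], tol=0.2)
print(sens, ppv)  # 0.75 0.75
```

The reported 0.92/0.93 figures mean the online marker both found most reference events and produced few spurious marks.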
NASA Astrophysics Data System (ADS)
Kremer, Katrina; Reusch, Anna; Wirth, Stefanie B.; Anselmetti, Flavio S.; Girardclos, Stéphanie; Strasser, Michael
2016-04-01
Intraplate settings are characterized by low deformation rates and by recurrence intervals of strong earthquakes that often exceed the time span covered by instrumental records. Switzerland, as an example of such a setting, shows low instrumentally recorded seismicity, in contrast to the strong earthquakes (e.g., the 1356 Basel earthquake, Mw=6.6, and the 1601 Unterwalden earthquake, Mw=5.9) mentioned in historical archives. As such long recurrence intervals do not allow for instrumental identification of the sources of these strong earthquakes, and as intense geomorphological alteration prevents preservation of the surface expressions of faults, knowledge of active faults is very limited. Lake sediments are sensitive to seismic shaking and can thus be used to extend the regional earthquake catalogue if sedimentary deposits or deformation structures can be linked to an earthquake. Single lake records allow estimation of local shaking intensities, while multiple lake records can furthermore be used to compare the temporal and spatial distribution of earthquakes. In this study, we compile a large dataset of dated sedimentary event deposits recorded in Swiss lakes, drawn from peer-reviewed publications and unpublished master's theses. We combine these data in order to detect large prehistoric regional earthquakes or periods of intense shaking that might have affected multiple lake settings. In a second step, using empirical seismic attenuation equations, we test whether lake records can be used to reconstruct the magnitudes and epicentres of the identified earthquakes.
Real-Time River Channel-Bed Monitoring at the Chariton and Mississippi Rivers in Missouri, 2007-09
Rydlund, Jr., Paul H.
2009-01-01
Scour and depositional responses to hydrologic events have been important to the scientific community studying sediment transport, as well as their potential effects on bridges and other hydraulic structures within riverine systems. A river channel-bed monitor composed of a single-beam transducer was installed on a bridge crossing the Chariton River near Prairie Hill, Missouri (structure L-344) as a pilot study to evaluate channel-bed change in response to the hydrologic condition disseminated from an existing streamgage. Initial results at this location led to additional installations, in cooperation with the Missouri Department of Transportation, at an upstream Chariton River streamgage location at Novinger, Missouri (structure L-534) and a Mississippi River streamgage location near Mehlville, Missouri (structures A-1850 and A-4936). In addition to stage, channel-bed elevation was collected at all locations every 15 minutes and transmitted hourly to a U.S. Geological Survey database. Bed elevation data for the Chariton River location at Novinger and the Mississippi River location near Mehlville were made available on the World Wide Web for real-time monitoring. Channel-bed data from the three locations indicated responses to hydrologic events depicted in the stage record; however, notable bedforms apparent during inter-event flows also may have affected the relation of scour and deposition to known hydrologic events. Throughout the data collection periods, the Chariton River locations near Prairie Hill and Novinger reflected bed changes of as much as 13 feet and 5 feet, respectively. Nearly all of the bed changes correlated well with the hydrographic record at these locations. The location at the Mississippi River near Mehlville indicated a much more stable channel bed throughout the data collection period.
Despite missing data resulting from damage to one of the river channel-bed monitors caused by ice accumulation at the upstream nose of the bridge pier early in the record, the record from the downstream river channel-bed monitor demonstrated a good correlation (notwithstanding a 7 percent high bias) between bedform movement and the presence of bedforms surrounding the bridge, as indicated by coincident bathymetric surveys using multibeam sonar.
Changes in record-breaking temperature events in China and projections for the future
NASA Astrophysics Data System (ADS)
Deng, Hanqing; Liu, Chun; Lu, Yanyu; He, Dongyan; Tian, Hong
2017-06-01
As global warming intensifies, more record-breaking (RB) temperature events are reported in many places around the world where temperatures are higher than ever before. The RB temperatures have caused severe impacts on ecosystems and human society. Here, we address changes in RB temperature events occurring over China in the past (1961-2014) as well as future projections (2006-2100) using observational data and the newly available simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5). The number of RB events has a significant multi-decadal variability in China, and the intensity shows a strong decrease from 1961 to 2014. However, more frequent RB events occurred in mid-eastern and northeastern China over the last 30 years (1981-2010). Comparisons with observational data indicate that multi-model ensemble (MME) simulations from the CMIP5 models perform well in simulating RB events for the historical run period (1961-2005). The CMIP5 MME shows a relatively larger uncertainty for the change in intensity. From 2051 to 2100, fewer RB events are projected to occur in most parts of China under the RCP 2.6 scenario. Over the longer period from 2006 to 2100, a remarkable increase is expected for the entire country under the RCP 8.5 scenario, and the maximum number of RB events increases by approximately 600 per year at the end of the twenty-first century.
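The bookkeeping behind counting record-breaking events, where a value qualifies as a record only if it exceeds every earlier value in the series, can be sketched as follows. The temperature series here is hypothetical illustrative data, not CMIP5 or station output:

```python
def record_breaking_events(series):
    """Return the (year, value) pairs that exceed all prior values."""
    records = []
    running_max = float("-inf")
    for year, value in series:
        if value > running_max:      # strictly higher than everything before
            records.append((year, value))
            running_max = value
    return records

# Hypothetical annual maximum temperatures (degrees C).
data = [(1961, 36.2), (1962, 35.1), (1963, 37.0), (1964, 36.8), (1965, 37.5)]
print(record_breaking_events(data))
```

Note that the first value in any series is trivially a record, which is why observational RB counts are usually normalized against the expectation for a stationary climate.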
NASA Astrophysics Data System (ADS)
Popp, T. J.; Svensson, A.; Steffensen, J. P.; Johnsen, S. J.; White, J. W. C.
2009-04-01
Isotopic and chemical impurity records from Greenland ice cores with sub-annual resolution across three fast climate transitions of the last deglacial termination reveal complex patterns of environmental change for the onset of Greenland Interstadial 1 (GI-1 or Bølling), the onset of Greenland Stadial 1 (GS-1 or Younger Dryas), and the onset of the Holocene. In the NGRIP ice core each of these transitions is initiated by a 1-3 year mode shift in deuterium excess, which is a proxy for the moisture source of Greenland precipitation. These mode shifts in deuterium excess are decoupled in time from the isotopic (deuterium and oxygen-18) transitions from which they are derived. In general the abrupt isotopic transitions follow the corresponding deuterium excess shifts and span decades rather than years. Similar data from GISP2 confirm the clear deuterium excess mode shifts for transitions from cold states to warm states; however, the abrupt deuterium excess transition at the onset of GS-1 is not expressed in a similar way at GISP2. Ironically, it appears that this cooling at the beginning of the Younger Dryas, for which we have theories of the triggering event, is less clearly recorded than the warming events, whose triggering is still poorly understood. Along with other available paleo-data, these results indicate that the sum of an abrupt climate change is composed of multiple responses from different parts of the climate system. These responses can be separated by as little as a single year to a few decades, and the collection of these responses results in a variety of abrupt transitions, giving each a unique anatomy. Here we expand this type of analysis with new isotope, deuterium excess, and accumulation rate time series from NGRIP across the abrupt transitions associated with several interstadial events of the Last Glacial period (Dansgaard-Oeschger events).
Indeed the temporal phasing of deuterium excess and the isotopic content of the ice can vary from one event to the next and emerging patterns may depend on the conditions associated with specific events such as Heinrich Events and ice volume boundary conditions. Together with modeling and chemical impurity data, these patterns will provide clues to the timing and origin of ocean and atmospheric changes that comprise an abrupt climate change. The emerging picture indicates that abrupt climate changes have both a temporal and geographic anatomy that can change from one event to the next in how they are recorded across Greenland.
NASA Astrophysics Data System (ADS)
Parker, Andrew O.; Schmidt, Matthew W.; Chang, Ping
2015-11-01
The role of Atlantic Meridional Overturning Circulation (AMOC) as the driver of Dansgaard-Oeschger (DO) variability that characterized Marine Isotope Stage 3 (MIS 3) has long been hypothesized. Although there is ample proxy evidence suggesting that DO events were robust features of glacial climate, there is little data supporting a link with AMOC. Recently, modeling studies and subsurface temperature reconstructions have suggested that subsurface warming across the tropical North Atlantic can be used to fingerprint a weakened AMOC during the deglacial because a reduction in the strength of the western boundary current allows warm salinity maximum water of the subtropical gyre to enter the deep tropics. To determine if AMOC variability played a role during the DO cycles of MIS 3, we present new, high-resolution Mg/Ca and δ18O records spanning 24-52 kyr from the near-surface dwelling planktonic foraminifera Globigerinoides ruber and the lower thermocline dwelling planktonic foraminifera Globorotalia truncatulinoides in Southern Caribbean core VM12-107 (11.33°N, 66.63°W, 1079 m depth). Our subsurface Mg/Ca record reveals abrupt increases in Mg/Ca ratios (the largest equal to a 4°C warming) during the interstadial-stadial transition of most DO events during this period. This change is consistent with reconstructions of subsurface warming events associated with cold events across the deglacial using the same core. Additionally, our data support the conclusion reached by a recently published study from the Florida Straits that AMOC did not undergo significant reductions during Heinrich events 2 and 3. This record presents some of the first high-resolution marine sediment derived evidence for variable AMOC during MIS 3.
Characterising Record Flooding in the United Kingdom
NASA Astrophysics Data System (ADS)
Cox, A.; Bates, P. D.; Smith, J. A.
2017-12-01
Though the most notable floods in history have been carefully explained, there remains a lack of literature that explores the nature of record floods as a whole in the United Kingdom. We characterise the seasonality, statistical and spatial distribution, and meteorological causes of peak river flows for 521 gauging stations spread across the British Isles. We use annual maximum data from the National River Flow Archive, catchment descriptors from the Flood Estimation Handbook, and historical records of large floods. We aim to determine in what ways, if any, the record flood at a station differs from more 'typical' floods. For each station, we calculate two indices: the seasonal anomaly and the flood index. Broadly, the seasonal anomaly is the degree to which a station's record flood happens at a different time of year compared to typical floods at that site, whilst the flood index is a station's record flood discharge divided by the discharge of the 1-in-10-year return period event. We find that while annual maximum peaks are dominated by winter frontal rainfall, record floods are disproportionately caused by summer convective rainfall. This analysis also shows that the larger the seasonal anomaly, the higher the flood index. Additionally, stations across the country have record floods that occur in the summer with no notable spatial pattern, yet the most seasonally anomalous record events are concentrated around the south and west of the British Isles. Catchment descriptors tell us little about the flood index at a particular station, but generally areas with lower mean annual precipitation have a higher flood index. The inclusion of case studies from recent and historical examples of notable floods across the UK supplements our analysis and gives insight into how typical these events are, both statistically and meteorologically.
Ultimately, record floods in general happen at relatively unexpected times and with unpredictable magnitudes, which is a worrying reality for those who live in flood-prone areas, and to those who study the upper tail of flood events.
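The two station indices defined above lend themselves to a short sketch. The discharge values and flood dates below are hypothetical, and the circular-mean treatment of flood timing is an assumption about how a seasonal anomaly might be computed, not the authors' exact procedure:

```python
import math

def flood_index(record_peak, q10_peak):
    """Flood index: record flood discharge divided by the 1-in-10-year discharge."""
    return record_peak / q10_peak

def seasonal_anomaly(record_day_of_year, typical_days_of_year):
    """Circular distance (days) between the record flood's timing and the mean
    timing of typical annual-maximum floods, so December/January stay adjacent."""
    angles = [2 * math.pi * d / 365.25 for d in typical_days_of_year]
    mean_angle = math.atan2(sum(math.sin(a) for a in angles) / len(angles),
                            sum(math.cos(a) for a in angles) / len(angles))
    mean_day = (mean_angle * 365.25 / (2 * math.pi)) % 365.25
    diff = abs(record_day_of_year - mean_day) % 365.25
    return min(diff, 365.25 - diff)

# Hypothetical station: winter-dominated annual maxima, summer record flood.
print(flood_index(420.0, 180.0))                    # dimensionless ratio > 1
print(seasonal_anomaly(200, [5, 20, 350, 340, 15])) # large anomaly, in days
```

A high flood index paired with a large seasonal anomaly is exactly the summer-convective signature the study reports.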
Automated Processing Workflow for Ambient Seismic Recordings
NASA Astrophysics Data System (ADS)
Girard, A. J.; Shragge, J.
2017-12-01
Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface-wave energy. In a number of situations body-wave energy has been extracted successfully; however, nearly all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly to non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) and enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on often terabytes of ambient seismic data, which is expensive and requires automation to be feasible. In this work we outline an automated processing workflow designed to optimize body-wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong-amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation plus stack waveforms over hour-long windows.
Overall, the QC analyses suggest that automated preprocessing of ambient seismic recordings in the recording domain successfully mitigates unwanted coherent noise events in both the time and frequency domain. Accordingly, we assert that this method is beneficial for direct wave-equation imaging with ambient seismic recordings.
NASA Astrophysics Data System (ADS)
Bianchi, R. M.; Boudreau, J.; Konstantinidis, N.; Martyniuk, A. C.; Moyse, E.; Thomas, J.; Waugh, B. M.; Yallup, D. P.; ATLAS Collaboration
2017-10-01
In the early days, HEP experiments made use of photographic images both to record and store experimental data and to illustrate their findings. As the experiments evolved, they needed new ways to visualize their data. With the availability of computer graphics, software packages to display event data and the detector geometry started to be developed. Here, an overview of the usage of event display tools in HEP is presented. The case of the ATLAS experiment is then considered in more detail, and two widely used event display packages are presented, Atlantis and VP1, focusing on the software technologies they employ, as well as their strengths, differences and their usage in the experiment: from physics analysis to detector development, and from online monitoring to outreach and communication. Other ATLAS visualization tools are also briefly presented, and future development plans and improvements in the ATLAS event display packages are discussed.
NASA Astrophysics Data System (ADS)
Butler, Paul; Estrella-Martínez, Juan; Scourse, James
2017-04-01
The so-called 8.2K cold event is a rapid cooling of about 6° +/- 2° recorded in the Greenland ice cores and thought to be a consequence of a freshwater pulse from the Laurentide ice sheet which reduced deepwater formation in the North Atlantic. In the Greenland ice cores the event is characterized by a maximum extent of 159 years and a central event lasting for 70 years. As discussed by Thomas et al. (QSR, 2007), the low resolution and dating uncertainty of much palaeoclimate data make it difficult to determine the rates of change and the causal sequence that characterise the event at different locations. We present here a bivalve shell chronology based on four shells of Arctica islandica from the northern North Sea which (within radiocarbon uncertainty) is coeval with the 8.2K event recorded in the Greenland ice cores. The years of death of each shell, based on radiocarbon analysis and crossmatching, are 8094, 8134, 8147, and 8208 yrs BP (where "present" = AD 1950), with an associated radiocarbon uncertainty of +/-80 yrs; their longevities are 106, 122, 112 and 79 years respectively. The total length of the chronology is 192 years (8286 - 8094 BP +/- 80 yrs). The most noticeable feature of the chronology is a 60-year period of increasing growth which may correspond to a similar period of decreasing ice accumulation in the GRIP (central Greenland) ice core record. We tentatively suggest that this reflects increasing food supply to the benthos as summer stratification is weakened by colder seawater temperatures. Stable isotope analyses (results expected to be available when this abstract is presented) will show changes at annual and seasonal resolution, potentially giving a very detailed insight into the causal factors associated with the 8.2K event and its impact in the northern North Sea.
Nazarzadeh, Kimia; Arjunan, Sridhar P; Kumar, Dinesh K; Das, Debi Prasad
2016-08-01
In this study, we analyzed accelerometer data recorded during gait analysis of Parkinson's disease patients to detect freezing of gait (FOG) episodes. The proposed method filters the recordings to reduce noise in the leg movement signals and computes wavelet coefficients to detect FOG events. A publicly available FOG database was used, and the technique was evaluated using receiver operating characteristic (ROC) analysis. Results show a higher performance of the wavelet feature in discriminating FOG events from background activity when compared with the existing technique.
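A minimal sketch of the wavelet-energy idea, using a first-level Haar transform as a stand-in for whichever wavelet the authors used. The accelerometer windows and the decision threshold are invented for illustration; a real system would tune the threshold via the ROC analysis mentioned above:

```python
def haar_detail_energy(signal):
    """Energy of first-level Haar wavelet detail coefficients: a crude proxy
    for the high-frequency leg-tremor power used to flag FOG episodes."""
    details = [(signal[i] - signal[i + 1]) / 2 ** 0.5
               for i in range(0, len(signal) - 1, 2)]
    return sum(d * d for d in details)

def detect_fog(windows, threshold):
    """Flag windows whose wavelet detail energy exceeds the threshold."""
    return [haar_detail_energy(w) > threshold for w in windows]

# Hypothetical accelerometer windows: quiet standing vs. leg trembling.
quiet = [0.01, -0.01, 0.02, -0.02, 0.01, -0.01, 0.0, 0.01]
tremor = [0.5, -0.6, 0.7, -0.5, 0.6, -0.7, 0.5, -0.6]
print(detect_fog([quiet, tremor], threshold=0.1))
```

Sweeping the threshold and recording sensitivity/specificity at each value is what produces the ROC curve used for evaluation.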
NASA Astrophysics Data System (ADS)
Alberico, I.; Giliberti, I.; Insinga, D. D.; Petrosino, P.; Vallefuoco, M.; Lirer, F.; Bonomo, S.; Cascella, A.; Anzalone, E.; Barra, R.; Marsella, E.; Ferraro, L.
2017-06-01
Paleoclimatic data are essential for fingerprinting the climate of the earth before the advent of modern recording instruments. They enable us to recognize past climatic events and predict future trends. Within this framework, a conceptual and logical model was drawn up to physically implement a paleoclimatic database named WDB-Paleo, which includes paleoclimatic proxy data from marine sediment cores of the Mediterranean Basin. Twenty entities were defined to record four main categories of data: a) the features of oceanographic cruises and cores (metadata); b) the presence/absence of paleoclimatic proxies compiled from about 200 scientific papers; c) the quantitative analysis of planktonic and benthic foraminifera, pollen, calcareous nannoplankton, magnetic susceptibility, stable isotopes, and radionuclide values for about 14 cores recovered by the Institute for Coastal Marine Environment (IAMC) of the Italian National Research Council (CNR) in the framework of several past research projects; d) specific entities recording quantitative data on δ18O, AMS 14C (Accelerator Mass Spectrometry) and tephra layers available in scientific papers. Published data concerning paleoclimatic proxies in the Mediterranean Basin are recorded for only 400 out of 6000 cores retrieved in the area, and they show a very irregular geographical distribution. Moreover, data availability decreases when a constrained time interval is investigated or more than one proxy is required. We present three applications of WDB-Paleo to the Younger Dryas (YD) paleoclimatic event at the Mediterranean scale and point out the potential of this tool for integrated stratigraphy studies.
Earthquake Monitoring with the MyShake Global Smartphone Seismic Network
NASA Astrophysics Data System (ADS)
Inbal, A.; Kong, Q.; Allen, R. M.; Savran, W. H.
2017-12-01
Smartphone arrays have the potential to significantly improve seismic monitoring in sparsely instrumented urban areas. This approach benefits from the dense spatial coverage of users, as well as from the communication and computational capabilities built into smartphones, which facilitate big seismic data transfer and analysis. Advantages in data acquisition with smartphones trade off against factors such as the low-quality sensors installed in phones, high noise levels, and strong network heterogeneity, all of which limit effective seismic monitoring. Here we utilize network and array-processing schemes to assess event detectability with the MyShake global smartphone network. We examine the benefits of using this network in either triggered or continuous modes of operation. A global database of ground motions measured on stationary phones triggered by M2-6 events is used to establish detection probabilities. We find that the probability of detecting an M=3 event with a single phone located <10 km from the epicenter exceeds 70%. Due to the sensor's self-noise, smaller magnitude events at short epicentral distances are very difficult to detect. To increase the signal-to-noise ratio, we employ array back-projection techniques on continuous data recorded by thousands of phones. In this class of methods, the array is used as a spatial filter that suppresses signals emitted from shallow noise sources. Filtered traces are stacked to further enhance seismic signals from deep sources. We benchmark our technique against traditional location algorithms using recordings from California, a region with a large MyShake user database. We find that locations derived from back-projection images of M 3 events recorded by >20 nearby phones closely match the regional catalog locations. We use simulated broadband seismic data to examine how location uncertainties vary with user distribution and noise levels.
To this end, we have developed an empirical noise model for the metropolitan Los Angeles (LA) area. We find that densities larger than 100 stationary phones/km2 are required to accurately locate M 2 events in the LA basin. Given the projected MyShake user distribution, that condition may be met within the next few years.
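At its core, the back-projection step described above reduces to delay-and-stack: shift each trace by the travel-time delay predicted for a trial source location, then sum, so that a coherent source stacks constructively only at its true location. This sketch assumes integer-sample delays and synthetic single-pulse traces; it illustrates the principle, not the MyShake implementation:

```python
def shift_and_stack(traces, delays):
    """Align each trace by its predicted delay (in samples), then sum.
    A coherent source yields a large stack amplitude when delays are correct."""
    n = min(len(t) - d for t, d in zip(traces, delays))
    return [sum(t[d + i] for t, d in zip(traces, delays)) for i in range(n)]

# Hypothetical: the same pulse recorded on three phones with different lags.
pulse = [0.0, 1.0, 0.0]
traces = [[0.0] * lag + pulse + [0.0] * (6 - lag) for lag in (0, 1, 2)]

correct = shift_and_stack(traces, delays=[0, 1, 2])  # aligned: pulse stacks
wrong = shift_and_stack(traces, delays=[0, 0, 0])    # misaligned: pulse smears
print(max(correct), max(wrong))
```

Scanning the delay set over a grid of trial locations and mapping the peak stack amplitude is what produces the back-projection images compared against catalog locations.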
CMS Data Processing Workflows during an Extended Cosmic Ray Run
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2009-11-01
The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.
Investigating Holocene Glacial and Pluvial Events in the Sierra Nevada of California
NASA Astrophysics Data System (ADS)
Ashford, J.; Sickman, J. O.; Lucero, D. M.; Kirby, M.; Gray, A. B.
2016-12-01
Understanding interannual and decadal variation in snowfall and extreme hydrologic events in the Sierra Nevada is hampered by the short instrumental record and by the uncertainty introduced when extrapolating paleoclimate data from lower elevation systems to the alpine snow deposition zone. Longer paleo records from high elevation systems are necessary to provide a more accurate record of snow water content and extreme precipitation events over millennial timescales that can be used to test hypotheses regarding teleconnections between Pacific climate variability and water supply and flood risk in California. In October 2013 we collected sediment cores from Pear Lake, an alpine lake in Sequoia National Park. The cores were split and characterized by P-wave velocity, magnetic susceptibility and density scanning, along with grain-size analysis at 1-2 cm increments. Radiocarbon dates indicate that the Pear Lake cores contain a 13,500-year record of lake sediment. In contrast to other Sierra Nevada lakes previously cored by our group, high-resolution scanning revealed alternating fine-grained, light-dark bands (1 mm to 5 mm thick) for most of the Pear Lake core length. This pattern was interrupted at intervals by homogeneous clastic layers (up to 75 mm thick) ranging in grain size from sand to gravel up to 1 cm in diameter. The sand- to gravel-sized clasts are most likely associated with extreme precipitation events. Preliminary grain-size analysis results show evidence of isolated extreme hydrologic events and sections of increased event frequency, which we hypothesize are the result of atmospheric rivers intersecting the southern Sierra Nevada outside of the snow-covered period.
Video-Stimulated Recall in Cross-Cultural Research in Education: A Case Study in Vietnam
ERIC Educational Resources Information Center
Nguyen, Nga Thanh; Tangen, Donna
2017-01-01
This paper examines incorporating video-stimulated recall (VSR) as a data collection technique in cross-cultural research. With VSR, participants are invited to watch video-recordings of particular events that they are involved in; they then recall their thoughts in relation to their observations of their behaviour in relation to the event. The…
Sibling Negotiations and the Construction of Literacy Events in an Urban Area of Tanzania
ERIC Educational Resources Information Center
Frankenberg, Sofia Johnson; Holmqvist, Rolf; Rubenson, Birgitta; Rindstedt, Camilla
2012-01-01
This study presents findings from analyses of naturally occurring literacy events, where children jointly focus on reading and writing letters of the alphabet, illustrating social constructions of learning created through language and embodied action. Video recorded data from two different families living in an urban low-income area in Tanzania is…
Detection of explosive cough events in audio recordings by internal sound analysis.
Rocha, B M; Mendes, L; Couceiro, R; Henriques, J; Carvalho, P; Paiva, R P
2017-07-01
We present a new method for the discrimination of explosive cough events, based on a combination of spectral content descriptors and pitch-related features. After the removal of near-silent segments, a vector of event boundaries is obtained and a proposed set of 9 features is extracted for each event. Two data sets, recorded using electronic stethoscopes and comprising a total of 46 healthy subjects and 13 patients, were employed to evaluate the method. The proposed feature set is compared to three other sets of descriptors: a baseline, a combination of both sets, and an automatic selection of the best 10 features from both sets. The combined feature set yields good results on the cross-validated database, attaining a sensitivity of 92.3±2.3% and a specificity of 84.7±3.3%. Moreover, this feature set seems to generalize well when it is trained on a small data set of patients, with a variety of respiratory and cardiovascular diseases, and tested on a larger data set of mostly healthy subjects: a sensitivity of 93.4% and a specificity of 83.4% are achieved under those conditions. These results demonstrate that complementing the proposed feature set with a baseline set is a promising approach.
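The first stage of such a pipeline, removing near-silent segments and returning a vector of event boundaries, can be sketched with a simple short-time-energy detector. The frame length, threshold, and signal below are illustrative assumptions; the abstract does not specify the authors' actual segmentation:

```python
def frame_energies(samples, frame_len):
    """Short-time energy per non-overlapping frame."""
    return [sum(s * s for s in samples[i:i + frame_len])
            for i in range(0, len(samples) - frame_len + 1, frame_len)]

def event_boundaries(samples, frame_len, energy_thresh):
    """Return (start_frame, end_frame) pairs for runs of frames whose
    energy exceeds the near-silence threshold."""
    energies = frame_energies(samples, frame_len)
    events, start = [], None
    for i, e in enumerate(energies):
        if e > energy_thresh and start is None:
            start = i                       # event onset
        elif e <= energy_thresh and start is not None:
            events.append((start, i))       # event offset (end-exclusive)
            start = None
    if start is not None:
        events.append((start, len(energies)))
    return events

# Hypothetical stethoscope signal: silence, one explosive burst, silence.
sig = [0.0] * 8 + [0.9, -0.8, 0.7, -0.9] + [0.0] * 8
print(event_boundaries(sig, frame_len=4, energy_thresh=0.01))
```

Each detected segment would then feed the spectral and pitch-related feature extraction that drives the classifier.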
Analysis and visualization of single-trial event-related potentials
NASA Technical Reports Server (NTRS)
Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.
2001-01-01
In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. 
Applying ICA and ERP image visualization to the analysis of sets of single trials from event-related EEG (or MEG) experiments can increase the information available from ERP (or ERF) data. Copyright 2001 Wiley-Liss, Inc.
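The "ERP image" visualization described above, single-trial epochs sorted by a behavioral variable and smoothed across neighboring trials, can be sketched as follows. The moving-average smoother and the toy epochs are illustrative assumptions, not the exact procedure of the published toolbox:

```python
def erp_image(trials, sort_values, smooth=3):
    """Sort single-trial epochs by a behavioral variable (e.g. reaction time)
    and smooth each time point with a moving average over adjacent trials."""
    order = sorted(range(len(trials)), key=lambda i: sort_values[i])
    sorted_trials = [trials[i] for i in order]
    half = smooth // 2
    image = []
    for i in range(len(sorted_trials)):
        lo, hi = max(0, i - half), min(len(sorted_trials), i + half + 1)
        group = sorted_trials[lo:hi]
        image.append([sum(col) / len(group) for col in zip(*group)])
    return image

# Hypothetical: 4 single-trial epochs (3 samples each) and reaction times.
trials = [[0.0, 2.0, 0.0], [0.0, 4.0, 0.0], [0.0, 1.0, 0.0], [0.0, 3.0, 0.0]]
rts = [200, 400, 100, 300]
img = erp_image(trials, rts)
print([row[1] for row in img])   # smoothed peak column, trials ordered by RT
```

Averaging all rows of such an image recovers the conventional ERP, while the row-by-row view preserves the single-trial amplitude and latency variation the method is designed to expose.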
Development of a database and processing method for detecting hematotoxicity adverse drug events.
Shimai, Yoshie; Takeda, Toshihiro; Manabe, Shirou; Teramoto, Kei; Mihara, Naoki; Matsumura, Yasushi
2015-01-01
Adverse events are detected by monitoring the patient's status, including blood test results. However, it is difficult to identify all adverse events through recognition by individual doctors alone. We developed a system that detects hematotoxicity adverse events from blood test results recorded in an electronic medical record system. The blood test results were graded based on the Common Terminology Criteria for Adverse Events (CTCAE), and changes in the blood test results (Up, Down, Flat) were assessed according to the variation in the grade. The changes in the blood test and injection data were stored in a database. By comparing the injection date with the start and end dates of the change in the blood test results, adverse events related to a designated drug were detected. Using this method, we searched for the occurrence of serious adverse events (CTCAE Grade 3 or 4) concerning WBC, ALT and creatinine related to paclitaxel at Osaka University Hospital. The rates of occurrence of a decreased WBC count, increased ALT level and increased creatinine level were 36.0%, 0.6% and 0.4%, respectively. This method is useful for detecting and estimating the rate of occurrence of hematotoxicity adverse drug events.
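The grade-then-diff logic can be sketched as below. The WBC thresholds follow the general shape of the CTCAE "white blood cell decreased" grades but are a simplified illustration, and labeling a grade increase as "Up" is an assumption about how the system encodes change, not a detail given in the abstract:

```python
def grade_wbc(wbc, lln=4.0):
    """Illustrative CTCAE-style grading of WBC decrease (10^9 cells/L).
    Simplified sketch of the official table; lln = lower limit of normal."""
    if wbc >= lln:
        return 0
    if wbc >= 3.0:
        return 1
    if wbc >= 2.0:
        return 2
    if wbc >= 1.0:
        return 3
    return 4

def classify_change(prev_grade, curr_grade):
    """Label the movement between consecutive graded results."""
    if curr_grade > prev_grade:
        return "Up"      # toxicity worsening
    if curr_grade < prev_grade:
        return "Down"    # toxicity resolving
    return "Flat"

results = [5.2, 3.5, 1.8, 0.8, 2.5]          # hypothetical serial WBC values
grades = [grade_wbc(v) for v in results]
changes = [classify_change(a, b) for a, b in zip(grades, grades[1:])]
print(grades, changes)
```

An "Up" run that starts after an injection date and reaches Grade 3 or 4 is the pattern the system would attribute to the designated drug.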
Lake deposits record evidence of large post-1505 AD earthquakes in western Nepal
NASA Astrophysics Data System (ADS)
Ghazoui, Z.; Bertrand, S.; Vanneste, K.; Yokoyama, Y.; Van Der Beek, P.; Nomade, J.; Gajurel, A.
2016-12-01
According to historical records, the last large earthquake that ruptured the Main Frontal Thrust (MFT) in western Nepal occurred in 1505 AD. Since then, no evidence of other large earthquakes has been found in historical records or geological archives. In view of the catastrophic consequences for millions of inhabitants of Nepal and northern India, intense efforts currently focus on improving our understanding of past earthquake activity and on complementing the historical data on Himalayan earthquakes. Here we report a new record based on earthquake-triggered turbidites in lakes. We use lake sediment records from Lake Rara, western Nepal, to reconstruct the occurrence of seismic events. The sediment cores were studied using a multi-proxy approach combining radiocarbon and 210Pb chronologies, physical properties (X-ray computerized axial tomography scan, Geotek multi-sensor core logger), high-resolution grain size, inorganic geochemistry (major elements by ITRAX XRF core scanning) and bulk organic geochemistry (C, N concentrations and stable isotopes). We identified several sequences of dense, layered fine sand composed mainly of mica, which we interpret as earthquake-triggered turbidites. Our results suggest the presence of a synchronous event between the two lake sites correlated with the well-known 1505 AD earthquake. In addition, our sediment records reveal five earthquake-triggered turbidites younger than the 1505 AD event. By comparison with historical archives, we relate one of those to the 1833 AD MFT rupture. The others may reflect successive ruptures of the Western Nepal Fault System. Our study sheds light on events that have not been recorded in historical chronicles. Those five MMI>7 earthquakes allow us to address the problem of missing slip on the MFT in western Nepal and to reevaluate the risk of a large earthquake affecting western Nepal and northern India.
NASA Astrophysics Data System (ADS)
O'Connor, G.; Cobb, K. M.; Sayani, H. R.; Grothe, P. R.; Atwood, A. R.; Stevenson, S.; Hitt, N. T.; Lynch-Stieglitz, J.
2016-12-01
The El Niño/Southern Oscillation (ENSO) event of 2015/2016 was a record-breaking event in the central Pacific, driving profound changes in the properties of the ocean and atmosphere. Prolonged ocean warming of up to 3°C translated into a large-scale coral bleaching and mortality event on Christmas Island (2°N, 157°W) that very few individuals escaped unscathed. As part of a long-term, interdisciplinary monitoring effort underway since August 2014, we present results documenting the timing and magnitude of environmental changes on the Christmas Island reefs. In particular, we present the first coral geochemical time series spanning the last several years, using cores that were drilled from rare living coral colonies during a field expedition in April 2016, at the tail end of the event. These geochemical indicators are sensitive to ocean temperature, salinity, and water mass properties and have been used to quantitatively reconstruct ENSO extremes of the recent [Nurhati et al., 2011] and distant [Cobb et al., 2013] past. By analyzing multiple cores from both open ocean and lagoonal settings, we are able to undertake a quantitative comparison of this event with past very strong El Niño events contained in the coral archive, including the 1940/41, 1972/73, and 1997/98 events. For the most recent event, we compare our coral geochemistry records with a rich suite of in situ environmental data, including physical and geochemical parameters collected as part of the NOAA rapid response campaign in the central tropical Pacific. This unique dataset not only provides physical context for interpreting coral geochemical records from the central tropical Pacific, but also allows us to assess why the 2015/2016 El Niño event was so devastating to coral reef ecosystems in this region.
NASA Technical Reports Server (NTRS)
Levine, D. M.
1981-01-01
Ground-based data collected with lightning monitoring equipment operated by Goddard Space Flight Center at Wallops Island, Virginia, during a storm monitored by NASA's F-106B are presented. The slow electric field change data and RF radiation data were collected at the times the lightning monitoring equipment on the aircraft was triggered. The timing of the ground-based events correlates well with events recorded on the aircraft and provides an indication of the type of flash with which the aircraft was involved.
Study of hadronic event-shape variables in multijet final states in pp collisions at √s = 7 TeV
Khachatryan, V.
2014-10-14
Event-shape variables, which are sensitive to perturbative and nonperturbative aspects of quantum chromodynamic (QCD) interactions, are studied in multijet events recorded in proton-proton collisions at √s = 7 TeV. Events are selected with at least one jet with transverse momentum pT > 110 GeV and pseudorapidity |η| < 2.4, in a data sample corresponding to an integrated luminosity of up to 5 fb⁻¹. The distributions of five event-shape variables in various leading-jet pT ranges are compared to predictions from different QCD Monte Carlo event generators.
NASA Astrophysics Data System (ADS)
Garcia, S.; Karplus, M. S.; Farrell, J.; Lin, F. C.; Smith, R. B.
2017-12-01
A large seismic nodal array incorporating 133 three-component, 5-Hz geophones deployed for two weeks in early November 2015 in the Upper Geyser Basin recorded earthquake and hydrothermal activity. The University of Utah, the University of Texas at El Paso, and Yellowstone National Park collaborated to deploy Fairfield Nodal ZLand 3-C geophones concentrically centered around Old Faithful Geyser with an average station spacing of 50 m and an aperture of 1 km. The array provided a unique dataset to investigate wave propagation through the various fractures and active geysers of a hydrothermal field located over the Yellowstone hotspot. The complicated sub-surface features associated with the hydrothermal field appear to affect earthquake wave propagation in the Upper Geyser Basin and to generate seismic signals. Previous work using ambient noise cross-correlation has found an intricately fractured sub-surface that provides pathways for water beneath parts of the Upper Geyser Basin that likely feed Old Faithful and other nearby geysers and hot springs. For this study, we used the data to create visualizations of local earthquake, teleseismic earthquake, and hydrothermal events as they propagate through the array. These ground motion visualizations allow observation of wave propagation through the geyser field, which may indicate the presence of anomalous structure affecting seismic velocities and attenuation. Three teleseismic events were observed in the data: two Mw 6.9 earthquakes that occurred off the coast of Coquimbo, Chile, 9,000 km from the array, and one Mw 6.5 event near the Aleutian Islands, 4,500 km from the array. All three teleseismic events exhibited strong direct P-wave arrivals and several additional phases. One local earthquake (ML 2.5) 100 km from the Upper Geyser Basin was also well recorded by the array. Time-domain spectrograms show the dominant frequencies present in the recordings of these events.
The two Mw 6.9 earthquakes in Chile were one hour apart and offered interesting signals, including a geyser tremor between the two events.
Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras
NASA Astrophysics Data System (ADS)
Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.
2013-12-01
The objective of this work is the study of faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by distances of, on average, 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom version 9.1 high-speed camera, set to operate at a frame rate of 2,500 frames per second (with a Nikkor AF-S DX 18-55 mm 1:3.5-5.6 G lens in the stationary sensors and an AF-S ED 24 mm 1:1.4 lens in the mobile sensor). All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four days of manual RAMMER operation during the 2012 and 2013 campaigns. On Feb. 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes were registered by two cameras and 1 flash by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can introduce imprecision into the optical analysis; this work therefore aims to evaluate the effects of distance on this parameter with this preliminary data set.
In the cases that include the color camera, we analyzed the RGB (red, green, blue) channels and compared them with the data provided by the black-and-white cameras for the same event, along with the influence of these parameters on the luminous intensity of the flashes. In two peculiar cases, the data obtained at one site showed a stroke, some continuing current during the interstroke interval, and then a subsequent stroke; the other site, however, showed that the subsequent stroke was in fact an M-component, since the continuing current had not vanished after its parent stroke. These events would have received an ambiguous classification based only on visual analysis from a single high-speed camera, and they are analyzed in this work.
Methodology of Historical Flood Evaluation from Korean Historical Documents during AD 1392 to 1910
NASA Astrophysics Data System (ADS)
Cho, H. B.; Kim, H.; Noh, S.; Jang, C.
2007-12-01
Studies of extreme flood events face a critical shortage of historical data because modern systematic observations do not provide long time series. Historical documentary records can therefore be an important source of additional information on extreme flood events that occurred before instrumental observations began. For proper data mining, documentary records satisfying the following four conditions are preferred: (1) a sufficiently long time series; (2) official archives covering the whole Korean peninsula; (3) a sufficient number of records; and (4) detailed damage descriptions. The Annals of the Choson Dynasty span about 500 years and contain 511 flood records from the Choson Dynasty in Korea. According to the annals, flood damage records were most densely concentrated in the middle of the 17th century, and the largest damage to people and to residences occurred in 1739 and 1856, respectively. Another source is the Jeungbo-Munheonbigo, a taxonomic document organized by themes such as culture, social systems, and climate, which contains 79 flood damage records. An effective way to analyze these historical floods without water level data is to classify and categorize the flood damage records, because all records are written descriptively. Accordingly, 556 records are categorized into 10 items by flood damage type, and each categorized record is classified into three grades according to how quantitatively the damage is expressed. These grouping results are used to decide a reasonable period range from which detailed information can be extracted within the entire inspection period. In addition, a Historical Flood Evaluation Index (HFEI) can thereby be derived quantitatively and statistically to evaluate the magnitude of each historical flood. In this research, flood damage evaluation focuses mainly on damage to human beings and residences.
Degree ranges based on cumulative probability are also derived from the two damage inventories. The HFEI, computed with conditional weighting factors, is applied to every flood record and used to analyze the flood distribution in the annual series.
Periodicity in marine extinction events
NASA Technical Reports Server (NTRS)
Sepkoski, J. John, Jr.; Raup, David M.
1986-01-01
The periodicity of extinction events is examined in detail. In particular, the temporal distribution of specific, identifiable extinction events is analyzed. The nature and limitations of the data base on the global fossil record is discussed in order to establish limits of resolution in statistical analyses. Peaks in extinction intensity which appear to differ significantly from background levels are considered, and new analyses of the temporal distribution of these peaks are presented. Finally, some possible causes of periodicity and of interdependence among extinction events over the last quarter billion years of earth history are examined.
United States National seismograph network
Masse, R.P.; Filson, J.R.; Murphy, A.
1989-01-01
The USGS National Earthquake Information Center (NEIC) has planned and is developing a broadband digital seismograph network for the United States. The network will consist of approximately 150 seismograph stations distributed across the contiguous 48 states and across Alaska, Hawaii, Puerto Rico and the Virgin Islands. Data transmission will be via two-way satellite telemetry from the network sites to a central recording facility at the NEIC in Golden, Colorado. The design goal for the network is the on-scale recording by at least five well-distributed stations of any seismic event of magnitude 2.5 or greater in all areas of the United States except possibly part of Alaska. All event data from the network will be distributed to the scientific community on compact disc with read-only memory (CD-ROM).
Concorde noise-induced building vibrations, Sully Plantation - Report no. 2, Chantilly, Virginia
NASA Technical Reports Server (NTRS)
1976-01-01
Noise-induced building vibrations associated with Concorde operations were studied. The approach is to record the levels of induced vibrations and associated indoor/outdoor noise levels in selected homes, historic and other buildings near Dulles International Airport. Representative data are presented which were recorded at Sully Plantation, Chantilly, Virginia during the periods of May 20 through May 28, 1976, and June 14 through June 17, 1976. Recorded data provide relationships between the vibration levels of windows, walls, floors, and the noise associated with Concorde operations, other aircraft, and nonaircraft events. The results presented are drawn from the combined May-June data base which is considerably larger than the May data base covered. The levels of window, wall and floor vibratory response resulting from Concorde operations are higher than the vibratory levels associated with conventional aircraft. Furthermore, the vibratory responses of the windows are considerably higher than those of the walls and floors. The window response is higher for aircraft than recorded nonaircraft events and exhibits a linear response relationship with the overall sound pressure level. For a given sound pressure level, the Concorde may cause more vibration than a conventional aircraft due to spectral or other differences. However, the responses associated with Concorde appear to be much more dependent upon sound pressure level than spectral or other characteristics of the noise.
Magnitude Estimation for the 2011 Tohoku-Oki Earthquake Based on Ground Motion Prediction Equations
NASA Astrophysics Data System (ADS)
Eshaghi, Attieh; Tiampo, Kristy F.; Ghofrani, Hadi; Atkinson, Gail M.
2015-08-01
This study investigates whether real-time strong ground motion data from seismic stations could have been used to provide an accurate estimate of the magnitude of the 2011 Tohoku-Oki earthquake in Japan. Ultimately, such an estimate could be used as input data for a tsunami forecast and would lead to more robust earthquake and tsunami early warning. We collected the strong motion accelerograms recorded by borehole and free-field (surface) Kiban Kyoshin network stations that registered this mega-thrust earthquake in order to perform an off-line test to estimate the magnitude based on ground motion prediction equations (GMPEs). GMPEs for peak ground acceleration and peak ground velocity (PGV) from a previous study by Eshaghi et al. in the Bulletin of the Seismological Society of America 103 (2013), derived using events with moment magnitude M ≥ 5.0 from 1998-2010, were used to estimate the magnitude of this event. We developed new GMPEs using a more complete database (1998-2011), which added only 1 year but approximately twice as much data to the initial catalog (including important large events), to improve the determination of attenuation parameters and magnitude scaling. These new GMPEs were used to estimate the magnitude of the Tohoku-Oki event. The estimates obtained were compared with real-time magnitude estimates provided by the existing earthquake early warning system in Japan. Unlike the current operational magnitude estimation methods, our method did not saturate and can provide robust estimates of moment magnitude within ~100 s after earthquake onset for both catalogs. It was found that correcting for average shear-wave velocity in the uppermost 30 m (Vs30) improved the accuracy of magnitude estimates from surface recordings, particularly for magnitude estimates from PGV (Mpgv). The new GMPEs were also used to estimate the magnitude of all earthquakes in the new catalog with at least 20 records.
Results show that the magnitude estimate from PGV values using borehole recordings had the smallest standard deviation among the estimated magnitudes and produced more stable and robust magnitude estimates. This suggests that incorporating borehole strong ground-motion records immediately available after the occurrence of large earthquakes can provide robust and accurate magnitude estimation.
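The GMPE-based magnitude estimation described above amounts to inverting an attenuation relation for magnitude at each station and averaging. The sketch below illustrates only the mechanics: the functional form and the coefficients `c0`, `c1`, `c2` are hypothetical placeholders, not the actual Eshaghi et al. GMPEs, which include additional terms and were fit to the K-NET/KiK-net catalog.

```python
import numpy as np

def estimate_magnitude(pgv_obs, dist_km, c0=-4.5, c1=1.0, c2=-1.5):
    """Invert a simple hypothetical GMPE of the form
        log10(PGV) = c0 + c1*M + c2*log10(R)
    for magnitude at each station, then average the per-station
    estimates to obtain a network magnitude."""
    pgv_obs = np.asarray(pgv_obs, dtype=float)
    dist_km = np.asarray(dist_km, dtype=float)
    # per-station magnitude from the inverted relation
    m = (np.log10(pgv_obs) - c0 - c2 * np.log10(dist_km)) / c1
    return m.mean()
```

In a real-time setting this average would be recomputed as new stations report, which is why the estimate stabilizes within the first ~100 s after onset.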
Local earthquake interferometry of the IRIS Community Wavefield Experiment, Grant County, Oklahoma
NASA Astrophysics Data System (ADS)
Eddy, A. C.; Harder, S. H.
2017-12-01
The IRIS Community Wavefield Experiment was deployed in Grant County, located in north central Oklahoma, from June 21 to July 27, 2016. Data from all nodes were recorded at 250 samples per second between June 21 and July 20 along three lines. The main line was 12.5 km long, oriented east-west, and consisted of 129 nodes. The other two lines were 5.5 km long, oriented north-south, with 49 nodes each. During this time, approximately 150 earthquakes of magnitude 1.0 to 4.4 were recorded in the surrounding counties of Oklahoma and Kansas. Ideally, sources for local earthquake interferometry should be near-surface events that produce high frequency body waves. Unlike ambient noise seismic interferometry (ANSI), which uses days, weeks, or even months of continuously recorded seismic data, local earthquake interferometry uses only short segments (~2 min) of data. Interferometry in this case is based on the cross-correlation of body wave surface multiples, where the event source is translated to a reference station in the array, which acts as a virtual source. Multiples recorded between the reference station and all other stations can be cross-correlated to produce a clear seismic trace. This process will be repeated with every node acting as the reference station for all events. The resulting shot gather will then be processed and analyzed for quality and accuracy. Successful application of local earthquake interferometry will produce a crustal image with identifiable sedimentary and basement reflectors and possibly a Moho reflection. Economically, local earthquake interferometry could lower the time and resource cost of active and passive seismic surveys while improving subsurface image quality in urban settings or areas of limited access. The applications of this method can potentially be expanded with the inclusion of seismic events with a magnitude of 1.0 or lower.
Evidence for non-self-similarity of microearthquakes recorded at a Taiwan borehole seismometer array
NASA Astrophysics Data System (ADS)
Lin, Yen-Yu; Ma, Kuo-Fong; Kanamori, Hiroo; Song, Teh-Ru Alex; Lapusta, Nadia; Tsai, Victor C.
2016-08-01
We investigate the relationship between seismic moment M0 and source duration tw of microearthquakes by using high-quality seismic data recorded with a vertical borehole array installed in central Taiwan. We apply a waveform cross-correlation method to the three-component records and identify several event clusters with high waveform similarity, with event magnitudes ranging from 0.3 to 2.0. Three clusters—Clusters A, B and C—contain 11, 8 and 6 events with similar waveforms, respectively. To determine how M0 scales with tw, we remove path effects by using a path-averaged Q. The results indicate a nearly constant tw for events within each cluster, regardless of M0, with mean values of tw being 0.058, 0.056 and 0.034 s for Clusters A, B and C, respectively. Constant tw, independent of M0, violates the commonly used self-similar scaling relation M0 ∝ tw^3.
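The cluster-identification step described above, grouping events by waveform similarity, can be sketched with normalized cross-correlation and a greedy single-linkage grouping. This is an illustrative simplification with hypothetical function names and a hypothetical similarity threshold of 0.9; the study's actual processing operates on three-component records and uses its own clustering criteria.

```python
import numpy as np

def norm_xcorr_max(a, b):
    """Maximum normalized cross-correlation (Pearson-like, over all lags)
    between two equal-length traces."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.max(np.correlate(a, b, mode="full"))

def cluster_events(traces, threshold=0.9):
    """Greedy single-linkage grouping: an event joins the first cluster
    containing any member whose waveform similarity exceeds `threshold`."""
    clusters = []
    for i, tr in enumerate(traces):
        placed = False
        for cl in clusters:
            if any(norm_xcorr_max(tr, traces[j]) >= threshold for j in cl):
                cl.append(i)
                placed = True
                break
        if not placed:
            clusters.append([i])
    return clusters
```

Because the correlation is normalized, events with identical waveform shape but different amplitudes (i.e. different M0) fall into the same cluster, which is exactly the property the study exploits to compare durations at fixed source geometry.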
Modelling Z→ττ processes in ATLAS with τ-embedded Z→μμ data
Aad, G.
2015-09-15
We describe the concept, technical realisation and validation of a largely data-driven method to model events with Z→ττ decays. In Z→μμ events selected from proton-proton collision data recorded at √s=8 TeV with the ATLAS experiment at the LHC in 2012, the Z decay muons are replaced by τ leptons from simulated Z→ττ decays at the level of reconstructed tracks and calorimeter cells. The τ lepton kinematics are derived from the kinematics of the original muons. Thus, only the well-understood decays of the Z boson and τ leptons as well as the detector response to the τ decay products are obtained from simulation. All other aspects of the event, such as the Z boson and jet kinematics as well as effects from multiple interactions, are given by the actual data. This so-called τ-embedding method is particularly relevant for Higgs boson searches and analyses in ττ final states, where Z→ττ decays constitute a large irreducible background that cannot be obtained directly from data control samples. In this paper, we discuss the relevant concepts based on the implementation used in the ATLAS Standard Model H→ττ analysis of the full dataset recorded during 2011 and 2012.
NASA Astrophysics Data System (ADS)
Kanamatsu, T.
2006-12-01
The usefulness of paleointensity records from high-sedimentation-rate sequences for stratigraphic correlation has been demonstrated (e.g. Stoner et al., 1998; Laj et al., 2000; Stoner et al., 2000), because sediment geomagnetic paleointensity data make fine time correlation possible between cores in sediments older than the range of AMS 14C dating. As a further application of sediment paleointensity as a chronological tool, we examined the paleointensity record of a much slower sedimentation rate. The paleointensity record of a slower sedimentation sequence is expected to be smoothed by the filtering effect of post-depositional remanent magnetization, and thus to show a pattern that depends on the sedimentation rate (e.g. Guyodo and Channell, 2002). We studied the records of cores obtained from the West Philippine Sea Basin (water depth ca. 5000 to 6000 m). Analysis of paleomagnetic directions showed that the cores contain the Jaramillo and Olduvai events. The sedimentation rates estimated from magnetostratigraphy are less than 1 cm/kyr (0.4-0.6 cm/kyr). A paleointensity proxy (NRM20mT/ARM20mT) applied to the cores reveals that the variations in the records are dominated by a ca. 100-kyr cycle. Comparison with other published paleointensity records makes clear that the record preserves the ca. 100-kyr cycle in spite of the slower sedimentation rates, although higher-frequency features were not identified; this suggests that geomagnetic events lasting only a few to several kyr are hardly recordable in such sediment. Paleointensity in slow-sedimentation records is thus still useful for age control using the lower-frequency signal, especially for investigating sequences with little age information, such as deep-sea sediments below the CCD, but not for fine correlation using high-frequency features.
Toward an automated signature recognition toolkit for mission operations
NASA Technical Reports Server (NTRS)
Cleghorn, T.; Laird, P.; Perrine, L.; Culbert, C.; Macha, M.; Saul, R.; Hammen, D.; Moebes, T.; Shelton, R.
1994-01-01
Signature recognition is the problem of identifying an event or events from its time series. The generic problem has numerous applications to science and engineering. At NASA's Johnson Space Center, for example, mission control personnel, using electronic displays and strip chart recorders, monitor telemetry data from three-phase electrical buses on the Space Shuttle and maintain records of device activation and deactivation. Since few electrical devices have sensors to indicate their actual status, changes of state are inferred from characteristic current and voltage fluctuations. Controllers recognize these events both by examining the waveform signatures and by listening to audio channels between ground and crew. Recently the authors have developed a prototype system that identifies major electrical events from the telemetry and displays them on a workstation. Eventually the system will be able to identify accurately the signatures of over fifty distinct events in real time, while contending with noise, intermittent loss of signal, overlapping events, and other complications. This system is just one of many possible signature recognition applications in Mission Control. While much of the technology underlying these applications is the same, each application has unique data characteristics, and every control position has its own interface and performance requirements. There is a need, therefore, for CASE tools that can reduce the time to implement a running signature recognition application from months to weeks or days. This paper describes our work to date and our future plans.
Toward an automated signature recognition toolkit for mission operations
NASA Astrophysics Data System (ADS)
Cleghorn, T.; Laird, P.; Perrine, L.; Culbert, C.; Macha, M.; Saul, R.; Hammen, D.; Moebes, T.; Shelton, R.
1994-10-01
Signature recognition is the problem of identifying an event or events from its time series. The generic problem has numerous applications to science and engineering. At NASA's Johnson Space Center, for example, mission control personnel, using electronic displays and strip chart recorders, monitor telemetry data from three-phase electrical buses on the Space Shuttle and maintain records of device activation and deactivation. Since few electrical devices have sensors to indicate their actual status, changes of state are inferred from characteristic current and voltage fluctuations. Controllers recognize these events both by examining the waveform signatures and by listening to audio channels between ground and crew. Recently the authors have developed a prototype system that identifies major electrical events from the telemetry and displays them on a workstation. Eventually the system will be able to identify accurately the signatures of over fifty distinct events in real time, while contending with noise, intermittent loss of signal, overlapping events, and other complications. This system is just one of many possible signature recognition applications in Mission Control. While much of the technology underlying these applications is the same, each application has unique data characteristics, and every control position has its own interface and performance requirements. There is a need, therefore, for CASE tools that can reduce the time to implement a running signature recognition application from months to weeks or days. This paper describes our work to date and our future plans.
[Validation of an adverse event reporting system in primary care].
de Lourdes Rojas-Armadillo, María; Jiménez-Báez, María Valeria; Chávez-Hernández, María Margarita; González-Fondón, Araceli
2016-01-01
Patient safety is a priority issue in health systems, owing to the costs of harm, institutional weakening, loss of credibility, and frustration among those who committed an error that resulted in an adverse event. There is no standardized instrument for recording, reporting, and analyzing sentinel or adverse events (AE) in primary care. Our aim was to design and validate a surveillance system for recording sentinel events, adverse events, and near-miss incidents in primary care. We reviewed systems for recording and reporting adverse events in primary care. We then proposed an instrument to record these events and to register faults in structure and process in primary health care units of the Instituto Mexicano del Seguro Social. We showed the VENCER-MF format to 35 subjects. Of them, 100% identified a failure in the care process, 90% recorded a sentinel event, 85% identified the cause of this event, and 75% suggested measures for avoiding the recurrence of adverse events. Cronbach's alpha was 0.6 (p = 0.03). The VENCER-MF instrument has good consistency for the identification of adverse events.
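The internal-consistency statistic reported above, Cronbach's alpha, is computed from an item-response matrix with the standard formula α = k/(k−1)·(1 − Σσᵢ²/σₜ²), where k is the number of items, σᵢ² the per-item variances, and σₜ² the variance of the total scores. The sketch below uses synthetic data, since the VENCER-MF responses themselves are not reproduced in the abstract.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    # sum of per-item sample variances
    item_var = scores.var(axis=0, ddof=1).sum()
    # variance of each respondent's total score
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)
```

Perfectly consistent items give α = 1, uncorrelated items drive α toward 0, and a value of 0.6 (as reported for VENCER-MF) sits at the conventional lower bound for exploratory instruments.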
NASA Astrophysics Data System (ADS)
Carolin, S.; Walker, R. T.; Henderson, G. M.; Maxfield, L.; Ersek, V.; Sloan, A.; Talebian, M.; Fattahi, M.; Nezamdoust, J.
2015-12-01
The influence of climate on the growth and development of ancient civilizations throughout the Holocene remains a topic of heated debate. The 4.2 ka BP global-scale mid-to-low latitude aridification event (Walker et al., 2012) in particular has incited various correlation proposals. Some authors suggest that this event may have led to the collapse of the Akkadian empire in Mesopotamia, one of the first empires in human history, as well as to changes among other Early Bronze Age societies dependent on cereal agriculture (e.g. Staubwasser and Weiss, 2006). Other authors remain doubtful of the impact of environmental factors on the collapse of past societies (e.g. Middleton, 2012). While the coincident timing of an environmental event with archeological evidence does not establish causation, a comprehensive understanding of climate variability in the ancient Near East is nonetheless an essential component of resolving the full history of early human settlements. Paleoclimate data on the Central Iranian Plateau, a region rich in ancient history, are exceptionally sparse compared to other areas. Many karst locations are found throughout the region, however, setting the stage for the development of several high-resolution, accurate, and precisely dated climate proxy records if a correlation between the chemistry of semi-arid speleothem samples and climate can be resolved. Here we present a 5.1-3.7 ka BP record of decadal-scale stalagmite stable isotope and trace metal variability. The stalagmite was collected in Gol-e zard cave (35.8°N, 52.0°E), ~100 km NE of Tehran on the southern flank of the Alborz mountain range (2530 m a.s.l.). The area currently receives ~270 mm mean annual precipitation, with more than 90% falling within the wet season (November-May). We use GNIP data from Tehran and local and regional meteorological data to resolve the large-scale mechanisms forcing isotopic variations in rainwater over Gol-e zard cave.
We discuss possible transformation of water isotopes during transition through the karst aquifer based on site properties and simple model experiments. Finally, we discuss the timing and magnitude of significant events in the stable isotope and trace metal records, particularly in relation to the 4.2 ka BP drought event apparent in certain other regional climate records.
Continuous robust sound event classification using time-frequency features and deep learning
Song, Yan; Xiao, Wei; Phan, Huy
2017-01-01
The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-word conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification. PMID:28892478
Continuous robust sound event classification using time-frequency features and deep learning.
McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy
2017-01-01
The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-word conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification.
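The energy-based event detection front end described in the abstract can be sketched as short-time energy thresholding followed by merging of consecutive active frames into segments. The frame length, hop size, and threshold below are illustrative assumptions, not the paper's settings, and the sketch assumes the input contains at least some non-silent audio.

```python
import numpy as np

def detect_events(signal, frame_len=256, hop=128, threshold_db=-30.0):
    """Energy-based detector: marks frames whose short-time energy exceeds
    a threshold relative to the loudest frame, and merges consecutive
    active frames into (start_sample, end_sample) segments."""
    energies = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        energies.append(np.sum(signal[start:start + frame_len] ** 2))
    energy = np.asarray(energies)
    ref = energy.max()  # assumes the recording is not all silence
    active = 10.0 * np.log10(energy / ref + 1e-12) > threshold_db
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segments.append((start * hop, i * hop + frame_len))
            start = None
    if start is not None:  # event still active at end of signal
        segments.append((start * hop, len(signal)))
    return segments
```

Each returned segment would then be passed to the classifier, which is how an isolated-sound classifier is adapted to continuous recordings.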
Random variability explains apparent global clustering of large earthquakes
Michael, A.J.
2011-01-01
The occurrence of 5 Mw ≥ 8.5 earthquakes since 2004 has created a debate over whether or not we are in a global cluster of large earthquakes, temporarily raising risks above long-term levels. I use three classes of statistical tests to determine if the record of M ≥ 7 earthquakes since 1900 can reject a null hypothesis of independent random events with a constant rate plus localized aftershock sequences. The data cannot reject this null hypothesis. Thus, the temporal distribution of large global earthquakes is well-described by a random process, plus localized aftershocks, and apparent clustering is due to random variability. Therefore the risk of future events has not increased, except within ongoing aftershock sequences, and should be estimated from the longest possible record of events.
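Michael's paper applies three classes of statistical tests; the sketch below is one simple illustrative Monte Carlo variant of the general idea, comparing the variance of windowed event counts against simulations of a homogeneous Poisson process, and is not the paper's actual procedure. The function name, window count, and simulation count are all assumptions.

```python
import numpy as np

def poisson_cluster_test(event_times, t_span, n_sims=2000, seed=0):
    """Monte Carlo test of temporal clustering: compares the variance of
    event counts in fixed windows against simulations of a homogeneous
    Poisson process with the same number of events. Returns the fraction
    of simulations whose count variance is at least the observed one
    (a one-sided p-value; small values suggest clustering)."""
    rng = np.random.default_rng(seed)
    n_windows = 20
    edges = np.linspace(0.0, t_span, n_windows + 1)
    obs_var = np.histogram(event_times, bins=edges)[0].var()
    n = len(event_times)
    sim_vars = np.empty(n_sims)
    for i in range(n_sims):
        # conditioned on n events, a homogeneous Poisson process places
        # them uniformly over the interval
        sim = rng.uniform(0.0, t_span, size=n)
        sim_vars[i] = np.histogram(sim, bins=edges)[0].var()
    return (sim_vars >= obs_var).mean()
```

A large p-value, as the paper finds for the global M ≥ 7 record (after accounting for aftershocks), means the data cannot reject the random-occurrence null hypothesis.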
NASA Technical Reports Server (NTRS)
Finger, Herbert; Weeks, Bill
1985-01-01
This presentation discusses instrumentation that will be used for a specific event, which we hope will carry on to future events within the Space Shuttle program. The experiment is the Autogenic Feedback Training Experiment (AFTE) on Spacelab 3, currently scheduled to be launched in November 1984. The objectives of the AFTE are to determine the effectiveness of autogenic feedback in preventing or reducing space adaptation syndrome (SAS), to monitor and record in-flight data from the crew, to determine whether prediction criteria for SAS can be established, and, finally, to develop an ambulatory instrument package mounted on the crew throughout the mission. The purpose of the Ambulatory Feedback System (AFS) is to record the responses of the subject during a provocative event in space and provide a real-time feedback display to reinforce the training.
NASA Astrophysics Data System (ADS)
Lauterbach, S.; Andersen, N.; Brauer, A.; Erlenkeuser, H.; Danielopol, D. L.; Namiotko, T.; Huels, M.; Belmecheri, S.; Nantke, C.; Meyer, H.; Chapligin, B.; von Grafenstein, U.
2015-12-01
As evidenced by numerous palaeoclimate records worldwide, the Holocene warm period has been interrupted by several short, low-amplitude cold episodes. Among these, the so-called 8.2 ka cold event is the most prominent Holocene climate perturbation but despite extensive studies, knowledge about its synchrony in different areas and particularly about the dynamics of subsequent climate recovery is still limited. As this is of crucial importance for understanding the complex mechanisms that trigger rapid climate fluctuations and for testing the performance of climate models, new data on the 8.2 ka cold event are needed. Here we present a new sub-decadally resolved, precisely dated oxygen isotope (δ18O) record for the interval 7.7-8.7 ka BP obtained from benthic ostracods preserved in the varved lake sediments of Mondsee (Austria), providing new insights into climate development around the 8.2 ka cold event in Central Europe. The new high-resolution δ18O data set reveals the occurrence of a pronounced cold spell around 8.2 ka BP, whose amplitude (~1.0 ‰, equivalent to a 1.5-2.0 °C cooling), total duration (151 a) and absolute dating (8231-8080 a BP, i.e. calendar years before AD 1950) perfectly agree with results from other Northern Hemisphere palaeoclimate archives, e.g. the precisely dated Greenland ice cores. In addition, the Mondsee δ18O record also indicates a 75-year-long air temperature overshoot of ~0.7 °C directly after the 8.2 ka event (between 8080 and 8005 a BP), which is so far only poorly documented in the mid-latitudes. However, this observation is consistent with results from coupled climate models and high-latitude proxy records, thus likely reflecting a hemispheric-scale climate signal driven by enhanced resumption of the Atlantic meridional overturning circulation (AMOC), which apparently also caused synchronous migrations of atmospheric and oceanic front systems in the North Atlantic realm.
Epinephrine syringe exchange events in a paediatric cardiovascular ICU: analysing the storm.
Achuff, Barbara-Jo; Achuff, Jameson C; Park, Hwan H; Moffett, Brady; Acosta, Sebastian; Rusin, Craig G; Checchia, Paul A
2018-03-01
Introduction Haemodynamically unstable patients can experience potentially hazardous changes in vital signs related to the exchange of depleted syringes of epinephrine for full syringes. The purpose of this study was to determine the measured effects of epinephrine syringe exchanges on the magnitude, duration, and frequency of haemodynamic disturbances in the hour after an exchange event (study) relative to the hours before (control). Materials and methods Beat-to-beat vital signs recorded every 2 seconds from bedside monitors for patients admitted to the paediatric cardiovascular ICU of Texas Children's Hospital were collected between 1 January, 2013 and 30 June, 2015. Epinephrine syringe exchanges without dose/flow change were obtained from electronic records. Time, magnitude, and duration of changes in systolic blood pressure and heart rate were characterised using Matlab. Significant haemodynamic events were identified and compared with control data. Results In all, 1042 syringe exchange events were found and 850 (81.6%) had uncorrupted data for analysis. A total of 744 (87.5%) exchanges had at least 1 associated haemodynamic perturbation, including 2958 systolic blood pressure and 1747 heart-rate changes. Of heart-rate perturbations, 37% occurred before exchange and 63% after exchange; likewise, 37% of systolic blood pressure perturbations happened before syringe exchange and 63% after, with significant differences found in systolic blood pressure frequency (p<0.001), duration (p<0.001), and amplitude (p<0.001) compared with control data. Conclusions This novel data collection and signal-processing analysis showed a significant increase in frequency, duration, and magnitude of systolic blood pressure perturbations surrounding epinephrine syringe exchange events.
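Perturbation detection on a beat-to-beat vital-sign trace can be sketched as deviation from a rolling baseline. The window length and threshold below are hypothetical; the study's actual Matlab criteria are not specified in the abstract.

```python
import numpy as np

def find_perturbations(signal, baseline_window=150, threshold_frac=0.10):
    """Flag samples that deviate by more than threshold_frac from a rolling
    median baseline (hypothetical criteria, illustrative only)."""
    n = len(signal)
    flags = np.zeros(n, dtype=bool)
    for i in range(baseline_window, n):
        # baseline = median of the preceding window (robust to brief spikes)
        baseline = np.median(signal[i - baseline_window:i])
        flags[i] = abs(signal[i] - baseline) > threshold_frac * baseline
    return flags
```

With 2-second sampling, a window of 150 samples corresponds to a 5-minute baseline; runs of flagged samples would then be grouped into events and their magnitude and duration tabulated.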
Recording automotive crash event data
DOT National Transportation Integrated Search
2001-01-01
The National Transportation Safety Board has recommended that automobile manufacturers and the National Highway Traffic Safety Administration work cooperatively to gather information on automotive crashes using on-board collision sensing and recordin...
Bailey, C; Chakravarthy, U; Lotery, A; Menon, G; Talks, J; Bailey, Clare; Kamal, Aintree; Ghanchi, Faruque; Khan, Calderdale; Johnston, Robert; McKibbin, Martin; Varma, Atul; Mustaq, Bushra; Brand, Christopher; Talks, James; Glover, Nick
2017-01-01
Aims To compare safety outcomes and visual function data acquired in the real-world setting with FAME study results in eyes treated with 0.2 μg/day fluocinolone acetonide (FAc). Methods Fourteen UK clinical sites contributed pseudoanonymised data collected using the same electronic medical record system. Data pertaining to eyes treated with the FAc implant for diabetic macular oedema (DMO) were extracted. Intraocular pressure (IOP)-related adverse events were defined as use of IOP-lowering medication, any rise in IOP >30 mm Hg, or glaucoma surgery. Other measured outcomes included visual acuity, central subfield thickness (CSFT) changes and use of concomitant medications. Results In total, 345 eyes had a mean follow-up of 428 days. Overall, 13.9% of patients required IOP-lowering drops (including initiation, addition and switching of current drops), 7.2% had IOP elevation >30 mm Hg and 0.3% required glaucoma surgery. In patients with prior steroid exposure and no prior IOP-related event, there were no new IOP-related events. In patients without prior steroid use and without prior IOP-related events, 10.3% of eyes required IOP-lowering medication and 4.3% exhibited IOP >30 mm Hg at some point during follow-up. At 24 months, mean best-recorded visual acuity increased from 51.9 to 57.2 letters and 20.8% achieved a ≥15-letter improvement. Mean CSFT reduced from 451.2 to 355.5 μm. Conclusions While overall IOP-related emergent events were observed at a frequency similar to that in FAME, no adverse events were seen in the subgroup with prior steroid exposure and no prior IOP events. Efficacy findings confirm that the FAc implant is a useful treatment option for chronic DMO. PMID:28737758
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1996-01-01
The purpose of the propagation studies within the ACTS Project Office is to acquire 20 and 30 GHz rain fade statistics using the ACTS beacon links received at the NGS (NASA Ground Station) in Cleveland. Other than the raw, statistically unprocessed rain fade events that occur in real time, the relevant rain fade statistics derived from such events are the cumulative rain fade statistics as well as fade duration statistics (beyond given fade thresholds) over monthly and yearly time intervals. Concurrent with the data logging exercise, monthly maximum rainfall levels recorded by the US Weather Service at Hopkins Airport are appended to the database to facilitate comparison of observed fade statistics with those predicted by the ACTS Rain Attenuation Model. Also, the raw fade data will be provided in a documented format for use by other investigators who require realistic fade-event evolution in time for simulation purposes, or for further comparison with other rain fade prediction models. The raw time series data from the 20 and 30 GHz beacon signals is purged of irrelevant data intervals where no rain fading has occurred. All other data intervals, which contain rain fade events, are archived with the accompanying time stamps. The definition of what constitutes a rain fade event will be discussed later. The archived data serves two purposes. First, all rain fade event data is recombined into a contiguous data series every month and every year; this represents an uninterrupted record of the actual (i.e., not statistically processed) temporal evolution of rain fade at 20 and 30 GHz at the location of the NGS. The second purpose of the data in such a format is to enable a statistical analysis of prevailing propagation parameters, such as cumulative distributions of attenuation on a monthly and yearly basis, as well as fade duration probabilities below given fade thresholds, also on a monthly and yearly basis.
In addition, various subsidiary statistics such as attenuation rate probabilities are derived. The purged raw rain fade data as well as the results of the analyzed data will be made available for use by parties in the private sector upon their request. The process which will be followed in this dissemination is outlined in this paper.
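The fade-duration statistics described above reduce to finding contiguous runs of attenuation beyond a threshold. A minimal sketch (illustrative only, not the ACTS processing code):

```python
import numpy as np

def fade_duration_stats(attenuation_db, dt, threshold_db):
    """Return the durations (in seconds) of contiguous intervals during
    which attenuation exceeds a given fade threshold."""
    above = attenuation_db > threshold_db
    # locate run boundaries from the sign changes of the boolean mask
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if above[0]:                       # fade already in progress at start
        starts = np.r_[0, starts]
    if above[-1]:                      # fade still in progress at end
        ends = np.r_[ends, len(above)]
    return (ends - starts) * dt
```

From these durations, the monthly or yearly fade-duration statistics beyond each threshold follow directly, e.g. as a histogram or empirical distribution.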
Barragan, A A; Workman, J D; Bas, S; Proudfoot, K L; Schuenemann, G M
2016-07-01
The objectives of the present study were to assess (1) the effectiveness of a calving training workshop and an application (app) for touchscreen devices to capture calving-related events, and (2) personnel compliance with calving protocols (time from birth to feeding of first colostrum and time that cows spent in labor). Calving personnel (n=23) from 5 large dairy farms (range: 800-10,000 cows) participated in the study. Participants received training through an on-farm workshop regarding calving management practices and functioning of the app before recording calving-related events. Pre- and posttest evaluations were administered to each participant to measure their knowledge gain and satisfaction with the workshop. Calving personnel recorded calving-related events (n=323) using the app for 7 d following training. Furthermore, the records collected with the app were used to assess missing and incorrect data and calving personnel compliance with calving management protocols (recording time that cows spent in labor and timing of feeding first colostrum to calves). Calving personnel reported that the information provided during the training was relevant (agree=14.3% and strongly agree=85.7%) and of great immediate use (agree=33.3% and strongly agree=66.7%). The presented materials and hands-on demonstrations substantially increased the knowledge level of the attendees (by 23.7 percentage points from pre- to posttest scores). The follow-up assessment with participants revealed that the app was easy to use (91.3%) and that they would continue to use it (100%). Frequency of incorrect (r=0.77) or missing (r=0.76) data was positively correlated with calving:personnel ratio. Furthermore, calving personnel compliance with calving protocols was significantly different within and between herds. These results substantiated the great variation in compliance with calving management protocols within and between dairy farms. 
Furthermore, the app may serve as a tool to monitor personnel compliance with first feeding of colostrum to calves and their awareness and recognition of amount of time that each cow spent in labor. This would allow decision-makers to adjust, reassign tasks, or plan the management according to actual calving rate to improve the overall quality of data (frequency of incorrect and missing data) and calf welfare (survival and performance). Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Palmieri, L.; Barchielli, A.; Cesana, G.; de Campora, E.; Goldoni, C.A.; Spolaore, P.; Uguccioni, M.; Vancheri, F.; Vanuzzo, D.; Ciccarelli, P.; Giampaoli, S.
2007-01-01
Background The Italian register of cardiovascular diseases is a surveillance system of fatal and nonfatal cardiovascular events in the general population aged 35–74 years. It was launched in Italy at the end of the 1990s with the aim of periodically estimating the occurrence and case fatality rate of coronary and cerebrovascular events in the different geographical areas of the country. This paper presents data for cerebrovascular events. Methods Current events were assessed through record linkage between two sources of information: death certificates and hospital discharge diagnosis records. Events were identified through the ICD codes and duration. To calculate the number of estimated events, current events were multiplied by the positive predictive value of each specific mortality or discharge code derived from the validation of a sample of suspected events. Attack rates were calculated by dividing estimated events by the resident population, and the case fatality rate at 28 days was determined from the ratio of estimated fatal to total events. Results Attack rates were found to be higher in men than in women: the mean age-standardized attack rate was 21.9/10,000 in men and 12.5/10,000 in women; the age-standardized 28-day case fatality rate was higher in women (17.1%) than in men (14.5%). Significant geographical differences were found in attack rates of both men and women. Case fatality was significantly heterogeneous in both men and women. Conclusions Differences still exist in the geographical distribution of attack and case fatality rates of cerebrovascular events, regardless of the north-south gradient. These data show the feasibility of implementing a population-based register using a validated routine database, necessary for monitoring cardiovascular diseases. PMID:17971632
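The register's arithmetic, as described in the Methods, can be sketched directly. The numbers in the example are illustrative, not the paper's raw counts:

```python
def estimated_events(current_events, ppv):
    # suspected events scaled by the positive predictive value
    # of the specific mortality or discharge code
    return current_events * ppv

def attack_rate_per_10000(events, population):
    # estimated events divided by the resident population
    return 10_000 * events / population

def case_fatality_28d_percent(fatal_events, total_events):
    # ratio of estimated fatal to total events, as a percentage
    return 100 * fatal_events / total_events

# e.g. 3000 suspected events with a code PPV of 0.73
events = estimated_events(3000, 0.73)   # about 2190 estimated events
```

Age standardization of the resulting rates would be applied on top of this, stratum by stratum.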
The Taranaki daylight fireball, 1999 July 7
NASA Astrophysics Data System (ADS)
McCormick, Jennie
2006-10-01
The New Zealand Taranaki Daylight Fireball was observed on 1999 July 7 from various areas across the North and South Islands of New Zealand and had an apparent magnitude brighter than -20. The event produced more than one hundred handwritten reports, drawings, and paintings from eyewitnesses; video and audio recordings; seismic trace data; and confirmation of detection by United States Defense Department satellites. A detailed case study based on these data shows that observations by the public are invaluable when compiling a formal history of such events.
2007-09-01
The data are recorded at depth (1–5 km) by arrays of three-component geophones operated by AngloGold Ashanti, Ltd. and Integrated Seismic Systems... case-based event identification using regional arrays, Bull. Seism. Soc. Am. 80: 1874–1892. Bennett, T. J. and J. R. Murphy, Analysis of seismic... seismic event classification at the NORESS array: seismological measurements and the use of trained neural networks, Bull. Seism. Soc. Am. 80: 1910
NASA Astrophysics Data System (ADS)
Xie, Ruifang C.; Marcantonio, Franco; Schmidt, Matthew W.
2012-09-01
Understanding intermediate water circulation across the last deglacial is critical in assessing the role of oceanic heat transport associated with Atlantic Meridional Overturning Circulation variability across abrupt climate events. However, the links between intermediate water circulation and abrupt climate events such as the Younger Dryas (YD) and Heinrich Event 1 (H1) are still poorly constrained. Here, we reconstruct changes in Antarctic Intermediate Water (AAIW) circulation in the subtropical North Atlantic over the past 25 kyr by measuring authigenic neodymium isotope ratios in sediments from two sites in the Florida Straits. Our authigenic Nd isotope records suggest that there was little to no penetration of AAIW into the subtropical North Atlantic during the YD and H1. Variations in the northward penetration of AAIW into the Florida Straits documented in our authigenic Nd isotope record are synchronous with multiple climatic archives, including the Greenland ice core δ18O record, the Cariaco Basin atmosphere Δ14C reconstruction, the Bermuda Rise sedimentary Pa/Th record, and nutrient and stable isotope data from the tropical North Atlantic. The synchroneity of our Nd records with multiple climatic archives suggests a tight connection between AAIW variability and high-latitude North Atlantic climate change.
Grocke, D.R.; Ludvigson, Greg A.; Witzke, B.L.; Robinson, S.A.; Joeckel, R.M.; Ufnar, David F.; Ravn, R.L.
2006-01-01
Analysis of bulk sedimentary organic matter and charcoal from an Albian-Cenomanian fluvial-estuarine succession (Dakota Formation) at Rose Creek Pit (RCP), Nebraska, reveals a negative excursion of ~3‰ in late Albian strata. Overlying Cenomanian strata have δ13C values of −24‰ to −23‰ that are similar to pre-excursion values. The absence of an intervening positive excursion (as exists in marine records of the Albian-Cenomanian boundary) likely results from a depositional hiatus. The corresponding positive δ13C event and proposed depositional hiatus are concordant with a regionally identified sequence boundary in the Dakota Formation (D2), as well as a major regressive phase throughout the globe at the Albian-Cenomanian boundary. Data from RCP confirm suggestions that some positive carbon-isotope excursions in the geologic record are coincident with regressive sea-level phases. We estimate using isotopic correlation that the D2 sequence boundary at RCP was on the order of 0.5 m.y. in duration. Therefore, interpretations of isotopic events and associated environmental phenomena, such as oceanic anoxic events, in the shallow-marine and terrestrial record may be influenced by stratigraphic incompleteness. Further investigation of terrestrial δ13C records may be useful in recognizing and constraining sea-level changes in the geologic record. © 2006 Geological Society of America.
Challenges in Microseismic Monitoring of Hydraulic Fracturing
NASA Astrophysics Data System (ADS)
Venkataraman, A.; Li, R.
2011-12-01
To enhance well productivity, hydraulic fractures are stimulated by injecting fluid and/or gas with proppant into the rock matrix. This results in stress perturbations that induce fractures in the formation, releasing minor amounts of seismic energy as microseismic events. Microseismicity can be recorded by properly positioned geophones and is one of the indirect methods that allow us to determine the actual volume of rock that was impacted during and after hydraulic fracturing. Specifically, microseismic data is acquired during hydro-fracture treatments to validate and assist completions, assist in placing wells in the formation, identify frac barriers, and illuminate faults and potential fault re-activation. In the industry, microseismic data is acquired using geophones deployed in borehole and/or surface arrays. Borehole arrays are more traditional and have been used for nearly 20 years. Event location using borehole data is fairly robust, but azimuth and aperture are limited. Moreover, having dedicated boreholes can be expensive. The newer method of acquiring data is the use of geophones deployed on the surface or in shallow boreholes. Since microseismic events are very small (magnitudes of about -4 to -0.5), surface records have weak P and S arrivals that are buried in the noise, and traditional event location methods, which use arrival time picks, cannot be used. Migration-based approaches, which rely on the power of stacking waveforms, are the common alternative. However, poor signal-to-noise data and polarity variations in the seismic waves generated by micro-earthquakes can result in uncertainty in event location. In this paper, we will discuss the pros and cons of both arrays, the status of the technology, its limitations and challenges. Specifically, we will focus on applications where industry-academic collaborations could lead to step changes in our understanding of the controls on microseismicity.
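The migration-style stacking idea can be illustrated with a toy delay-and-stack grid search over candidate source positions. The 1-D geometry, constant velocity, and impulsive arrivals are all simplifying assumptions for illustration, not the production workflow:

```python
import numpy as np

def stack_power(traces, dt, station_x, candidate_x, v):
    """Delay-and-stack over candidate 1-D source positions: remove the
    predicted moveout for each candidate and measure the stack power."""
    n_samp = traces.shape[1]
    powers = []
    for xc in candidate_x:
        delays = np.abs(station_x - xc) / v        # predicted travel times (s)
        shifts = np.round(delays / dt).astype(int)
        stack = np.zeros(n_samp)
        for trace, s in zip(traces, shifts):
            if s < n_samp:
                stack[: n_samp - s] += trace[s:]   # align by removing moveout
        powers.append(np.sum(stack ** 2))
    return np.array(powers)
```

The candidate whose predicted moveout best aligns the weak arrivals maximizes the stack power, which is why the method tolerates arrivals too weak to pick individually; polarity variations, however, can cause destructive stacking unless corrected for.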
Near-simultaneous great earthquakes at Tongan megathrust and outer rise in September 2009.
Beavan, J; Wang, X; Holden, C; Wilson, K; Power, W; Prasetya, G; Bevis, M; Kautoke, R
2010-08-19
The Earth's largest earthquakes and tsunamis are usually caused by thrust-faulting earthquakes on the shallow part of the subduction interface between two tectonic plates, where stored elastic energy due to convergence between the plates is rapidly released. The tsunami that devastated the Samoan and northern Tongan islands on 29 September 2009 was preceded by a globally recorded magnitude-8 normal-faulting earthquake in the outer-rise region, where the Pacific plate bends before entering the subduction zone. Preliminary interpretation suggested that this earthquake was the source of the tsunami. Here we show that the outer-rise earthquake was accompanied by a nearly simultaneous rupture of the shallow subduction interface, equivalent to a magnitude-8 earthquake, that also contributed significantly to the tsunami. The subduction interface event was probably a slow earthquake with a rise time of several minutes that triggered the outer-rise event several minutes later. However, we cannot rule out the possibility that the normal fault ruptured first and dynamically triggered the subduction interface event. Our evidence comes from displacements of Global Positioning System stations and modelling of tsunami waves recorded by ocean-bottom pressure sensors, with support from seismic data and tsunami field observations. Evidence of the subduction earthquake in global seismic data is largely hidden because of the earthquake's slow rise time or because its ground motion is disguised by that of the normal-faulting event. Earthquake doublets where subduction interface events trigger large outer-rise earthquakes have been recorded previously, but this is the first well-documented example where the two events occur so closely in time and the triggering event might be a slow earthquake. 
As well as providing information on strain release mechanisms at subduction zones, earthquakes such as this provide a possible mechanism for the occasional large tsunamis generated at the Tonga subduction zone, where slip between the plates is predominantly aseismic.
NASA Astrophysics Data System (ADS)
Watts, W. A.; Allen, J. R. M.; Huntley, B.
A high-resolution palynological study of a 51 m core from Lago Grande di Monticchio, southern Italy, has provided a palaeoenvironmental record for the last glacial. An annual-lamination-based chronology, supported by radiometric and tephrochronological dates, provides an absolute timescale for this record that spans 76,300 years. Correlations are established between the pollen stratigraphy, the GRIP ice core δ18O record and foraminiferal assemblages from Atlantic core V23-81. Both Dansgaard-Oeschger and Heinrich events are reflected by changes in the pollen stratigraphy. Revised dates are estimated for Heinrich events H1-H6. A quantitative palaeoclimate reconstruction based upon the pollen data provides evidence of the climate changes in southern Italy associated with these and other fluctuations during the last glacial.
NASA Astrophysics Data System (ADS)
Veres, D.; Bazin, L.; Landais, A.; Toyé Mahamadou Kele, H.; Lemieux-Dudon, B.; Parrenin, F.; Martinerie, P.; Blayo, E.; Blunier, T.; Capron, E.; Chappellaz, J.; Rasmussen, S. O.; Severi, M.; Svensson, A.; Vinther, B.; Wolff, E. W.
2012-12-01
The deep polar ice cores provide reference records commonly employed in global correlation of past climate events. However, temporal divergences reaching up to several thousand years (ka) exist between ice cores over the last climatic cycle. In this context, we here introduce the Antarctic Ice Core Chronology 2012 (AICC2012), a new and coherent timescale developed for four Antarctic ice cores, namely Vostok, EPICA Dome C (EDC), EPICA Dronning Maud Land (EDML) and Talos Dome (TALDICE), alongside the Greenlandic NGRIP record. The AICC2012 timescale has been constructed using the Bayesian tool Datice (Lemieux-Dudon et al., 2010), which combines glaciological inputs and data constraints, including a wide range of relative and absolute gas and ice stratigraphic markers. We focus here on the last 120 ka, whereas the companion paper by Bazin et al. (2012) focuses on the interval 120-800 ka. Compared to previous timescales, AICC2012 presents an improved timing for the last glacial inception, respecting the glaciological constraints of all analyzed records. Moreover, with the addition of numerous new stratigraphic markers and an improved calculation of the lock-in depth (LID) based on δ15N data employed as the Datice background scenario, AICC2012 presents a new timing for the bipolar sequence of events over Marine Isotope Stage 3 associated with the see-saw mechanism, with maximum differences of about 500 yr with respect to the previous Datice-derived chronology of Lemieux-Dudon et al. (2010), hereafter denoted LD2010. Our improved scenario confirms the regional differences in millennial-scale variability over the last glacial period: while the EDC isotopic record (events of triangular shape) displays peaks roughly at the same time as the NGRIP abrupt isotopic increases, the EDML isotopic record (events characterized by broader peaks or even extended periods of high isotope values) reached the isotopic maximum several centuries before.
Historical tsunami in the Azores archipelago (Portugal)
NASA Astrophysics Data System (ADS)
Andrade, C.; Borges, P.; Freitas, M. C.
2006-08-01
Because of its exposed northern mid-Atlantic location, morphology and plate-tectonics setting, the Azores Archipelago is highly vulnerable to tsunami hazards associated with landslides and seismic or volcanic triggers, local or distal. Critical examination of available data - written accounts and geologic evidence - indicates that, since the settlement of the archipelago in the 15th century, at least 23 tsunami have struck Azorean coastal zones. Most of the recorded tsunami were generated by earthquakes. The highest known run-up (11-15 m) was recorded on 1 November 1755 at Terceira Island, corresponding to an event of intensity VII-VIII (damaging-heavily damaging) on the Papadopoulos-Imamura scale. To date, eruptive activity, while relatively frequent in the Azores, does not appear to have generated destructive tsunami. However, this apparent paucity of volcanogenic tsunami in the historical record may be misleading because of limited instrumental and documentary data, and small source-volumes released during historical eruptions. The latter are in contrast with the geological record of massive pyroclastic flows and caldera explosions with potential to generate high-magnitude tsunami, predating settlement. In addition, limited evidence suggests that submarine landslides from unstable volcano flanks may have also triggered some damaging tsunamigenic floods that perhaps were erroneously attributed to intense storms. The lack of destructive tsunami since the mid-18th century has led to governmental complacency and public disinterest in the Azores, as demonstrated by the fact that existing emergency regulations concerning seismic events in the Azores Autonomous Region make no mention of tsunami and their attendant hazards. We suspect that the coastal fringe of the Azores may well preserve a sedimentary record of some past tsunamigenic flooding events. 
Geological field studies must be accelerated to expand the existing database to include prehistoric events, information essential for more precisely estimating the average tsunami recurrence rate for the Azores over a longer period. A present-day occurrence of a moderate to intense tsunami (i.e., the size of the 1755 event) would produce societal disruption and economic loss orders of magnitude greater than those of previous events in Azorean history. To reduce risk from future tsunami, comprehensive assessment of tsunami hazards and the preparation of hazard-zonation maps are needed to guide governmental decisions on issues of prudent land-use planning, public education and emergency management.
NASA Astrophysics Data System (ADS)
Jechumtálová, Z.; Šílený, J.; Trifu, C.-I.
2014-06-01
The resolution of the event mechanism is investigated in terms of the unconstrained moment tensor (MT) source model and the shear-tensile crack (STC) source model, which represents a slip along the fault with an off-plane component. Data are simulated as recorded by the actual seismic array installed at Ocnele Mari (Romania), where sensors are placed in shallow boreholes. Noise is superimposed on the synthetic data, and the analysis explores how the results are influenced (i) by data recorded by the complete seismic array compared to that provided by the subarray of surface sensors, (ii) by using three- or one-component sensors and (iii) by inverting P- and S-wave amplitudes versus P-wave amplitudes only. The orientation of the pure shear fracture component is almost always well resolved. On the other hand, increasing noise distorts the non-double-couple (non-DC) components of the MT unless a high-quality data set is available. The STC source model yields considerably fewer spurious non-shear fracture components. Incorporating recordings from deeper sensors in addition to those obtained from the surface ones allows for the processing of noisier data. Performance of the network equipped with three-component sensors is only slightly better than that with uniaxial sensors. Inverting both P- and S-wave amplitudes markedly improves the resolution of the orientation of the source mechanism compared to the inversion of P-wave amplitudes only. Comparison of the inversion results for the two alternative source models permits assessment of the reliability of the retrieved non-shear components. As an example, the approach is applied to three microseismic events that occurred at Ocnele Mari, where both large and small non-DC components were found. The analysis confirms tensile fracturing for two of these events, and a shear slip for the third.
An effective noise-suppression technique for surface microseismic data
Forghani-Arani, Farnoush; Willis, Mark; Haines, Seth S.; Batzle, Mike; Behura, Jyoti; Davidson, Michael
2013-01-01
The presence of strong surface-wave noise in surface microseismic data may decrease the utility of these data. We implement a technique, based on the distinct characteristics that microseismic signal and noise show in the τ‐p domain, to suppress surface-wave noise in microseismic data. Because most microseismic source mechanisms are deviatoric, preprocessing is necessary to correct for the nonuniform radiation pattern prior to transforming the data to the τ‐p domain. We employ a scanning approach, similar to semblance analysis, to test all possible double-couple orientations to determine an estimated orientation that best accounts for the polarity pattern of any microseismic events. We then correct the polarity of the data traces according to this pattern, prior to conducting signal-noise separation in the τ‐p domain. We apply our noise-suppression technique to two surface passive-seismic data sets from different acquisition surveys. The first data set includes a synthetic microseismic event added to field passive noise recorded by an areal receiver array distributed over a Barnett Formation reservoir undergoing hydraulic fracturing. The second data set is field microseismic data recorded by receivers arranged in a star-shaped array, over a Bakken Shale reservoir during a hydraulic-fracturing process. Our technique significantly improves the signal-to-noise ratios of the microseismic events and preserves the waveforms at the individual traces. We illustrate that the enhancement in signal-to-noise ratio also results in improved imaging of the microseismic hypocenter.
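The τ‐p step in the abstract above can be illustrated with a bare-bones slant stack (discrete linear Radon transform). This is a sketch only; the paper's implementation, polarity-scanning preprocessing, and signal-noise separation are considerably more involved:

```python
import numpy as np

def slant_stack(data, x, dt, p_values):
    """Discrete tau-p transform: for each slowness p, sum traces along
    the line t = tau + p*x (nearest-sample shifts, no interpolation)."""
    n_traces, n_samp = data.shape
    out = np.zeros((len(p_values), n_samp))
    for ip, p in enumerate(p_values):
        for trace, xi in zip(data, x):
            shift = int(round(p * xi / dt))    # moveout in samples
            if 0 <= shift < n_samp:
                out[ip, : n_samp - shift] += trace[shift:]
            elif -n_samp < shift < 0:
                out[ip, -shift:] += trace[: n_samp + shift]
    return out
```

Linear surface-wave noise maps to compact regions of the τ‐p plane (its characteristic slownesses), where it can be muted before the transform is inverted; a hyperbolic microseismic arrival spreads across slownesses and largely survives the mute.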
NASA Astrophysics Data System (ADS)
Niedermeyer, E. M.; Mulch, A.; Pross, J.
2017-12-01
The "8.2 ka event" was an abrupt and prominent climate perturbation during the Holocene, characterized by an episode of generally colder and drier conditions across the Northern Hemisphere. However, evidence of the extent to which this event affected climate in the Mediterranean region is ambiguous, in particular with respect to rainfall, temperature and vegetation change on land. Here we present a new, high-resolution record (average resolution of 15 years during the event) of paleotemperatures from the Tenaghi Philippon peat deposit, Eastern Macedonia, Greece, using the MBT'/CBT index based on brGDGTs (branched glycerol dialkyl glycerol tetraethers). Our data show fairly stable temperatures before the event, which is initiated at 8.1 ka by an abrupt and continuous cooling during the first 35 years of the event. After a short, 10-year episode of minimum temperatures, the event ended with a similarly abrupt and continuous warming within 38 years. Comparison of our record with a previous study of the stable hydrogen isotopic composition of higher-plant waxes (δDwax) on the same core1 shows that changes in temperature occurred simultaneously with shifts in atmospheric moisture sources (Mediterranean vs. Atlantic). Interestingly, further comparison of our data with a previous palynological study of the same core2 reveals that changes in vegetation associated with the 8.2 ka event precede the shifts in hydrology and temperature by 100 years. This suggests either pronounced changes in the seasonality of temperature and rainfall after the onset of the 8.2 ka event, i.e. at the peak of the event, or that changes in local atmospheric circulation (moisture sources) and temperature were not the initial trigger of the changes in vegetation. References: Pross, J., Kotthoff, U., Müller, U.C., Peyron, O., Dormoy, I., Schmiedl, G., Kalaitzidis, S. and Smith, A.M. (2009): Massive perturbation in terrestrial ecosystems of the Eastern Mediterranean region associated with the 8.2 kyr B.P. 
climatic event. Geology 37, 887-890. Schemmel, F., Niedermeyer, E.M., Schwab, V.F., Gleixner, G., Pross, J. and Mulch, A. (2016): Plant wax δD values record changing Eastern Mediterranean atmospheric circulation patterns during the 8.2 kyr B.P. climatic event. Quaternary Science Reviews 133, 96-107.
Experiences with hypercube operating system instrumentation
NASA Technical Reports Server (NTRS)
Reed, Daniel A.; Rudolph, David C.
1989-01-01
Conceptualizing the interactions among a large number of processors is difficult, which makes it hard both to identify the sources of inefficiency and to determine how a parallel program could be made more efficient. This paper describes an instrumentation system that can trace the execution of distributed-memory parallel programs by recording the occurrence of parallel program events. The resulting event traces can be used to compile summary statistics that provide a global view of program performance. In addition, visualization tools permit the graphic display of event traces. Visual presentation of performance data is particularly useful, indeed necessary, for large-scale parallel computers; the enormous volume of performance data mandates visual display.
Personal miniature electrophysiological tape recorder
NASA Astrophysics Data System (ADS)
Green, H.
1981-11-01
The use of a personal miniature electrophysiological tape recorder to measure the physiological reactions of space flight personnel to space flight stress and weightlessness is described. The Oxford Instruments Medilog recorder, a battery-powered, four-channel cassette tape recorder with 24 hour endurance is carried on the person and will record EKG, EOG, EEG, and timing and event markers. The data will give information about heart rate and morphology changes, and document adaptation to zero gravity on the part of subjects who, unlike highly trained astronauts, are more representative of the normal population than were the subjects of previous space flight studies.
ERIC Educational Resources Information Center
Tokowicz, Natasha; MacWhinney, Brian
2005-01-01
We used event-related brain potentials (ERPs) to investigate the contributions of explicit and implicit processes during second language (L2) sentence comprehension. We used an L2 grammaticality judgment task (GJT) to test 20 native English speakers enrolled in the first four semesters of Spanish while recording both accuracy and ERP data. Because…
What Makes Usain Bolt Unique as a Sprinter?
ERIC Educational Resources Information Center
Shinabargar, A. J.; Hellrich, Matt; Baker, Blane
2010-01-01
For both casual and avid fans alike, Olympic and other sporting events can provide a wealth of data for simple physics analyses. One of the most impressive performances in recent Olympic history is that of Usain Bolt in the track-and-field sprinting events during the 2008 Summer Games. Over a seven-day span, Bolt set world records in the 100-m and…
NASA Astrophysics Data System (ADS)
Magny, Michel; de Beaulieu, Jacques-Louis; Drescher-Schneider, Ruth; Vannière, Boris; Walter-Simonnet, Anne-Véronique; Millet, Laurent; Bossuet, Gilles; Peyron, Odile
2006-05-01
This paper presents an event stratigraphy based on data documenting the history of vegetation cover, lake-level changes and fire frequency, as well as volcanic eruptions, over the Last Glacial-early Holocene transition from a terrestrial sediment sequence recovered at Lake Accesa in Tuscany (north-central Italy). On the basis of an age-depth model inferred from 13 radiocarbon dates and six tephra horizons, the Oldest Dryas-Bølling warming event was dated to ca. 14 560 cal. yr BP and the Younger Dryas event to ca. 12 700-11 650 cal. yr BP. Four sub-millennial-scale cooling phases were recognised from pollen data at ca. 14 300-14 200, 13 900-13 700, 13 400-13 100 and 11 350-11 150 cal. yr BP. The last three may be Mediterranean equivalents to the Older Dryas (GI-1d), Intra-Allerød (GI-1b) and Preboreal Oscillation (PBO) cooling events defined from the GRIP ice-core and indicate strong climatic linkages between the North Atlantic and Mediterranean areas during the last Termination. The first may correspond to Intra-Bølling cold oscillations registered by various palaeoclimatic records in the North Atlantic region. The lake-level record shows that the sub-millennial-scale climatic oscillations which punctuated the last deglaciation were associated in central Italy with different successive patterns of hydrological changes from the Bølling warming to the 8.2 ka cold reversal.
Doğramac, Sera N; Watsford, Mark L; Murphy, Aron J
2011-03-01
Subjective notational analysis can be used to track players and analyse movement patterns during match-play of team sports such as futsal. The purpose of this study was to establish the validity and reliability of the Event Recorder for subjective notational analysis. A course was designed, replicating ten minutes of futsal match-play movement patterns, where ten participants undertook the course. The course allowed a comparison of data derived from subjective notational analysis, to the known distances of the course, and to GPS data. The study analysed six locomotor activity categories, focusing on total distance covered, total duration of activities and total frequency of activities. The values between the known measurements and the Event Recorder were similar, whereas the majority of significant differences were found between the Event Recorder and GPS values. The reliability of subjective notational analysis was established with all ten participants being analysed on two occasions, as well as analysing five random futsal players twice during match-play. Subjective notational analysis is a valid and reliable method of tracking player movements, and may be a preferred and more effective method than GPS, particularly for indoor sports such as futsal, and field sports where short distances and changes in direction are observed.
VA Suicide Prevention Applications Network
Stephens, Brady; Morley, Sybil; Thompson, Caitlin; Kemp, Janet; Bossarte, Robert M.
2016-01-01
Objectives: The US Department of Veterans Affairs’ Suicide Prevention Applications Network (SPAN) is a national system for suicide event tracking and case management. The objective of this study was to assess data on suicide attempts among people using Veterans Health Administration (VHA) services. Methods: We assessed the degree of data overlap on suicide attempters reported in SPAN and the VHA’s medical records from October 1, 2010, to September 30, 2014—overall, by year, and by region. Data on suicide attempters in the VHA’s medical records consisted of diagnoses documented with E95 codes from the International Classification of Diseases, Ninth Revision. Results: Of 50 518 VHA patients who attempted suicide during the 4-year study period, data on fewer than half (41%) were reported in both SPAN and the medical records; nearly 65% of patients whose suicide attempt was recorded in SPAN had no data on attempted suicide in the VHA’s medical records. Conclusion: Evaluation of administrative data suggests that use of SPAN substantially increases the collection of data on suicide attempters as compared with the use of medical records alone, but neither SPAN nor the VHA’s medical records identify all suicide attempters. Further research is needed to better understand the strengths and limitations of both systems and how to best combine information across systems. PMID:28123228
Relation between century-scale Holocene arid intervals in tropical and temperate zones
NASA Astrophysics Data System (ADS)
Lamb, H. F.; Gasse, F.; Benkaddour, A.; El Hamouti, N.; van der Kaars, S.; Perkins, W. T.; Pearce, N. J.; Roberts, C. N.
1995-01-01
Climate records from lake sediments in tropical Africa, Central America and west Asia show several century-scale arid intervals during the Holocene1-10. These may have been caused by temporary weakening of the monsoonal circulation associated with reduced northward heat transport by the oceans7 or by feedback processes stimulated by changes in tropical land-surface conditions10. Here we use a lake-sediment record from the montane Mediterranean zone of Morocco to address the question of whether these events were also felt in temperate continental regions. We find evidence of arid intervals of similar duration, periodicity and possibly timing to those in the tropics. But our pollen data show that the forest vegetation was not substantially affected by these events, indicating that precipitation remained adequate during the summer growing season. Thus, the depletion of the groundwater aquifer that imprinted the dry events in the lake record must have resulted from reduced winter precipitation. We suggest that the occurrence of arid events during the summer in the tropics but during the winter at temperate latitudes can be rationalized if they are both associated with cooler sea surface temperatures in the North Atlantic.
Using damage data to estimate the risk from summer convective precipitation extremes
NASA Astrophysics Data System (ADS)
Schroeer, Katharina; Tye, Mari
2017-04-01
This study explores the potential added value from including loss and damage data to understand the risks from high-intensity short-duration convective precipitation events. Projected increases in these events are expected even in regions that are likely to become more arid. Such high intensity precipitation events can trigger hazardous flash floods, debris flows, and landslides that put people and local assets at risk. However, the assessment of local scale precipitation extremes is hampered by its high spatial and temporal variability. In addition to this, not only are extreme events rare, but such small-scale events are likely to be underreported where they do not coincide with the observation network. Reports of private loss and damage on a local administrative unit scale (LAU 2 level) are used to explore the relationship between observed rainfall events and damages reportedly related to hydro-meteorological processes. With 480 Austrian municipalities located within our south-eastern Alpine study region, the damage data are available on a much smaller scale than the available rainfall data. Precipitation is recorded daily at 185 gauges and 52% of these stations additionally deliver sub-hourly rainfall information. To obtain physically plausible information, damage and rainfall data are grouped and analyzed on a catchment scale. The data indicate that rainfall intensities are higher on days that coincide with a damage claim than on days for which no damage was reported. However, approximately one third of the damages related to hydro-meteorological hazards were claimed on days for which no rainfall was recorded at any gauge in the respective catchment. Our goal is to assess whether these events indicate potential extreme events missing in the observations. Damage is always a consequence of an asset being exposed and susceptible to a hazardous process, and naturally, many factors influence whether an extreme rainfall event causes damage. 
We set up a statistical model to test whether the relationship between extreme rainfall events and damages is robust enough to estimate a potential underrepresentation of high intensity rainfall events in ungauged areas. Risk-relevant factors of socio-economic vulnerability, land cover, streamflow data, and weather type information are included to improve and sharpen the analysis. Within this study, we first aim to identify which rainfall events are most damaging and which factors affect the damages - seen as a proxy for the vulnerability - related to summer convective rainfall extremes in different catchment types. Secondly, we aim to detect potentially unreported damaging rainfall events and estimate the likelihood of such cases. We anticipate this damage perspective on summertime extreme convective precipitation to be beneficial for risk assessment, uncertainty management, and decision making with respect to weather and climate extremes on the regional-to-local level.
NASA Astrophysics Data System (ADS)
Washington-Allen, R. A.; Therrell, M. D.; Emanuel, R. E.
2007-12-01
Herbivory, fire, and climatic events such as El Niño-Southern Oscillation (ENSO) and La Niña have been shown to have proximal and evolutionary effects on the dynamics of dryland fauna, flora, and soils. However, the spatially explicit historical impacts of these climatic events on dryland ecosystems are not known. Consequently, the purpose of this paper is to present the theory and practical application for estimating the historical spatial impacts of these climatic events. We hypothesize that if remotely sensed vegetation indices (VIs) are correlated to historical tree-ring data and also to functional ecosystem processes, specifically gross primary productivity (GPP) and net ecosystem production (NEP) as measured by eddy covariance flux towers, then VIs can be used to spatially and temporally distribute GPP and NEP within the species- or community-specific land cover extent over the length of the tree-ring record of selected dryland ecosystems. Secondly, Shuttle Radar Topography Mission (SRTM) digital terrain model (DTM) data have been used to estimate tree height and, in conjunction with plant allometric equations, biomass and standing carbon in various forest ecosystems. Tree height data in relation to tree-ring age data and fire history can be used to reconstruct the spatial distribution of savanna demographic age structure, predict standing carbon, and thus provide a complementary and independent dataset for comparison to DTMs from the Multiangle Imaging Spectroradiometer (MISR), Interferometric Synthetic Aperture Radar (IFSAR), and Moderate Resolution Imaging Spectroradiometer (MODIS) derived GPP spatial maps. 
We developed a database consisting of a dendrochronology record, SRTM data, global fire history data, Long-Term Data Record Advanced Very High Resolution Radiometer Normalized Difference Vegetation Index (LTDR AVHRR NDVI, 1981-2003), contemporary gridded climate data, National Land Cover Data (NLCD), and short-term eddy covariance flux tower data for the California blue oak woodland ecosystem to estimate both regional aboveground productivity and past disturbance history relative to climate, particularly droughts, for the last 500 years.
Kucera, Kristen L.; Marshall, Stephen W.; Bell, David R.; DiStefano, Michael J.; Goerger, Candice P.; Oyama, Sakiko
2011-01-01
Context: Few validation studies of sport injury-surveillance systems are available. Objective: To determine the validity of a Web-based system for surveillance of collegiate sport injuries, the National Collegiate Athletic Association's (NCAA) Injury Surveillance System (ISS). Design: Validation study comparing NCAA ISS data from 2 fall collegiate sports (men's and women's soccer) with other types of clinical records maintained by certified athletic trainers. Setting: A purposive sample of 15 NCAA colleges and universities that provided NCAA ISS data on both men's and women's soccer for at least 2 years during 2005–2007, stratified by playing division. Patients or Other Participants: A total of 737 men's and women's soccer athletes and 37 athletic trainers at these 15 institutions. Main Outcome Measure(s): The proportion of injuries captured by the NCAA ISS (capture rate) was estimated by comparing NCAA ISS data with the other clinical records on the same athletes maintained by the athletic trainers. We reviewed all athletic injury events from participation in NCAA collegiate sports that resulted in 1 or more days of restricted activity in games or practices and necessitated medical care. A capture-recapture analysis estimated the proportion of injury events captured by the NCAA ISS. Agreement for key data fields was also measured. Results: We analyzed 664 injury events. The NCAA ISS captured 88.3% (95% confidence interval = 85.9%, 90.8%) of all time-lost medical-attention injury events. The proportion of injury events captured by the NCAA ISS was higher in Division I (93.8%) and Division II (89.6%) than in Division III (82.3%) schools. Agreement between the NCAA ISS data and the non–NCAA ISS data was good for the majority of data fields but low for date of full return and days lost from sport participation. Conclusions: The overall capture rate of the NCAA ISS was very good (88%) in men's and women's soccer for this period. PMID:22488136
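A two-source capture-recapture estimate of this kind is commonly computed with the Chapman-adjusted Lincoln-Petersen estimator from the counts reported by two independent systems. The sketch below is a generic illustration with invented counts, not the study's actual data:

```python
def lincoln_petersen(n_a, n_b, n_both):
    """Chapman-adjusted Lincoln-Petersen estimate of the true event total.

    n_a: events captured by system A (e.g. a surveillance system),
    n_b: events captured by system B (e.g. clinical records),
    n_both: events captured by both systems.
    Returns the estimated total number of events and system A's capture rate.
    """
    n_total = (n_a + 1) * (n_b + 1) / (n_both + 1) - 1
    return n_total, n_a / n_total

# Hypothetical counts: 90 events seen by A, 80 by B, 72 by both.
total, rate_a = lincoln_petersen(90, 80, 72)
```

With these invented counts the estimated total is about 100 events and system A's capture rate about 90%; the estimator assumes the two systems capture events independently.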
NASA Astrophysics Data System (ADS)
von Reumont, J.; Hetzinger, S.; Garbe-Schönberg, D.; Manfrino, C.; Dullo, W.-Chr.
2016-03-01
The rising temperature of the world's oceans is affecting coral reef ecosystems by increasing the frequency and severity of bleaching and mortality events. The susceptibility of corals to temperature stress varies on local and regional scales. Insights into potential controlling parameters are hampered by a lack of long term in situ data in most coral reef environments and sea surface temperature (SST) products often do not resolve reef-scale variations. Here we use 42 years (1970-2012) of coral Sr/Ca data to reconstruct seasonal- to decadal-scale SST variations in two adjacent but distinct reef environments at Little Cayman, Cayman Islands. Our results indicate that two massive Diploria strigosa corals growing in the lagoon and in the fore reef responded differently to past warming events. Coral Sr/Ca data from the shallow lagoon successfully record high summer temperatures confirmed by in situ observations (>33°C). Surprisingly, coral Sr/Ca from the deeper fore reef is strongly affected by thermal stress events, although seasonal temperature extremes and mean SSTs at this site are reduced compared to the lagoon. The shallow lagoon coral showed decadal variations in Sr/Ca, supposedly related to the modulation of lagoonal temperature through varying tidal water exchange, influenced by the 18.6 year lunar nodal cycle. Our results show that reef-scale SST variability can be much larger than suggested by satellite SST measurements. Thus, using coral SST proxy records from different reef zones combined with in situ observations will improve conservation programs that are developed to monitor and predict potential thermal stress on coral reefs.
77 FR 48492 - Event Data Recorders
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-14
... that [cir] Involve side or side curtain/tube air bags such that EDR data would only need to be locked... deployable restraints other than frontal, side or side/curtain air bags such that EDR data would not need to.... The final rule was intended to be technology-neutral, so as to permit compliance with any available...
Holocene geological records of flood regime in French Alps
NASA Astrophysics Data System (ADS)
Arnaud, Fabien; Wilhelm, Bruno; Giguet-Covex, Charline; Jenny, Jean-Philippe; Fouinat, Laurent; Sabatier, Pierre; Debret, Maxime; Révillon, Sidonie; Chapron, Emmanuel; Revel, Marie
2014-05-01
In this paper we present a review of a ca. 10-year research effort (1-9) aiming at reconstructing flood dynamics in the French Alps through the Holocene, based on lake sediment records. We will particularly discuss how such geological records can be considered as representative of past climate. This implies a careful interpretation of the data in order to really understand "what the core really says". Namely, we showed that different lake systems record different types of flood events. Low-altitude lakes, fed by large-scale catchment areas, are more sensitive to regional heavy rainfall events (2-5), whereas high-altitude small lakes record local extreme rainfall events (6). Moreover, the development of human societies must be taken into account, as it is susceptible to modulate the climate-geological record relationship (7). Altogether our data permit the establishment of a Holocene-long perspective upon both regional heavy rainfall and torrential activity at high-elevation sites. We hence show that the frequencies of both types of events co-evolve in the Northern as well as the Southern French Alps, where Holocene colder spells generally present higher flood frequencies (6-9). On the other hand, the intensities of torrential events present an opposite North-South pattern: during warm spells (e.g. the Medieval Warm Period or nowadays), the Northern Alps are subject to rare but extremely intense heavy rainfall events, whereas in the Southern Alps torrential floods are both rare and weak. During cold spells (e.g. the Little Ice Age), the inverse pattern is observed: torrential floods are more frequent everywhere and of above-average intensity in the Southern Alps. This point is particularly important for risk management in mountain areas in a context of global warming. Our results point out how complex the response of regional systems to global climate change can be. We are hence far from completely understanding this complexity, which is moreover imperfectly simulated by climate models. 
As geological records represent the only way to reconstruct long-term trends in flood regimes, further efforts must be pursued to obtain a more complete picture of this complexity and further improve climate models. 1. Chapron et al. The Holocene 12, 177-185 (2002) 2. Arnaud et al. Quat. Sci. Rev. 51, 81-92 (2012) 3. Debret et al. Quat. Sci. Rev. 29, 2185-2200 (2010) 4. Arnaud et al. The Holocene 15, 420-428 (2005) 5. Revel-Rolland et al. Chem. Geol. 224, 183-200 (2005) 6. Wilhelm et al. Clim. Change 113, 563-581 (2012) 7. Giguet-Covex et al. Quat. Res. 77, 12-22 (2012) 8. Wilhelm et al. Quat. Res. 78, 1-12 (2012) 9. Wilhelm et al. J. Quat. Sci. 28, 189-199 (2013)
Hibbert, Peter D; Hallahan, Andrew R; Muething, Stephen E; Lachman, Peter; Hooper, Tamara D; Wiles, Louise K; Jaffe, Adam; White, Les; Wheaton, Gavin R; Runciman, William B; Dalton, Sarah; Williams, Helena M; Braithwaite, Jeffrey
2015-01-01
Introduction A high-quality health system should deliver care that is free from harm. Few large-scale studies of adverse events have been undertaken in children's healthcare internationally, and none in Australia. The aim of this study is to measure the frequency and types of adverse events encountered in Australian paediatric care in a range of healthcare settings. Methods and analysis A form of retrospective medical record review, the Institute of Healthcare Improvement's Global Trigger Tool, will be modified to collect data. Records of children aged <16 years managed during 2012 and 2013 will be reviewed. We aim to review 6000–8000 records from a sample of healthcare practices (hospitals, general practices and specialists). Ethics and dissemination Human Research Ethics Committee approvals have been received from the Sydney Children's Hospital Network, Children's Health Queensland Hospital and Health Service, and the Women's and Children's Hospital Network in South Australia. An application is under review with the Royal Australian College of General Practitioners. The authors will submit the results of the study to relevant journals and undertake national and international oral presentations to researchers, clinicians and policymakers. PMID:25854978
The periodic structure of the natural record, and nonlinear dynamics.
Shaw, H.R.
1987-01-01
This paper addresses how nonlinear dynamics can contribute to interpretations of the geologic record and evolutionary processes. Background is given to explain why nonlinear concepts are important. A resume of personal research is offered to illustrate why I think nonlinear processes fit with observations on geological and cosmological time series data. The fabric of universal periodicity arrays generated by nonlinear processes is illustrated by means of a simple computer model. I conclude with implications concerning patterns of evolution, stratigraphic boundary events, and close correlations of major geologically instantaneous events (such as impacts or massive volcanic episodes) with any sharply defined boundary in the geologic column. - from Author
Engagement Skills Trainer: The Commander’s Perspective
2017-06-09
The author recommends using the EST as a record of fire for a sustainment training event, as a mandatory part of marksmanship training. This record-of-fire event can only occur once per year and after a live-fire qualification.
29 CFR 6.17 - Amendments to pleadings.
Code of Federal Regulations, 2011 CFR
2011-07-01
... record left open to enable the new allegations to be addressed. The presiding Administrative Law Judge... forth transactions, occurrences or events which have happened since the date of the pleadings and which...
29 CFR 6.17 - Amendments to pleadings.
Code of Federal Regulations, 2012 CFR
2012-07-01
... record left open to enable the new allegations to be addressed. The presiding Administrative Law Judge... forth transactions, occurrences or events which have happened since the date of the pleadings and which...
29 CFR 6.17 - Amendments to pleadings.
Code of Federal Regulations, 2013 CFR
2013-07-01
... record left open to enable the new allegations to be addressed. The presiding Administrative Law Judge... forth transactions, occurrences or events which have happened since the date of the pleadings and which...
Boore, D.M.; Smith, C.E.
1999-01-01
For more than 20 years, a program has been underway to obtain records of earthquake shaking on the seafloor at sites offshore of southern California, near oil platforms. The primary goal of the program is to obtain data that can help determine if ground motions at offshore sites are significantly different than those at onshore sites; if so, caution may be necessary in using onshore motions as the basis for the seismic design of oil platforms. We analyze data from eight earthquakes recorded at six offshore sites; these are the most important data recorded on these stations to date. Seven of the earthquakes were recorded at only one offshore station; the eighth event was recorded at two sites. The earthquakes range in magnitude from 4.7 to 6.1. Because of the scarcity of multiple recordings from any one event, most of the analysis is based on the ratio of spectra from vertical and horizontal components of motion. The results clearly show that the offshore motions have very low vertical motions compared to those from an average onshore site, particularly at short periods. Theoretical calculations find that the water layer has little effect on the horizontal components of motion but that it produces a strong spectral null on the vertical component at the resonant frequency of P waves in the water layer. The vertical-to-horizontal ratios for a few selected onshore sites underlain by relatively low shear-wave velocities are similar to the ratios from offshore sites for frequencies less than about one-half the water layer P-wave resonant frequency, suggesting that the shear-wave velocities beneath a site are more important than the water layer in determining the character of the ground motions at lower frequencies.
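The vertical-to-horizontal ratio analysis described here can be sketched as a ratio of Fourier amplitude spectra of the two components. The snippet below is a bare illustration, not the authors' processing; real analyses would first window, taper, and smooth the spectra:

```python
import numpy as np

def vh_spectral_ratio(vertical, horizontal, dt):
    """Vertical-to-horizontal Fourier amplitude spectral ratio.

    vertical, horizontal: equal-length component recordings;
    dt: sample interval (s). Returns frequencies and the raw
    (unsmoothed) amplitude-spectrum ratio V/H.
    """
    freqs = np.fft.rfftfreq(len(vertical), d=dt)
    v_amp = np.abs(np.fft.rfft(vertical))
    h_amp = np.abs(np.fft.rfft(horizontal))
    return freqs, v_amp / h_amp
```

A pronounced trough in this ratio near the P-wave resonant frequency of the water layer is the kind of spectral null the study attributes to the overlying water.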
Loggerhead Turtles (Caretta caretta) Use Vision to Forage on Gelatinous Prey in Mid-Water
Narazaki, Tomoko; Sato, Katsufumi; Abernathy, Kyler J.; Marshall, Greg J.; Miyazaki, Nobuyuki
2013-01-01
Identifying characteristics of foraging activity is fundamental to understanding an animal's lifestyle and foraging ecology. Despite its importance, monitoring the foraging activities of marine animals is difficult because direct observation is rarely possible. In this study, we use an animal-borne imaging system and three-dimensional data logger simultaneously to observe the foraging behaviour of large juvenile and adult sized loggerhead turtles (Caretta caretta) in their natural environment. Video recordings showed that the turtles foraged on gelatinous prey while swimming in mid-water (i.e., defined as epipelagic water column deeper than 1 m in this study). By linking video and 3D data, we found that mid-water foraging events share the common feature of a marked deceleration phase associated with the capture and handling of the sluggish prey. Analysis of high-resolution 3D movements during mid-water foraging events, including presumptive events extracted from 3D data using deceleration in swim speed as a proxy for foraging (detection rate = 0.67), showed that turtles swam straight toward prey in 171 events (i.e., turning point absent) but made a single turn toward the prey an average of 5.7±6.0 m before reaching the prey in 229 events (i.e., turning point present). Foraging events with a turning point tended to occur during the daytime, suggesting that turtles primarily used visual cues to locate prey. In addition, an incident of a turtle encountering a plastic bag while swimming in mid-water was recorded. The fact that the turtle's movements while approaching the plastic bag were analogous to those of a true foraging event, having a turning point and deceleration phase, also supports the use of vision in mid-water foraging. Our study shows that integrated video and high-resolution 3D data analysis provides unique opportunities to understand foraging behaviours in the context of the sensory ecology involved in prey location. PMID:23776603
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manungu Kiveni, Joseph
2012-12-01
This dissertation describes the results of a WIMP search using CDMS II data sets accumulated at the Soudan Underground Laboratory in Minnesota. Results from the original analysis of these data were published in 2009; two events were observed in the signal region with an expected leakage of 0.9 events. Further investigation revealed an issue with the ionization-pulse reconstruction algorithm, leading to a software upgrade and a subsequent reanalysis of the data. As part of the reanalysis, I performed an advanced discrimination technique to better distinguish (potential) signal events from backgrounds using a 5-dimensional chi-square method. This data-analysis technique combines the event information recorded for each WIMP-search event to derive a background-discrimination parameter capable of reducing the expected background to less than one event, while maintaining high efficiency for signal events. Furthermore, optimizing the cut positions of this 5-dimensional chi-square parameter for the 14 viable germanium detectors yields an improved expected sensitivity to WIMP interactions relative to previous CDMS results. This dissertation describes my improved (and optimized) discrimination technique and the results obtained from a blind application to the reanalyzed CDMS II WIMP-search data.
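The general idea of a multi-dimensional chi-square discrimination parameter can be sketched as follows. This is an illustrative toy, not the CDMS parameter (whose exact definition the abstract does not give); the function names, the unit standard deviations, and the cut value are assumptions:

```python
import numpy as np

def chi_square_discriminator(event_params, signal_mean, signal_std):
    """Combine several standardized event parameters into one statistic.

    Each parameter is standardized against the expected signal population;
    the sum of squared deviations is small for signal-like events and
    large for background-like events.
    """
    z = (np.asarray(event_params, dtype=float) - signal_mean) / signal_std
    return float(np.sum(z ** 2))

def passes_cut(event_params, signal_mean, signal_std, cut=11.07):
    """Accept an event if its statistic lies below a chosen cut.

    11.07 is the 95th percentile of a chi-square distribution with 5
    degrees of freedom; in practice the cut would be tuned to the
    expected background leakage.
    """
    return chi_square_discriminator(event_params, signal_mean, signal_std) < cut
```

Optimizing such a cut per detector, as the dissertation describes, trades signal efficiency against expected background leakage.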
Computational Electrocardiography: Revisiting Holter ECG Monitoring.
Deserno, Thomas M; Marx, Nikolaus
2016-08-05
Since 1942, when Goldberger introduced 12-lead electrocardiography (ECG), this diagnostic method has not changed. After 70 years of technologic developments, we revisit Holter ECG from recording to understanding. A fundamental change is foreseen towards "computational ECG" (CECG), where continuous monitoring produces big data volumes that are impossible to inspect conventionally and instead require efficient computational methods. We draw parallels between CECG and computational biology, in particular with respect to computed tomography, computed radiology, and computed photography. From that, we identify the technology and methodology needed for CECG. Real-time transfer of raw data into meaningful parameters that are tracked over time will allow prediction of serious events, such as sudden cardiac death. Evolved from Holter's technology, portable smartphones with Bluetooth-connected textile-embedded sensors will capture noisy raw data (recording), process meaningful parameters over time (analysis), and transfer them to cloud services for sharing (handling), predicting serious events, and alarming (understanding). To make this happen, the following fields need more research: i) signal processing; ii) cycle decomposition; iii) cycle normalization; iv) cycle modeling; v) clinical parameter computation; vi) physiological modeling; and vii) event prediction. We shall start immediately to develop methodology for CECG analysis and understanding.
Strong, Vivian E.; Selby, Luke V.; Sovel, Mindy; Disa, Joseph J.; Hoskins, William; DeMatteo, Ronald; Scardino, Peter; Jaques, David P.
2015-01-01
Background Studying surgical secondary events is an evolving effort with no currently established system for database design, standard reporting, or definitions. Using the Clavien-Dindo classification as a guide, in 2001 we developed a Surgical Secondary Events database based on grade of event and required intervention to begin prospectively recording and analyzing all surgical secondary events (SSE). Study Design Events are prospectively entered into the database by attending surgeons, house staff, and research staff. In 2008 we performed a blinded external audit of 1,498 randomly selected operations to examine the quality and reliability of the data. Results 1,498 of 4,284 operations during the 3rd quarter of 2008 were audited. 79% (N=1,180) of the operations did not have a secondary event while 21% (N=318) had an identified event. 91% (1,365) of operations were correctly entered into the SSE database. 97% (129/133) of missed secondary events were Grades I and II. Three Grade III (2%) and one Grade IV (1%) secondary events were missed. There were no missed Grade V secondary events. Conclusion Grade III-IV events are more accurately collected than Grade I-II events. Robust and accurate secondary events data can be collected by clinicians and research staff, and these data can safely be used for quality improvement projects and research. PMID:25319579
Strong, Vivian E; Selby, Luke V; Sovel, Mindy; Disa, Joseph J; Hoskins, William; Dematteo, Ronald; Scardino, Peter; Jaques, David P
2015-04-01
Studying surgical secondary events is an evolving effort with no currently established system for database design, standard reporting, or definitions. Using the Clavien-Dindo classification as a guide, in 2001 we developed a Surgical Secondary Events database based on grade of event and required intervention to begin prospectively recording and analyzing all surgical secondary events (SSE). Events are prospectively entered into the database by attending surgeons, house staff, and research staff. In 2008 we performed a blinded external audit of 1,498 operations that were randomly selected to examine the quality and reliability of the data. Of 4,284 operations, 1,498 were audited during the third quarter of 2008. Of these operations, 79% (N = 1,180) did not have a secondary event while 21% (N = 318) had an identified event; 91% of operations (1,365) were correctly entered into the SSE database. Also, 97% (129 of 133) of missed secondary events were grades I and II. There were 3 grade III (2%) and 1 grade IV (1%) secondary events that were missed. There were no missed grade V secondary events. Grade III-IV events are more accurately collected than grade I-II events. Robust and accurate secondary events data can be collected by clinicians and research staff, and these data can safely be used for quality improvement projects and research.
Source discrimination between Mining blasts and Earthquakes in Tianshan orogenic belt, NW China
NASA Astrophysics Data System (ADS)
Tang, L.; Zhang, M.; Wen, L.
2017-12-01
In recent years, a large number of quarry blasts have been detonated in the Tianshan Mountains of China. It is necessary to discriminate those non-earthquake records from the earthquake catalogs in order to determine the real seismicity of the region. In this study, we investigated spectral ratios and amplitude ratios as discriminants for regional seismic-event identification using explosions and earthquakes recorded by the Xinjiang Seismic Network (XJSN) of China. We used a training data set of 1071 earthquakes and 2881 non-earthquakes recorded by the XJSN between 2009 and 2016, with both types of events in a comparable local magnitude range (1.5 to 2.9). The non-earthquake and earthquake groups were well separated by Pg/Sg amplitude ratios, with the separation increasing with frequency when averaged over three stations. The 8- to 15-Hz Pg/Sg ratio proved to be the most precise and accurate discriminant, correctly classifying more than 90% of the events. In contrast, the P spectral ratio performed considerably worse, with a significant overlap (about 60%) between the earthquake and explosion populations. These comparisons show that amplitude ratios between compressional and shear waves discriminate better than low-frequency to high-frequency spectral ratios for individual phases. Neither discriminant alone was able to completely separate the two populations of events; however, a joint discrimination scheme employing simple majority voting reduced misclassifications to 10%. In the study region, 44% of the examined seismic events were determined to be non-earthquakes and 55% earthquakes. The earthquakes occurring on land are related to small faults, while the blasts are concentrated in large quarries.
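A minimal sketch of such a Pg/Sg amplitude-ratio discriminant, assuming amplitudes already measured in the 8-15 Hz band: the decision threshold of 0 (equal amplitudes) and the example numbers are hypothetical, and the study's actual discriminant averaged ratios over three stations.

```python
import math

def pg_sg_discriminant(pg_amp, sg_amp, threshold=0.0):
    """Classify an event by its log10 Pg/Sg amplitude ratio:
    explosions carry relatively more compressional (P) energy,
    earthquakes relatively more shear (S) energy."""
    log_ratio = math.log10(pg_amp / sg_amp)
    return "non-earthquake" if log_ratio > threshold else "earthquake"

print(pg_sg_discriminant(pg_amp=1.2, sg_amp=0.4))  # non-earthquake
print(pg_sg_discriminant(pg_amp=0.3, sg_amp=1.5))  # earthquake
```

In practice the threshold would be calibrated on labeled training events, and majority voting over several such discriminants, as in the study, reduces misclassification.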
Data acquisition system for the Belle experiment
NASA Astrophysics Data System (ADS)
Nakao, M.; Yamauchi, M.; Suzuki, S. Y.; Itoh, R.; Fujii, H.
2000-04-01
We built a data acquisition system for the Belle experiment at the KEK B-factory. The system is designed to record the signals from the detectors at a 500 Hz trigger rate with a dead time fraction of less than 10%. A typical event size is 30 kbyte, which corresponds to a data transfer rate of 15 Mbyte/s. The main components are two kinds of detector readout systems, an event builder, an online computer farm, and a data storage system. The system has been operating reliably at the design performance for half a year. We completed 2.5 months of cosmic-ray data taking and started physics data taking on June 1, 1999.
Xu, Stanley; Newcomer, Sophia; Nelson, Jennifer; Qian, Lei; McClure, David; Pan, Yi; Zeng, Chan; Glanz, Jason
2014-05-01
The Vaccine Safety Datalink project captures electronic health record data, including vaccinations and medically attended adverse events, on 8.8 million enrollees annually from participating managed care organizations in the United States. While the automated vaccination data are generally of high quality, a presumptive adverse event based on diagnosis codes in automated health care data may not be true (misclassification). Consequently, analyses using automated health care data can generate false positive results, where an association between the vaccine and outcome is incorrectly identified, as well as false negative findings, where a true association or signal is missed. We developed novel conditional Poisson regression models and fixed effects models that accommodate misclassification of adverse event outcomes for the self-controlled case series design. We conducted simulation studies to evaluate their performance in signal detection in vaccine safety hypothesis-generating (screening) studies. We also reanalyzed four previously identified signals in a recent vaccine safety study using the newly proposed models. Our simulation studies demonstrated that (i) outcome misclassification resulted in both false positive and false negative signals in screening studies; and (ii) the newly proposed models reduced the rates of both false positive and false negative signals. In reanalyses of the four previously identified signals using the novel statistical models, the incidence rate ratio estimates and statistical significance were similar to those obtained using conventional models and including only medical record review confirmed cases. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Y.; Gohar, Y.; Nuclear Engineering Division
In almost every detector counting system, a minimal dead time is required to record two successive events as two separate pulses. Due to the random nature of neutron interactions in the subcritical assembly, there is always some probability that a true neutron event will not be recorded because it occurs too close to the preceding event. These losses may become rather severe for counting systems with high counting rates and should be corrected before any use of the experimental data. This report examines the dead time effects for the pulsed neutron experiments of the YALINA-Booster subcritical assembly. The nonparalyzable model is utilized to correct the experimental data for dead time. Overall, the reactivity values increased by 0.19$ and 0.32$ after the spatial corrections for the YALINA-Booster 36% and 21% configurations, respectively. The differences between the reactivities obtained with the He-3 long and short detectors at the same detector channel diminish after the dead time corrections of the experimental data for the 36% YALINA-Booster configuration. In addition, better agreement between reactivities obtained from different experimental data sets is also observed after the dead time corrections for the 21% YALINA-Booster configuration.
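For a nonparalyzable counting system, the standard correction recovers the true event rate n from the measured rate m and dead time tau via n = m / (1 - m*tau), since the counter is dead a fraction m*tau of the time. The rates and dead time below are illustrative round numbers, not the YALINA-Booster values.

```python
def true_rate(measured_rate, dead_time):
    """Nonparalyzable dead-time correction: n = m / (1 - m*tau).
    The fraction of time the counter is dead is m*tau."""
    dead_fraction = measured_rate * dead_time
    if dead_fraction >= 1.0:
        raise ValueError("measured rate saturates the counting system")
    return measured_rate / (1.0 - dead_fraction)

# e.g. 50,000 counts/s measured with a 2 microsecond dead time:
# the counter is dead 10% of the time, so true events were lost
n = true_rate(50_000.0, 2.0e-6)
print(round(n))  # 55556
```

As the measured rate approaches 1/tau the correction diverges, which is why dead-time losses "become rather severe" at high counting rates.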
Xiao, Y; MacKenzie, C; Orasanu, J; Spencer, R; Rahman, A; Gunawardane, V
1999-01-01
To determine what information sources are used during a remote diagnosis task. Experienced trauma care providers viewed segments of videotaped initial trauma patient resuscitation and airway management. Experiment 1 collected responses from anesthesiologists to probing questions during and after the presentation of recorded video materials. Experiment 2 collected the responses from three types of care providers (anesthesiologists, nurses, and surgeons). Written and verbal responses were scored according to detection of critical events in video materials and categorized according to their content. Experiment 3 collected visual scanning data using an eyetracker during the viewing of recorded video materials from the three types of care providers. Eye-gaze data were analyzed in terms of focus on various parts of the videotaped materials. Care providers were found to be unable to detect several critical events. The three groups of subjects studied (anesthesiologists, nurses, and surgeons) focused on different aspects of videotaped materials. When the remote events and activities are multidisciplinary and rapidly changing, experts linked with audio-video-data connections may encounter difficulties in comprehending remote activities, and their information usage may be biased. Special training is needed for the remote decision-maker to appreciate tasks outside his or her speciality and beyond the boundaries of traditional divisions of labor.
Analysis and suppression of passive noise in surface microseismic data
NASA Astrophysics Data System (ADS)
Forghani-Arani, Farnoush
Surface microseismic surveys are gaining popularity for monitoring the hydraulic fracturing process. The effectiveness of these surveys, however, is strongly dependent on the signal-to-noise ratio of the acquired data. Cultural and industrial noise generated during hydraulic fracturing operations usually dominates the data, thereby decreasing their effectiveness in identifying and locating microseismic events. Hence, noise suppression is a critical step in surface microseismic monitoring. In this thesis, I focus on two important aspects of using surface-recorded microseismic data: first, I take advantage of the unwanted surface noise to understand its characteristics and extract information about the propagation medium from it; second, I propose effective techniques to suppress the surface noise while preserving the waveforms that contain information about the source of the microseisms. Automated event identification on passive seismic data using only a few receivers is challenging, especially when the record lengths span long durations of time. I introduce an automatic event identification algorithm designed specifically for detecting events in passive data acquired with a small number of receivers. I demonstrate that the conventional STA/LTA (Short-Term Average/Long-Term Average) algorithm is not sufficiently effective for event detection in the common case of low signal-to-noise ratio. With a cross-correlation based method as an extension of the STA/LTA algorithm, even low signal-to-noise events (not detectable with conventional STA/LTA) were revealed. Surface microseismic data contain surface waves (generated primarily by hydraulic fracturing activities) and body waves in the form of microseismic events. It is challenging to analyze the surface waves in the recorded data directly because of the randomness of their sources and their unknown source signatures.
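The conventional STA/LTA detector mentioned above can be sketched as a sliding ratio of short-term to long-term averages of the rectified trace; peaks in the ratio flag candidate events. The window lengths and synthetic trace below are illustrative, not the thesis's parameters.

```python
def sta_lta(trace, n_sta, n_lta):
    """STA/LTA event detector: the ratio of the short-term average
    to the long-term average of the rectified signal. High ratios
    mark sudden energy increases, i.e. candidate events."""
    ratios = []
    for i in range(n_lta, len(trace)):
        sta = sum(abs(x) for x in trace[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in trace[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# synthetic noise with an embedded burst standing in for an event
trace = [0.1] * 200 + [2.0] * 20 + [0.1] * 200
ratios = sta_lta(trace, n_sta=10, n_lta=100)
print(max(ratios) > 3.0)  # True: the burst is flagged
```

When the event amplitude is comparable to the noise, this ratio barely rises above its background value of about 1, which motivates the thesis's cross-correlation extension for low signal-to-noise events.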
I use seismic interferometry to extract the surface-wave arrivals. Interferometry is a powerful tool for extracting waves (including body waves and surface waves) that propagate from any receiver in the array (called a pseudo source) to the other receivers across the array. Since most of the noise sources in surface microseismic data lie on the surface, seismic interferometry yields pseudo-source gathers dominated by surface-wave energy. The dispersive characteristics of these surface waves are important properties that can be used to extract the information necessary for suppressing them. I demonstrate the application of interferometry to surface passive data recorded during the hydraulic fracturing operation of a tight gas reservoir and extract the dispersion properties of the surface waves corresponding to a pseudo-shot gather. Comparison of the dispersion characteristics of the surface waves from the pseudo-shot gather with those of an active shot gather shows interesting similarities and differences. The dispersion character (e.g., velocity change with frequency) of the fundamental mode was observed to behave the same for both the active and passive data. However, for the higher-mode surface waves, the dispersion properties are extracted at different frequency ranges. Conventional noise suppression techniques for passive data are mostly stacking-based: they rely on enhancing the signal amplitude by stacking the waveforms across receivers and are unable to preserve the waveforms at the individual receivers necessary for estimating the microseismic source location and source mechanism. Here, I introduce a technique based on the tau-p transform that effectively identifies and separates microseismic events from surface-wave noise in the tau-p domain. This technique is superior to conventional stacking-based noise suppression techniques because it preserves the waveforms at individual receivers.
Application of this methodology to microseismic events with isotropic and double-couple source mechanisms shows substantial improvement in the signal-to-noise ratio. Imaging of the processed field data also shows improved imaging of the hypocenter location of the microseismic source. For the double-couple source mechanism, I suggest two approaches for unifying the polarities at the receivers: a cross-correlation approach and a semblance-based prediction approach. The semblance-based approach is more effective at unifying the polarities, especially for low signal-to-noise ratio data.
Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin
2013-04-15
There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost-efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process on a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by the registered nurses. Records containing a potential adverse event were forwarded to the physicians for review in stage 2. The physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage, all adverse events found with the two methods together were compared, and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records, compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively.
More adverse events were identified using the "Harvard Medical Practice Study" method than using the "Global Trigger Tool". Differences in review methodology, perception of less severe adverse events and context knowledge may explain the observed difference between two expert review teams in the detection of adverse events.
NASA Astrophysics Data System (ADS)
Edwards, A. W.; Blackler, K.; Gill, R. D.; van der Goot, E.; Holm, J.
1990-10-01
Based upon the experience gained with the present soft x-ray data acquisition system, new techniques are being developed which make extensive use of digital signal processors (DSPs). Digital filters make 13 further frequencies available in real time from the input sampling frequency of 200 kHz. In parallel, various algorithms running on further DSPs generate triggers in response to a range of events in the plasma. The sawtooth crash can be detected, for example, with a delay of only 50 μs from the onset of the collapse. The trigger processor interacts with the digital filter boards to ensure that data of the appropriate frequency are recorded throughout a plasma discharge. An independent link is used to pass 780 and 24 Hz filtered data to a network of transputers. A full tomographic inversion and display of the 24 Hz data is carried out in real time using this 15-transputer array. The 780 Hz data are stored for immediate detailed playback following the pulse. Such a system could considerably improve the quality of present plasma diagnostic data, which are, in general, sampled at one fixed frequency throughout a discharge. Further, it should provide valuable information for designing diagnostic data acquisition systems for future long-pulse machines, when a high degree of real-time processing will be required, while retaining the ability to detect, record, and analyze events of interest within such long plasma discharges.
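The quoted numbers are consistent with a cascade of halving stages: 200 kHz divided by 2^8 is 781.25 Hz and by 2^13 is about 24.4 Hz, matching the 780 and 24 Hz channels and the "13 further frequencies". A toy decimate-by-two stage (a crude average-and-downsample, not the actual DSP filter design used on the hardware) can be sketched as:

```python
def decimate_by_two(samples):
    """One halving stage: average adjacent pairs (a crude
    anti-alias low-pass), then keep one value per pair."""
    return [(samples[i] + samples[i + 1]) / 2.0
            for i in range(0, len(samples) - 1, 2)]

# successive stages derive ever lower sampling frequencies
rate_hz, rates = 200_000.0, []
for _ in range(13):
    rate_hz /= 2.0
    rates.append(rate_hz)
print(rates[7])    # 781.25 Hz, close to the quoted 780 Hz channel
print(rates[12])   # 24.4140625 Hz, close to the quoted 24 Hz channel
```

A real implementation would use properly designed FIR/IIR anti-alias filters on the DSP boards; the point here is only how one input rate yields a ladder of output rates.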
Distributed Digital Survey Logbook Built on GeoServer and PostGIS
NASA Astrophysics Data System (ADS)
Jovicic, Aleksandar; Castelli, Ana; Kljajic, Zoran
2013-04-01
Keeping track of events that happen during a survey (e.g., the position and time when instruments go into the water or come back on board, the depths from which samples are taken, or notes about equipment malfunctions and repairs) is essential for efficient post-processing and quality control of the collected data, especially in the case of suspicious measurements. Most scientists still use the good old paper method for such tasks and later transform the notes into digital form using spreadsheet applications. This approach looks "safer" (if a person is not confident in their computer skills), but in reality it turns out to be more error-prone (especially when it comes to position recording and variations of sexagesimal representations, or when there are no hints about which time zone was used for time recording). As cruises usually involve various teams that are not always interested in doing their own measurements at each station, keeping an eye on the current position is essential, especially if the cruise plan changes (due to bad weather or the discovery of some underwater feature that requires more attention than originally planned). Also, the position is usually displayed on only one monitor (as most GPS receivers provide just serial connectivity, and distributing such a signal to multiple clients requires devices that are not widespread on the computer equipment market), which can create a messy situation in the control room when everybody tries to write down the current position and time. To overcome all of these obstacles, the Distributed Digital Survey Logbook was implemented. It is built on the Open Geospatial Consortium (OGC)-compliant GeoServer, using a PostGIS database. It can handle geospatial content (charts and cruise plans) and record the vessel track and any kind of event that any member of the team wants to record.
As GeoServer allows distribution of position data to an unlimited number of clients (from traditional PCs and laptops to tablets and smartphones), it can reduce the pressure on the control room, whether all features are used or it serves merely as a remote display of the ship's position. If the vessel is equipped with an Internet link, the real-time situation can be distributed to experts on land, who can monitor progress and advise the chief scientist on how to overcome issues. Each scientist can set up their own predefined events and trigger them with one click, or use the free-text button and write down a note. The timestamp of each event is recorded, and in case triggering was delayed (e.g., the person was occupied with equipment preparation), a time-delay modifier is available. The position of an event is derived from its recorded timestamp, so all events that happen at a single station can be shown on the chart. Events can be filtered by contributor, so each team can get a view of its own stations only. The ETA at the next station and the activities planned there are also shown, so the crew can better estimate when to start preparing equipment. The presented solution shows the benefits that free software (e.g., GeoServer, PostGIS, OpenLayers, GeoTools) built according to OGC standards brings to the oceanographic community, especially in reducing development time and providing multi-platform access. The applicability of such solutions is not limited to on-board operations but can easily be extended to any task involving geospatial data.
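One core piece of the workflow above, placing a delayed or free-text event on the chart from its recorded timestamp, reduces to interpolating the recorded vessel track. A minimal sketch, assuming linear interpolation between track fixes (the fixes and coordinates below are invented):

```python
from datetime import datetime, timedelta

def position_at(track, when):
    """Locate an event on the vessel track by linear interpolation
    between the two recorded fixes that bracket its timestamp.
    track: time-sorted list of (datetime, lat, lon) tuples."""
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(track, track[1:]):
        if t0 <= when <= t1:
            f = (when - t0).total_seconds() / (t1 - t0).total_seconds()
            return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)
    raise ValueError("event time falls outside the recorded track")

start = datetime(2013, 4, 1, 12, 0)
track = [(start, 43.00, 16.00),
         (start + timedelta(minutes=10), 43.10, 16.20)]
lat, lon = position_at(track, start + timedelta(minutes=5))
# lat, lon is approximately (43.05, 16.10)
```

In the actual system this lookup would be a spatial query against the PostGIS track table rather than an in-memory list, but the timestamp-to-position logic is the same.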
NASA Astrophysics Data System (ADS)
Olgin, J. G.; Pennington, D. D.; Webb, N.
2017-12-01
A variety of models have been developed to better understand dust emissions, from the initial erosive event to entrainment and transport through the atmosphere. Many of these models analyze dust emissions by representing atmospheric and surface variables, such as wind and soil moisture respectively, in a numerical model to determine the resulting emissions. Three factors are particularly important to model: 1) the friction velocity threshold based on wind interactions with dust, 2) the horizontal flux (saltation) of dust, and 3) the vertical dust flux. While all of the existing models incorporate these processes, additional improvements are needed to yield results reflective of recorded dust-event data. Our investigation focuses on explicitly identifying specific land cover (LC) elements unique to the Chihuahuan Desert that contribute to aerodynamic roughness length (Zo), a main component of dust emission and key to producing results more representative of known dust events in semi-arid regions. These elements will be formulated into model inputs by conducting analyses (e.g., geostatistics) on field and satellite data to ascertain the core LC characteristics responsible for affecting wind velocities (e.g., wind shadowing effects) that are conducive to dust emissions. These inputs will be used in a modified program using the Weather Research and Forecasting (WRF) model to replicate previously recorded dust events. Results from this study are presented here.
Evaluation of advanced air bag deployment algorithm performance using event data recorders.
Gabler, Hampton C; Hinch, John
2008-10-01
This paper characterizes the field performance of occupant restraint systems designed with advanced air bag features, including those specified in US Federal Motor Vehicle Safety Standard (FMVSS) No. 208 for advanced air bags, through the use of Event Data Recorders (EDRs). Although advanced restraint systems have been extensively tested in the laboratory, we are only beginning to understand the performance of these systems in the field. Because EDRs record many of the inputs to the advanced air bag control module, these devices can provide unique insights into the field performance of air bags. The study was based on 164 advanced air bag cases extracted from NASS/CDS 2002-2006 with associated EDR data. In this dataset, advanced driver air bags were observed to deploy with 50% probability at a longitudinal delta-V of 9 mph for the first stage, and at 26 mph for both inflator stages. In general, advanced air bag performance was as expected; however, the study identified cases of air bag deployments at delta-Vs as low as 3-4 mph, non-deployments at delta-Vs over 26 mph, and possible delayed air bag deployments.
NASA Astrophysics Data System (ADS)
Maio, C. V.; Donnelly, J. P.; Sullivan, R.; Weidman, C. R.; Sheremet, V.
2014-12-01
The brevity of the instrumental record and the lack of detailed historical accounts are limiting factors in our understanding of the relationship between climate change and the frequency and intensity of extreme storm events. This study applied paleotempestologic and hydrographic methods to identify the mechanisms of storm-induced coarse-grain deposition and reconstruct a late Holocene storm record within Waquoit Bay, Massachusetts. Three sediment cores (6.0 m, 8.4 m, and 8.2 m) were collected in 3 m of water using a vibracore system. Grain sizes were measured along each core to identify coarse-grain anomalies that serve as a proxy for past storm events. An historical age model (1620-2011 AD) was developed based on Pb pollution chronomarkers derived from X-Ray Fluorescence bulk Pb data, equating to a sedimentation rate of 8-8.3 mm/yr (R2 = 0.99). A long-term (4000 to 275 years before present) sedimentation rate of 1.1-1.4 mm/yr (R2 = 0.89) was calculated based on twenty-four continuous-flow accelerator mass spectrometry 14C ages of marine bivalves. To determine hydrographic conditions within the embayment during storm events, current meters and tide gauges were deployed during Hurricane Irene (2011), which produced a measured storm surge of 88 cm above mean sea level. The buildup of storm water against the landward shoreline resulted in a measured 10 cm/s seaward-moving bottom current capable of transporting coarse sand eroded from the adjacent shoreface into the coring site. Modeled surges for eleven modern and historic storm events ranged in height from 0.37 m (2011) to 3.72 m (1635) above mean high water. The WAQ1, WAQ2, and WAQ3 cores recorded a total of 89, 139, and 137 positive anomalies exceeding the lower threshold and 15, 34, and 12 exceeding the upper threshold, respectively. Events recorded during the historic period coincide with documented storm events.
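The sedimentation rates above come from linear age-depth relationships; a minimal sketch fits an ordinary least-squares line through chronomarker depths and ages. The depths and ages below are invented round numbers chosen to give a rate of the reported order, not the study's data.

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x (stdlib only).
    Returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# hypothetical chronomarker depths (mm) and calendar ages (yr AD):
# age decreases going down-core, so the slope is negative
depths = [0.0, 800.0, 1600.0, 2400.0, 3200.0]
ages = [2011.0, 1911.0, 1811.0, 1711.0, 1611.0]
intercept, slope = linear_fit(depths, ages)
rate_mm_per_yr = -1.0 / slope  # depth accumulated per year
print(rate_mm_per_yr)  # 8.0
```

The R-squared values quoted in the abstract measure how tightly the chronomarkers cluster around such a line; a high R-squared justifies treating the sedimentation rate as constant over the interval.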
The mean frequency within the three cores was 2.6 events per century applying the lower threshold and 0.44 events per century applying the upper threshold. The study has identified a previously understudied transport mechanism for the formation of storm-induced coarse-grain horizons and highlighted some of the challenges of utilizing shallow-water embayments as sites for storm reconstructions.
Collapse of Experimental Colloidal Aging using Record Dynamics
NASA Astrophysics Data System (ADS)
Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter
The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts that dynamics in an aging system progress with the logarithm of t/tw. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system: MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. These predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions RD makes about the underlying statistics of these rare events has been elusive. Here, the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.
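The collapse predicted by record dynamics can be illustrated with a toy observable that depends on t and tw only through log(t/tw): curves recorded at different waiting times then fall onto one master curve when plotted against that variable. The functional form below is illustrative, not the experimental MSD.

```python
import math

def toy_msd(t, tw, c=1.0):
    """Toy aging observable that grows as log(t/tw), the
    dependence record dynamics predicts for aging systems."""
    return c * math.log(t / tw)

# the same t/tw ratio at very different waiting times yields the
# same value, so curves plotted against log(t/tw) collapse
values = [toy_msd(50.0 * tw, tw) for tw in (10.0, 100.0, 1000.0)]
print(len(set(values)) == 1)  # True
```

Plotting real MSD or persistence data against log(t/tw), as in the analysis above, tests exactly this property: if the curves for different tw superimpose, the log(t/tw) scaling holds.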
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michaels, A.F.; Johnson, R.J.; Siegel, D.A.
1993-06-01
This paper compares a recent atmospheric wet deposition record (including all measurable daily rainfall events between October 1988 and June 1991) with concurrent measurements of nitrogen cycling and biomass at the U.S. Joint Global Ocean Flux Study Bermuda Atlantic Time Series Study station. The two data sets, among the most complete synoptic records of atmospheric nitrogen deposition and ocean nitrogen cycling, provide an opportunity to directly assess the importance of nitrogen deposition in the ocean. The results indicate that individual nitrogen wet deposition events are usually small compared to the ambient nitrogen cycle and that only under sustained calm conditions following large deposition events will nitrogen deposition processes be an important signal for the understanding of ocean biochemistry. 46 refs., 7 figs.
Aad, G.; Abbott, B.; Abdallah, J.; ...
2011-03-01
A measurement of the production cross-section for top quark pairs ($t\bar{t}$) in pp collisions at √s = 7 TeV is presented using data recorded with the ATLAS detector at the Large Hadron Collider. Events are selected in two different topologies: single lepton (electron e or muon μ) with large missing transverse energy and at least four jets, and dilepton (ee, μμ or eμ) with large missing transverse energy and at least two jets. In a data sample of 2.9 pb⁻¹, 37 candidate events are observed in the single-lepton topology and 9 events in the dilepton topology. The corresponding expected backgrounds from non-$t\bar{t}$ Standard Model processes are estimated using data-driven methods and determined to be 12.2 ± 3.9 events and 2.5 ± 0.6 events, respectively.
Richard L. Everett; Richard Schellhaas; Pete Ohlson
2000-01-01
Fire scar and stand cohort records were used to estimate the number and timing of fire disturbance events that impacted riparian and adjacent sideslope forests in the Douglas-fir series. Data were gathered from 49 stream segments on 24 separate streams on the east slope of the Washington Cascade Range. Upslope forests had more traceable disturbance events than riparian...
NASA Technical Reports Server (NTRS)
Ziemke, J. R.; Olsen, M. A.; Witte, J. C.; Douglass, A. R.; Strahan, S. E.; Wargan, K.; Liu, X.; Schoeberl, M. R.; Yang, K.; Kaplan, T. B.;
2013-01-01
Measurements from the Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS), both onboard the Aura spacecraft, have been used to produce daily global maps of column and profile ozone since August 2004. Here we compare and evaluate three strategies to obtain daily maps of tropospheric and stratospheric ozone from OMI and MLS measurements: trajectory mapping, direct profile retrieval, and data assimilation. Evaluation is based upon an assessment that includes validation using ozonesondes and comparisons with the Global Modeling Initiative (GMI) chemical transport model (CTM). We investigate applications of the three ozone data products from near-decadal and inter-annual timescales to day-to-day case studies. Zonally averaged inter-annual changes in tropospheric ozone from all of the products in any latitude range are of the order 1-2 Dobson Units, while changes (increases) over the 8-year Aura record investigated (http://eospso.gsfc.nasa.gov/atbd-category/49) vary approximately 2-4 Dobson Units. It is demonstrated that all of the ozone products can measure and monitor exceptional tropospheric ozone events including major forest fire and pollution transport events. Stratospheric ozone during the Aura record has several anomalous inter-annual events including stratospheric warming split events in the Northern Hemisphere extra-tropics that are well captured using the data assimilation ozone profile product. Data assimilation with continuous daily global coverage and vertical ozone profile information is the best of the three strategies at generating a global tropospheric and stratospheric ozone product for science applications.
Detection of cough signals in continuous audio recordings using hidden Markov models.
Matos, Sergio; Birring, Surinder S; Pavord, Ian D; Evans, David H
2006-06-01
Cough is a common symptom of many respiratory diseases. The evaluation of its intensity and frequency of occurrence could provide valuable clinical information in the assessment of patients with chronic cough. In this paper we propose the use of hidden Markov models (HMMs) to automatically detect cough sounds from continuous ambulatory recordings. The recording system consists of a digital sound recorder and a microphone attached to the patient's chest. The recognition algorithm follows a keyword-spotting approach, with cough sounds representing the keywords. It was trained on 821 min selected from 10 ambulatory recordings, including 2473 manually labeled cough events, and tested on a database of nine recordings from separate patients with a total recording time of 3060 min and comprising 2155 cough events. The average detection rate was 82% at a false alarm rate of seven events/h, when considering only events above an energy threshold relative to each recording's average energy. These results suggest that HMMs can be applied to the detection of cough sounds from ambulatory patients. A postprocessing stage to perform a more detailed analysis on the detected events is under development, and could allow the rejection of some of the incorrectly detected events.
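The operating point reported above (82% detection at seven false alarms per hour over 3060 min) can be reproduced with a short scoring sketch. The raw counts `detected_true=1767` and `false_alarms=357` are illustrative values chosen to be consistent with the reported rates, not figures taken from the paper:

```python
# Post-hoc scoring of an event detector against manually labeled events.
def score(detected_true, total_true, false_alarms, minutes):
    """Return (detection rate, false alarms per hour) for a test recording."""
    detection_rate = detected_true / total_true
    false_alarm_rate_per_hour = false_alarms / (minutes / 60.0)
    return detection_rate, false_alarm_rate_per_hour

# 2155 labeled cough events over 3060 min (51 h), per the abstract;
# the detector counts below are hypothetical but match ~82% and ~7/h.
rate, fa = score(detected_true=1767, total_true=2155,
                 false_alarms=357, minutes=3060)
assert abs(rate - 0.82) < 0.005
assert abs(fa - 7.0) < 0.05
```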
Sherman, Maxwell A; Lee, Shane; Law, Robert; Haegens, Saskia; Thorn, Catherine A; Hämäläinen, Matti S; Moore, Christopher I; Jones, Stephanie R
2016-08-16
Human neocortical 15-29-Hz beta oscillations are strong predictors of perceptual and motor performance. However, the mechanistic origin of beta in vivo is unknown, hindering understanding of its functional role. Combining human magnetoencephalography (MEG), computational modeling, and laminar recordings in animals, we present a new theory that accounts for the origin of spontaneous neocortical beta. In our MEG data, spontaneous beta activity from somatosensory and frontal cortex emerged as noncontinuous beta events typically lasting <150 ms with a stereotypical waveform. Computational modeling uniquely designed to infer the electrical currents underlying these signals showed that beta events could emerge from the integration of nearly synchronous bursts of excitatory synaptic drive targeting proximal and distal dendrites of pyramidal neurons, where the defining feature of a beta event was a strong distal drive that lasted one beta period (∼50 ms). This beta mechanism rigorously accounted for the beta event profiles; several other mechanisms did not. The spatial location of synaptic drive in the model to supragranular and infragranular layers was critical to the emergence of beta events and led to the prediction that beta events should be associated with a specific laminar current profile. Laminar recordings in somatosensory neocortex from anesthetized mice and awake monkeys supported these predictions, suggesting this beta mechanism is conserved across species and recording modalities. These findings make several predictions about optimal states for perceptual and motor performance and guide causal interventions to modulate beta for optimal function.
Thompson, Deborah L; Makvandi, Monear; Baumbach, Joan
2013-02-01
In New Mexico, voluntary submission of central line-associated bloodstream infection (CLABSI) surveillance data via the National Healthcare Safety Network (NHSN) began in July 2008. Validation of CLABSI data is necessary to ensure quality, accuracy, and reliability of surveillance efforts. We conducted a retrospective medical record review of 123 individuals with positive blood cultures who were admitted to adult intensive care units (ICU) at 6 New Mexico hospitals between November 2009 and March 2010. Blinded reviews were conducted independently by pairs of reviewers using standardized data collection instruments. Findings were compared between reviewers and with NHSN data. Discordant cases were reviewed and reconciled with hospital infection preventionists. Initially, 118 individuals were identified for medical record review. Seven ICU CLABSI events were identified by the reviewers. Data submitted to the NHSN revealed 8 ICU CLABSI events, 5 of which had not been identified for medical record review and 3 of which had been determined by reviewers to not be ICU CLABSI cases. Comparison of final case determinations for all 123 individuals with NHSN data resulted in a sensitivity of 66.7%, specificity of 100%, positive predictive value of 100%, and negative predictive value of 96.5% for ICU CLABSI surveillance. There is need for ongoing quality improvement and validation processes to ensure accurate NHSN data. Copyright © 2013 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
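The four validation statistics reported above follow from a standard confusion matrix, treating the reviewers' final case determinations as the reference standard and the NHSN submissions as the test. The counts below are one confusion matrix consistent with the reported values for N = 123, reconstructed for illustration rather than taken from the paper:

```python
# Surveillance-validation scorecard from a 2x2 confusion matrix.
def validation_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV of a surveillance system."""
    return {
        "sensitivity": tp / (tp + fn),  # true cases the system reported
        "specificity": tn / (tn + fp),  # non-cases it correctly omitted
        "ppv": tp / (tp + fp),          # reported events that were real
        "npv": tn / (tn + fn),          # omissions that were correct
    }

# Hypothetical counts consistent with the abstract: 8 + 0 + 4 + 111 = 123.
m = validation_metrics(tp=8, fp=0, fn=4, tn=111)
assert round(m["sensitivity"] * 100, 1) == 66.7
assert m["specificity"] == 1.0 and m["ppv"] == 1.0
assert round(m["npv"] * 100, 1) == 96.5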
NASA Astrophysics Data System (ADS)
Kämpf, Lucas; Brauer, Achim; Mueller, Philip; Güntner, Andreas; Merz, Bruno
2015-04-01
The relation between changing climate and the occurrence of strong flood events has been controversially debated over the last years. One major limitation in this respect is the temporal extent of instrumental flood time series, rarely exceeding 50-100 years, which is too short to reflect the full range of natural climate variability in a region. Therefore, geoarchives are increasingly explored as natural flood recorders reaching far beyond the range of instrumental flood time series. Annually laminated (varved) lake sediments provide particularly valuable archives since (i) lakes form ideal traps in the landscape, continuously recording sediment flux from the catchment, and (ii) individual flood events are recorded as detrital layers and can be dated with seasonal precision by varve counting. Despite the great potential of varved lake sediments for reconstructing long flood time series, there are still some confinements with respect to their interpretation due to a lack of understanding of the processes controlling the formation of detrital layers. For this purpose, we investigated the formation of detrital flood layers in Lake Mondsee (Upper Austria) in great detail by monitoring flood-related sediment flux and comparing detrital layers in sub-recent sediments with river runoff data. Sediment flux at the lake bottom was trapped over a three-year period (2011-2013) at two locations in Lake Mondsee, one located 0.9 km off the main inflow (proximal) and one in a more distal position at a distance of 2.8 km. The monitoring data include 26 floods of different amplitude (max. hourly discharge = 10-110 m³/s) which triggered variable fluxes of catchment sediment to the lake floor (4-760 g/(m²·d)). The comparison of runoff and sediment data revealed empirical runoff thresholds for triggering significant detrital sediment influx to the proximal (20 m³/s) and distal lake basin (30 m³/s) and an exponential relation between runoff amplitude and the amount of deposited sediment.
A succession of 20 sub-millimetre to at most 8 mm thick flood-triggered detrital layers, deposited between 1976 and 2005, was detected in two varved surface sediment cores from the same locations as the sediment traps. Calibration of the detrital layer record with river runoff data revealed empirical thresholds for flood layer deposition. These thresholds are higher than those for trapped sediment flux but, similarly to the trap results, increase from the proximal (50-60 m³/s; daily mean = 40 m³/s) to the distal lake basin (80 m³/s, 2 days > 40 m³/s). Three flood events above the threshold for detrital layer formation in the proximal basin and one in the distal basin were also recorded during the monitoring period. These events resulted in exceptional sediment transfer to the lake of more than 400 g/m² at both sites, which is therefore interpreted as the minimum sediment amount for producing a visible detrital layer.
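The two-tier threshold structure described above can be sketched as a simple screening rule. Only the thresholds come from the abstract (the proximal detrital-layer value is taken as the 55 m³/s midpoint of the reported 50-60 range); the exponential runoff-sediment coefficients are invented for illustration:

```python
import math

# Empirical peak-runoff thresholds (m^3/s) reported for Lake Mondsee.
SEDIMENT_FLUX_THRESHOLD = {"proximal": 20.0, "distal": 30.0}
DETRITAL_LAYER_THRESHOLD = {"proximal": 55.0, "distal": 80.0}

def classify_flood(peak_runoff, site):
    """Which sedimentary signals a flood of this peak runoff should leave."""
    return {
        "sediment_flux": peak_runoff >= SEDIMENT_FLUX_THRESHOLD[site],
        "detrital_layer": peak_runoff >= DETRITAL_LAYER_THRESHOLD[site],
    }

def sediment_load(peak_runoff, a=0.5, b=0.06):
    """Toy exponential runoff-sediment relation in g/m^2 (coefficients invented)."""
    return a * math.exp(b * peak_runoff)

# A 110 m^3/s flood registers at both sites; a 25 m^3/s flood leaves
# no signal in the distal basin.
assert classify_flood(110.0, "proximal") == {"sediment_flux": True, "detrital_layer": True}
assert classify_flood(25.0, "distal") == {"sediment_flux": False, "detrital_layer": False}
```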
NASA Astrophysics Data System (ADS)
Rasmussen, Sune O.; Bigler, Matthias; Blockley, Simon P.; Blunier, Thomas; Buchardt, Susanne L.; Clausen, Henrik B.; Cvijanovic, Ivana; Dahl-Jensen, Dorthe; Johnsen, Sigfus J.; Fischer, Hubertus; Gkinis, Vasileios; Guillevic, Myriam; Hoek, Wim Z.; Lowe, J. John; Pedro, Joel B.; Popp, Trevor; Seierstad, Inger K.; Steffensen, Jørgen Peder; Svensson, Anders M.; Vallelonga, Paul; Vinther, Bo M.; Walker, Mike J. C.; Wheatley, Joe J.; Winstrup, Mai
2014-12-01
Due to their outstanding resolution and well-constrained chronologies, Greenland ice-core records provide a master record of past climatic changes throughout the Last Interglacial-Glacial cycle in the North Atlantic region. As part of the INTIMATE (INTegration of Ice-core, MArine and TErrestrial records) project, protocols have been proposed to ensure consistent and robust correlation between different records of past climate. A key element of these protocols has been the formal definition and ordinal numbering of the sequence of Greenland Stadials (GS) and Greenland Interstadials (GI) within the most recent glacial period. The GS and GI periods are the Greenland expressions of the characteristic Dansgaard-Oeschger events that represent cold and warm phases of the North Atlantic region, respectively. We present here a more detailed and extended GS/GI template for the whole of the Last Glacial period. It is based on a synchronization of the NGRIP, GRIP, and GISP2 ice-core records that allows the parallel analysis of all three records on a common time scale. The boundaries of the GS and GI periods are defined based on a combination of stable-oxygen isotope ratios of the ice (δ18O, reflecting mainly local temperature) and calcium ion concentrations (reflecting mainly atmospheric dust loading) measured in the ice. The data not only resolve the well-known sequence of Dansgaard-Oeschger events that were first defined and numbered in the ice-core records more than two decades ago, but also better resolve a number of short-lived climatic oscillations, some defined here for the first time. Using this revised scheme, we propose a consistent approach for discriminating and naming all the significant abrupt climatic events of the Last Glacial period that are represented in the Greenland ice records. The final product constitutes an extended and better resolved Greenland stratotype sequence, against which other proxy records can be compared and correlated. 
It also provides a more secure basis for investigating the dynamics and fundamental causes of these climatic perturbations.
After the data breach: Managing the crisis and mitigating the impact.
Brown, Hart S
2016-01-01
Historically, the unauthorised access and theft of information was a tactic used between countries as part of espionage campaigns, during times of conflict as well as for personal and criminal purposes. The consumers of the information were relatively isolated and specific. As information became stored and digitised in larger quantities in the 1980s, the ability to access mass amounts of records at one time became possible. The expertise needed to remotely access and exfiltrate the data was not readily available and the number of markets to monetise the data was limited. Over the past ten years, shadow networks have been used by criminals to collaborate on hacking techniques, exchange hacking advice anonymously and commercialise data on the black market. The intersection of these networks along with the unintentional losses of information has resulted in 5,810 data breaches made public since 2005 (comprising some 847,807,830 records), and the frequency of these events is increasing. Organisations must be prepared for a potential breach event to maintain cyber resiliency. Proper management of a breach response can reduce response costs and can serve to mitigate potential reputational losses.
Regional and teleseismic events recorded across the TESZ during POLONAISE'97
NASA Astrophysics Data System (ADS)
Wilde-Piórko, M.; Grad, M.; Polonaise Working Group
1999-12-01
During POLONAISE'97, 20 Polish short-period three-component stations operated continuously for three weeks in the contact zone between the Palaeozoic and Precambrian platforms in Poland. The distances between seismometers were about 20 km and the digitization interval was 0.02 s. Besides the shots, a few regional events from the Lubin area and teleseismic events mainly from the SE backazimuth were also recorded. Interpretation of traveltimes of P and S waves for regional events, using a simplified LT-7 model of crustal structure for the theoretical calculation, allowed correction of their origin times. The same model can also explain the traveltime residuals of P waves for teleseismic events. The main features of the division of Poland into two platforms by the Teisseyre-Tornquist tectonic zone (TTZ) are seen both in the shape of residuals of teleseismic phases and in the receiver function. The passive seismic experiment made during POLONAISE'97 as a reconnaissance for the future teleseismic tomography experiment TOR-2 gave quite promising results; however, to carry out traveltime tomography and receiver function analysis, the duration of data acquisition should be about half a year.
Lawhern, Vernon; Hairston, W David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
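As a rough Python analogue of what the (MATLAB) DETECT toolbox does, the function below marks contiguous intervals where a signal departs from a baseline by more than a threshold. The thresholding rule and sample data are invented for illustration and far simpler than DETECT's trained, multi-class models:

```python
# Mark event intervals: contiguous runs of samples deviating from baseline.
def detect_intervals(signal, baseline, threshold):
    """Return (start, end) index pairs where |x - baseline| > threshold."""
    intervals, start = [], None
    for i, x in enumerate(signal):
        active = abs(x - baseline) > threshold
        if active and start is None:
            start = i                      # event onset
        elif not active and start is not None:
            intervals.append((start, i))   # event offset (end-exclusive)
            start = None
    if start is not None:                  # event still active at end of record
        intervals.append((start, len(signal)))
    return intervals

# Two deviations from a zero baseline yield two event intervals.
sig = [0.0, 0.1, 2.5, 2.7, 0.2, 0.0, 3.1, 0.1]
assert detect_intervals(sig, baseline=0.0, threshold=1.0) == [(2, 4), (6, 7)]
```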
FORTE Compact Intra-cloud Discharge Detection parameterized by Peak Current
NASA Astrophysics Data System (ADS)
Heavner, M. J.; Suszcynsky, D. M.; Jacobson, A. R.; Heavner, B. D.; Smith, D. A.
2002-12-01
The Los Alamos Sferic Array (EDOT) has recorded over 3.7 million lightning-related fast electric field change data records during April 1 - August 31, 2001 and 2002. The events were detected by three or more stations, allowing for differential-time-of-arrival location determination. The waveforms are characterized with estimated peak currents as well as by event type. Narrow Bipolar Events (NBEs), the VLF/LF signature of Compact Intra-cloud Discharges (CIDs), are generally isolated pulses with identifiable ionospheric reflections, permitting determination of event source altitudes. We briefly review the EDOT characterization of events. The FORTE satellite observes Trans-Ionospheric Pulse Pairs (TIPPs, the VHF satellite signature of CIDs). The subset of coincident EDOT and FORTE CID observations are compared with the total EDOT CID database to characterize the VHF detection efficiency of CIDs. The NBE polarity and altitude are also examined in the context of FORTE TIPP detection. The parameter-dependent detection efficiencies are extrapolated from FORTE orbit to GPS orbit in support of the V-GLASS effort (GPS based global detection of lightning).
Using Satellite Data to Monitor the Impacts of CyanoHAB Events on Drinking Water: A Texas Case Study
Overview of CYAN and its mission to support the environmental management and public use of U.S. lakes and estuaries by providing a capability for detecting and quantifying algal blooms and related water quality using satellite data records.
Locating Local Earthquakes Using Single 3-Component Broadband Seismological Data
NASA Astrophysics Data System (ADS)
Das, S. B.; Mitra, S.
2015-12-01
We devised a technique to locate local earthquakes using a single 3-component broadband seismograph and analyzed the factors governing the accuracy of our results. The need for such a technique arises in regions with a sparse seismic network. State-of-the-art location algorithms require recordings from a minimum of three stations to obtain well-resolved locations. The problem arises when an event is recorded by fewer than three stations. This may happen for the following reasons: (a) down time of stations in a sparse network; (b) geographically isolated regions with limited logistic support for setting up a large network; (c) regions without the economic means to finance a multi-station network; and (d) poor signal-to-noise ratio for smaller events at all stations except the one in their closest vicinity. Our technique provides a workable solution to these scenarios, although it is strongly dependent on the velocity model of the region. The method uses three processing steps: (a) ascertain the back-azimuth of the event from the P-wave particle motion recorded on the horizontal components; (b) estimate the hypocentral distance from the S-P time; and (c) ascertain the emergent angle from the vertical and radial components. One can then ray-trace through the 1-D velocity model to estimate the hypocentral location. We tested the method on synthetic data, where it produced results with 99% precision. With observed data, the accuracy of our results is very encouraging; the precision depends on the signal-to-noise ratio (SNR) and on the choice of the right band-pass filter to isolate the P-wave signal. We applied our method to minor aftershocks (3 < mb < 4) of the 2011 Sikkim earthquake using data from the Sikkim Himalayan network. The locations of these events highlight the transverse strike-slip structure within the Indian plate, which was also observed in source mechanism studies of the mainshock and larger aftershocks.
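Steps (a) and (b) of the single-station procedure can be sketched numerically. This is a schematic under strong assumptions: a uniform half-space with assumed vp and vs replaces ray tracing through the regional 1-D model, the particle-motion amplitudes are invented, and step (c), the emergent-angle estimate, is omitted:

```python
import math

def back_azimuth(north, east):
    """Step (a): back-azimuth (degrees clockwise from north) from P-wave
    first-motion amplitudes on the horizontal components. The 180-degree
    ambiguity of particle motion is ignored in this sketch."""
    return math.degrees(math.atan2(east, north)) % 360.0

def hypocentral_distance(sp_time, vp=6.0, vs=3.5):
    """Step (b): distance (km) from the S-P time, assuming uniform
    P and S velocities (km/s) rather than a layered 1-D model."""
    return sp_time / (1.0 / vs - 1.0 / vp)

# Invented first-motion amplitudes pointing NE, and an S-P time of 5 s.
baz = back_azimuth(north=1.0, east=1.0)
dist = hypocentral_distance(sp_time=5.0)
assert abs(baz - 45.0) < 1e-9
assert abs(dist - 42.0) < 0.5
```

With back-azimuth, distance, and (in the full method) the emergent angle in hand, the hypocentre follows from tracing the ray through the velocity model.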
Combining EEG, MIDI, and motion capture techniques for investigating musical performance.
Maidhof, Clemens; Kästner, Torsten; Makkonen, Tommi
2014-03-01
This article describes a setup for the simultaneous recording of electrophysiological data (EEG), musical data (MIDI), and three-dimensional movement data. Previously, each of these three kinds of measurements, conducted sequentially, has been proven to provide important information about different aspects of music performance as an example of a demanding multisensory motor skill. With the method described here, it is possible to record brain-related activity and movement data simultaneously, with accurate timing resolution and at relatively low cost. EEG and MIDI data were synchronized with a modified version of the FTAP software, which sends synchronization signals to the EEG recording device simultaneously with keypress events. Similarly, a motion capture system sent synchronization signals simultaneously with each recorded frame. The setup can be used for studies investigating cognitive and motor processes during music performance and music-like tasks, for example in the domains of motor control, learning, music therapy, or musical emotions. Thus, this setup offers a promising possibility for a more behaviorally driven analysis of brain activity.
Performance of the ATLAS trigger system in 2015.
Aaboud, M; Aad, G; Abbott, B; Abdallah, J; Abdinov, O; Abeloos, B; Aben, R; AbouZeid, O S; Abraham, N L; Abramowicz, H; Abreu, H; Abreu, R; Abulaiti, Y; Acharya, B S; Adachi, S; Adamczyk, L; Adams, D L; Adelman, J; Adomeit, S; Adye, T; Affolder, A A; Agatonovic-Jovin, T; Aguilar-Saavedra, J A; Ahlen, S P; Ahmadov, F; Aielli, G; Akerstedt, H; Åkesson, T P A; Akimov, A V; Alberghi, G L; Albert, J; Albrand, S; Alconada Verzini, M J; Aleksa, M; Aleksandrov, I N; Alexa, C; Alexander, G; Alexopoulos, T; Alhroob, M; Ali, B; Aliev, M; Alimonti, G; Alison, J; Alkire, S P; Allbrooke, B M M; Allen, B W; Allport, P P; Aloisio, A; Alonso, A; Alonso, F; Alpigiani, C; Alshehri, A A; Alstaty, M; Alvarez Gonzalez, B; Álvarez Piqueras, D; Alviggi, M G; Amadio, B T; Amaral Coutinho, Y; Amelung, C; Amidei, D; Amor Dos Santos, S P; Amorim, A; Amoroso, S; Amundsen, G; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anders, J K; Anderson, K J; Andreazza, A; Andrei, V; Angelidakis, S; Angelozzi, I; Angerami, A; Anghinolfi, F; Anisenkov, A V; Anjos, N; Annovi, A; Antel, C; Antonelli, M; Antonov, A; Antrim, D J; Anulli, F; Aoki, M; Aperio Bella, L; Arabidze, G; Arai, Y; Araque, J P; Arce, A T H; Arduh, F A; Arguin, J-F; Argyropoulos, S; Arik, M; Armbruster, A J; Armitage, L J; Arnaez, O; Arnold, H; Arratia, M; Arslan, O; Artamonov, A; Artoni, G; Artz, S; Asai, S; Asbah, N; Ashkenazi, A; Åsman, B; Asquith, L; Assamagan, K; Astalos, R; Atkinson, M; Atlay, N B; Augsten, K; Avolio, G; Axen, B; Ayoub, M K; Azuelos, G; Baak, M A; Baas, A E; Baca, M J; Bachacou, H; Bachas, K; Backes, M; Backhaus, M; Bagiacchi, P; Bagnaia, P; Bai, Y; Baines, J T; Bajic, M; Baker, O K; Baldin, E M; Balek, P; Balestri, T; Balli, F; Balunas, W K; Banas, E; Banerjee, Sw; Bannoura, A A E; Barak, L; Barberio, E L; Barberis, D; Barbero, M; Barillari, T; Barisits, M-S; Barklow, T; Barlow, N; Barnes, S L; Barnett, B M; Barnett, R M; Barnovska-Blenessy, Z; Baroncelli, A; Barone, G; Barr, A J; 
Barranco Navarro, L; Barreiro, F; Barreiro Guimarães da Costa, J; Bartoldus, R; Barton, A E; Bartos, P; Basalaev, A; Bassalat, A; Bates, R L; Batista, S J; Batley, J R; Battaglia, M; Bauce, M; Bauer, F; Bawa, H S; Beacham, J B; Beattie, M D; Beau, T; Beauchemin, P H; Bechtle, P; Beck, H P; Becker, K; Becker, M; Beckingham, M; Becot, C; Beddall, A J; Beddall, A; Bednyakov, V A; Bedognetti, M; Bee, C P; Beemster, L J; Beermann, T A; Begel, M; Behr, J K; Bell, A S; Bella, G; Bellagamba, L; Bellerive, A; Bellomo, M; Belotskiy, K; Beltramello, O; Belyaev, N L; Benary, O; Benchekroun, D; Bender, M; Bendtz, K; Benekos, N; Benhammou, Y; Benhar Noccioli, E; Benitez, J; Benjamin, D P; Bensinger, J R; Bentvelsen, S; Beresford, L; Beretta, M; Berge, D; Bergeaas Kuutmann, E; Berger, N; Beringer, J; Berlendis, S; Bernard, N R; Bernius, C; Bernlochner, F U; Berry, T; Berta, P; Bertella, C; Bertoli, G; Bertolucci, F; Bertram, I A; Bertsche, C; Bertsche, D; Besjes, G J; Bessidskaia Bylund, O; Bessner, M; Besson, N; Betancourt, C; Bethani, A; Bethke, S; Bevan, A J; Bianchi, R M; Bianco, M; Biebel, O; Biedermann, D; Bielski, R; Biesuz, N V; Biglietti, M; Bilbao De Mendizabal, J; Billoud, T R V; Bilokon, H; Bindi, M; Bingul, A; Bini, C; Biondi, S; Bisanz, T; Bjergaard, D M; Black, C W; Black, J E; Black, K M; Blackburn, D; Blair, R E; Blazek, T; Bloch, I; Blocker, C; Blue, A; Blum, W; Blumenschein, U; Blunier, S; Bobbink, G J; Bobrovnikov, V S; Bocchetta, S S; Bocci, A; Bock, C; Boehler, M; Boerner, D; Bogaerts, J A; Bogavac, D; Bogdanchikov, A G; Bohm, C; Boisvert, V; Bokan, P; Bold, T; Boldyrev, A S; Bomben, M; Bona, M; Boonekamp, M; Borisov, A; Borissov, G; Bortfeldt, J; Bortoletto, D; Bortolotto, V; Bos, K; Boscherini, D; Bosman, M; Bossio Sola, J D; Boudreau, J; Bouffard, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Boutle, S K; Boveia, A; Boyd, J; Boyko, I R; Bracinik, J; Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Breaden Madden, W D; 
Brendlinger, K; Brennan, A J; Brenner, L; Brenner, R; Bressler, S; Bristow, T M; Britton, D; Britzger, D; Brochu, F M; Brock, I; Brock, R; Brooijmans, G; Brooks, T; Brooks, W K; Brosamer, J; Brost, E; Broughton, J H; Bruckman de Renstrom, P A; Bruncko, D; Bruneliere, R; Bruni, A; Bruni, G; Bruni, L S; Brunt, B H; Bruschi, M; Bruscino, N; Bryant, P; Bryngemark, L; Buanes, T; Buat, Q; Buchholz, P; Buckley, A G; Budagov, I A; Buehrer, F; Bugge, M K; Bulekov, O; Bullock, D; Burckhart, H; Burdin, S; Burgard, C D; Burger, A M; Burghgrave, B; Burka, K; Burke, S; Burmeister, I; Burr, J T P; Busato, E; Büscher, D; Büscher, V; Bussey, P; Butler, J M; Buttar, C M; Butterworth, J M; Butti, P; Buttinger, W; Buzatu, A; Buzykaev, A R; Cabrera Urbán, S; Caforio, D; Cairo, V M; Cakir, O; Calace, N; Calafiura, P; Calandri, A; Calderini, G; Calfayan, P; Callea, G; Caloba, L P; Calvente Lopez, S; Calvet, D; Calvet, S; Calvet, T P; Camacho Toro, R; Camarda, S; Camarri, P; Cameron, D; Caminal Armadans, R; Camincher, C; Campana, S; Campanelli, M; Camplani, A; Campoverde, A; Canale, V; Canepa, A; Cano Bret, M; Cantero, J; Cao, T; Capeans Garrido, M D M; Caprini, I; Caprini, M; Capua, M; Carbone, R M; Cardarelli, R; Cardillo, F; Carli, I; Carli, T; Carlino, G; Carlson, B T; Carminati, L; Carney, R M D; Caron, S; Carquin, E; Carrillo-Montoya, G D; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Casolino, M; Casper, D W; Castaneda-Miranda, E; Castelijn, R; Castelli, A; Castillo Gimenez, V; Castro, N F; Catinaccio, A; Catmore, J R; Cattai, A; Caudron, J; Cavaliere, V; Cavallaro, E; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Cerda Alberich, L; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cervelli, A; Cetin, S A; Chafaq, A; Chakraborty, D; Chan, S K; Chan, Y L; Chang, P; Chapman, J D; Charlton, D G; Chatterjee, A; Chau, C C; Chavez Barajas, C A; Che, S; Cheatham, S; Chegwidden, A; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, K; 
Chen, S; Chen, S; Chen, X; Chen, Y; Cheng, H C; Cheng, H J; Cheng, Y; Cheplakov, A; Cheremushkina, E; Cherkaoui El Moursli, R; Chernyatin, V; Cheu, E; Chevalier, L; Chiarella, V; Chiarelli, G; Chiodini, G; Chisholm, A S; Chitan, A; Chizhov, M V; Choi, K; Chomont, A R; Chouridou, S; Chow, B K B; Christodoulou, V; Chromek-Burckhart, D; Chudoba, J; Chuinard, A J; Chwastowski, J J; Chytka, L; Ciapetti, G; Ciftci, A K; Cinca, D; Cindro, V; Cioara, I A; Ciocca, C; Ciocio, A; Cirotto, F; Citron, Z H; Citterio, M; Ciubancan, M; Clark, A; Clark, B L; Clark, M R; Clark, P J; Clarke, R N; Clement, C; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Colasurdo, L; Cole, B; Colijn, A P; Collot, J; Colombo, T; Compostella, G; Conde Muiño, P; Coniavitis, E; Connell, S H; Connelly, I A; Consorti, V; Constantinescu, S; Conti, G; Conventi, F; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Cormier, F; Cormier, K J R; Cornelissen, T; Corradi, M; Corriveau, F; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Cottin, G; Cowan, G; Cox, B E; Cranmer, K; Crawley, S J; Cree, G; Crépé-Renaudin, S; Crescioli, F; Cribbs, W A; Crispin Ortuzar, M; Cristinziani, M; Croft, V; Crosetti, G; Cueto, A; Cuhadar Donszelmann, T; Cummings, J; Curatolo, M; Cúth, J; Czirr, H; Czodrowski, P; D'amen, G; D'Auria, S; D'Onofrio, M; Da Cunha Sargedas De Sousa, M J; Da Via, C; Dabrowski, W; Dado, T; Dai, T; Dale, O; Dallaire, F; Dallapiccola, C; Dam, M; Dandoy, J R; Dang, N P; Daniells, A C; Dann, N S; Danninger, M; Dano Hoffmann, M; Dao, V; Darbo, G; Darmora, S; Dassoulas, J; Dattagupta, A; Davey, W; David, C; Davidek, T; Davies, M; Davison, P; Dawe, E; Dawson, I; De, K; de Asmundis, R; De Benedetti, A; De Castro, S; De Cecco, S; De Groot, N; de Jong, P; De la Torre, H; De Lorenzi, F; De Maria, A; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Vivie De Regie, J B; Dearnaley, W J; Debbe, R; Debenedetti, C; Dedovich, D V; Dehghanian, N; Deigaard, I; Del Gaudio, M; Del Peso, J; Del Prete, T; 
Delgove, D; Deliot, F; Delitzsch, C M; Dell'Acqua, A; Dell'Asta, L; Dell'Orso, M; Della Pietra, M; Della Volpe, D; Delmastro, M; Delsart, P A; DeMarco, D A; Demers, S; Demichev, M; Demilly, A; Denisov, S P; Denysiuk, D; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Deterre, C; Dette, K; Deviveiros, P O; Dewhurst, A; Dhaliwal, S; Di Ciaccio, A; Di Ciaccio, L; Di Clemente, W K; Di Donato, C; Di Girolamo, A; Di Girolamo, B; Di Micco, B; Di Nardo, R; Di Simone, A; Di Sipio, R; Di Valentino, D; Diaconu, C; Diamond, M; Dias, F A; Diaz, M A; Diehl, E B; Dietrich, J; Díez Cornell, S; Dimitrievska, A; Dingfelder, J; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; Djuvsland, J I; do Vale, M A B; Dobos, D; Dobre, M; Doglioni, C; Dolejsi, J; Dolezal, Z; Donadelli, M; Donati, S; Dondero, P; Donini, J; Dopke, J; Doria, A; Dova, M T; Doyle, A T; Drechsler, E; Dris, M; Du, Y; Duarte-Campderros, J; Duchovni, E; Duckeck, G; Ducu, O A; Duda, D; Dudarev, A; Dudder, A Chr; Duffield, E M; Duflot, L; Dührssen, M; Dumancic, M; Duncan, A K; Dunford, M; Duran Yildiz, H; Düren, M; Durglishvili, A; Duschinger, D; Dutta, B; Dyndal, M; Eckardt, C; Ecker, K M; Edgar, R C; Edwards, N C; Eifert, T; Eigen, G; Einsweiler, K; Ekelof, T; El Kacimi, M; Ellajosyula, V; Ellert, M; Elles, S; Ellinghaus, F; Elliot, A A; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Enari, Y; Endner, O C; Ennis, J S; Erdmann, J; Ereditato, A; Ernis, G; Ernst, J; Ernst, M; Errede, S; Ertel, E; Escalier, M; Esch, H; Escobar, C; Esposito, B; Etienvre, A I; Etzion, E; Evans, H; Ezhilov, A; Ezzi, M; Fabbri, F; Fabbri, L; Facini, G; Fakhrutdinov, R M; Falciano, S; Falla, R J; Faltova, J; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farina, C; Farina, E M; Farooque, T; Farrell, S; Farrington, S M; Farthouat, P; Fassi, F; Fassnacht, P; Fassouliotis, D; Faucci Giannelli, M; Favareto, A; Fawcett, W J; Fayard, L; Fedin, O L; Fedorko, W; Feigl, S; Feligioni, L; Feng, C; Feng, E J; Feng, H; Fenyuk, A B; Feremenga, L; 
Fernandez Martinez, P; Fernandez Perez, S; Ferrando, J; Ferrari, A; Ferrari, P; Ferrari, R; Ferreira de Lima, D E; Ferrer, A; Ferrere, D; Ferretti, C; Fiedler, F; Filipčič, A; Filipuzzi, M; Filthaut, F; Fincke-Keeler, M; Finelli, K D; Fiolhais, M C N; Fiorini, L; Fischer, A; Fischer, C; Fischer, J; Fisher, W C; Flaschel, N; Fleck, I; Fleischmann, P; Fletcher, G T; Fletcher, R R M; Flick, T; Flierl, B M; Flores Castillo, L R; Flowerdew, M J; Forcolin, G T; Formica, A; Forti, A; Foster, A G; Fournier, D; Fox, H; Fracchia, S; Francavilla, P; Franchini, M; Francis, D; Franconi, L; Franklin, M; Frate, M; Fraternali, M; Freeborn, D; Fressard-Batraneanu, S M; Friedrich, F; Froidevaux, D; Frost, J A; Fukunaga, C; Fullana Torregrosa, E; Fusayasu, T; Fuster, J; Gabaldon, C; Gabizon, O; Gabrielli, A; Gabrielli, A; Gach, G P; Gadatsch, S; Gagliardi, G; Gagnon, L G; Gagnon, P; Galea, C; Galhardo, B; Gallas, E J; Gallop, B J; Gallus, P; Galster, G; Gan, K K; Ganguly, S; Gao, J; Gao, Y; Gao, Y S; Garay Walls, F M; García, C; García Navarro, J E; Garcia-Sciveres, M; Gardner, R W; Garelli, N; Garonne, V; Gascon Bravo, A; Gasnikova, K; Gatti, C; Gaudiello, A; Gaudio, G; Gauthier, L; Gavrilenko, I L; Gay, C; Gaycken, G; Gazis, E N; Gecse, Z; Gee, C N P; Geich-Gimbel, Ch; Geisen, M; Geisler, M P; Gellerstedt, K; Gemme, C; Genest, M H; Geng, C; Gentile, S; Gentsos, C; George, S; Gerbaudo, D; Gershon, A; Ghasemi, S; Ghneimat, M; Giacobbe, B; Giagu, S; Giannetti, P; Gibson, S M; Gignac, M; Gilchriese, M; Gillam, T P S; Gillberg, D; Gilles, G; Gingrich, D M; Giokaris, N; Giordani, M P; Giorgi, F M; Giraud, P F; Giromini, P; Giugni, D; Giuli, F; Giuliani, C; Giulini, M; Gjelsten, B K; Gkaitatzis, S; Gkialas, I; Gkougkousis, E L; Gladilin, L K; Glasman, C; Glatzer, J; Glaysher, P C F; Glazov, A; Goblirsch-Kolb, M; Godlewski, J; Goldfarb, S; Golling, T; Golubkov, D; Gomes, A; Gonçalo, R; Goncalves Pinto Firmino Da Costa, J; Gonella, G; Gonella, L; Gongadze, A; González de la Hoz, S; 
Gonzalez-Sevilla, S; Goossens, L; Gorbounov, P A; Gordon, H A; Gorelov, I; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Goshaw, A T; Gössling, C; Gostkin, M I; Goudet, C R; Goujdami, D; Goussiou, A G; Govender, N; Gozani, E; Graber, L; Grabowska-Bold, I; Gradin, P O J; Grafström, P; Gramling, J; Gramstad, E; Grancagnolo, S; Gratchev, V; Gravila, P M; Gray, H M; Graziani, E; Greenwood, Z D; Grefe, C; Gregersen, K; Gregor, I M; Grenier, P; Grevtsov, K; Griffiths, J; Grillo, A A; Grimm, K; Grinstein, S; Gris, Ph; Grivaz, J-F; Groh, S; Gross, E; Grosse-Knetter, J; Grossi, G C; Grout, Z J; Guan, L; Guan, W; Guenther, J; Guescini, F; Guest, D; Gueta, O; Gui, B; Guido, E; Guillemin, T; Guindon, S; Gul, U; Gumpert, C; Guo, J; Guo, Y; Gupta, R; Gupta, S; Gustavino, G; Gutierrez, P; Gutierrez Ortiz, N G; Gutschow, C; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haber, C; Hadavand, H K; Haddad, N; Hadef, A; Hageböck, S; Hagihara, M; Hajduk, Z; Hakobyan, H; Haleem, M; Haley, J; Halladjian, G; Hallewell, G D; Hamacher, K; Hamal, P; Hamano, K; Hamilton, A; Hamity, G N; Hamnett, P G; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Haney, B; Hanke, P; Hanna, R; Hansen, J B; Hansen, J D; Hansen, M C; Hansen, P H; Hara, K; Hard, A S; Harenberg, T; Hariri, F; Harkusha, S; Harrington, R D; Harrison, P F; Hartjes, F; Hartmann, N M; Hasegawa, M; Hasegawa, Y; Hasib, A; Hassani, S; Haug, S; Hauser, R; Hauswald, L; Havranek, M; Hawkes, C M; Hawkings, R J; Hayakawa, D; Hayden, D; Hays, C P; Hays, J M; Hayward, H S; Haywood, S J; Head, S J; Heck, T; Hedberg, V; Heelan, L; Heim, S; Heim, T; Heinemann, B; Heinrich, J J; Heinrich, L; Heinz, C; Hejbal, J; Helary, L; Hellman, S; Helsens, C; Henderson, J; Henderson, R C W; Heng, Y; Henkelmann, S; Henriques Correia, A M; Henrot-Versille, S; Herbert, G H; Herde, H; Herget, V; Hernández Jiménez, Y; Herten, G; Hertenberger, R; Hervas, L; Hesketh, G G; Hessey, N P; Hetherly, J W; Higón-Rodriguez, E; Hill, E; Hill, J C; Hiller, K H; Hillier, S J; 
Hinchliffe, I; Hines, E; Hirose, M; Hirschbuehl, D; Hoad, X; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoenig, F; Hohn, D; Holmes, T R; Homann, M; Honda, T; Hong, T M; Hooberman, B H; Hopkins, W H; Horii, Y; Horton, A J; Hostachy, J-Y; Hou, S; Hoummada, A; Howarth, J; Hoya, J; Hrabovsky, M; Hristova, I; Hrivnac, J; Hryn'ova, T; Hrynevich, A; Hsu, P J; Hsu, S-C; Hu, Q; Hu, S; Huang, Y; Hubacek, Z; Hubaut, F; Huegging, F; Huffman, T B; Hughes, E W; Hughes, G; Huhtinen, M; Huo, P; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibragimov, I; Iconomidou-Fayard, L; Ideal, E; Idrissi, Z; Iengo, P; Igonkina, O; Iizawa, T; Ikai, T; Ikegami, Y; Ikeno, M; Ilchenko, Y; Iliadis, D; Ilic, N; Introzzi, G; Ioannou, P; Iodice, M; Iordanidou, K; Ippolito, V; Ishijima, N; Ishino, M; Ishitsuka, M; Ishmukhametov, R; Issever, C; Istin, S; Ito, F; Iturbe Ponce, J M; Iuppa, R; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jabbar, S; Jackson, B; Jackson, P; Jain, V; Jakobi, K B; Jakobs, K; Jakobsen, S; Jakoubek, T; Jamin, D O; Jana, D K; Jansky, R; Janssen, J; Janus, M; Janus, P A; Jarlskog, G; Javadov, N; Javůrek, T; Jeanneau, F; Jeanty, L; Jejelava, J; Jeng, G-Y; Jennens, D; Jenni, P; Jeske, C; Jézéquel, S; Ji, H; Jia, J; Jiang, H; Jiang, Y; Jiang, Z; Jiggins, S; Jimenez Pena, J; Jin, S; Jinaru, A; Jinnouchi, O; Jivan, H; Johansson, P; Johns, K A; Johnson, W J; Jon-And, K; Jones, G; Jones, R W L; Jones, S; Jones, T J; Jongmanns, J; Jorge, P M; Jovicevic, J; Ju, X; Juste Rozas, A; Köhler, M K; Kaczmarska, A; Kado, M; Kagan, H; Kagan, M; Kahn, S J; Kaji, T; Kajomovitz, E; Kalderon, C W; Kaluza, A; Kama, S; Kamenshchikov, A; Kanaya, N; Kaneti, S; Kanjir, L; Kantserov, V A; Kanzaki, J; Kaplan, B; Kaplan, L S; Kapliy, A; Kar, D; Karakostas, K; Karamaoun, A; Karastathis, N; Kareem, M J; Karentzos, E; Karnevskiy, M; Karpov, S N; Karpova, Z M; Karthik, K; Kartvelishvili, V; Karyukhin, A N; Kasahara, K; Kashif, L; Kass, R D; Kastanas, A; Kataoka, Y; 
Kato, C; Katre, A; Katzy, J; Kawade, K; Kawagoe, K; Kawamoto, T; Kawamura, G; Kazanin, V F; Keeler, R; Kehoe, R; Keller, J S; Kempster, J J; Keoshkerian, H; Kepka, O; Kerševan, B P; Kersten, S; Keyes, R A; Khader, M; Khalil-Zada, F; Khanov, A; Kharlamov, A G; Kharlamova, T; Khoo, T J; Khovanskiy, V; Khramov, E; Khubua, J; Kido, S; Kilby, C R; Kim, H Y; Kim, S H; Kim, Y K; Kimura, N; Kind, O M; King, B T; King, M; Kirk, J; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kiss, F; Kiuchi, K; Kivernyk, O; Kladiva, E; Klein, M H; Klein, M; Klein, U; Kleinknecht, K; Klimek, P; Klimentov, A; Klingenberg, R; Klioutchnikova, T; Kluge, E-E; Kluit, P; Kluth, S; Knapik, J; Kneringer, E; Knoops, E B F G; Knue, A; Kobayashi, A; Kobayashi, D; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Koffas, T; Koffeman, E; Köhler, N M; Koi, T; Kolanoski, H; Kolb, M; Koletsou, I; Komar, A A; Komori, Y; Kondo, T; Kondrashova, N; Köneke, K; König, A C; Kono, T; Konoplich, R; Konstantinidis, N; Kopeliansky, R; Koperny, S; Köpke, L; Kopp, A K; Korcyl, K; Kordas, K; Korn, A; Korol, A A; Korolkov, I; Korolkova, E V; Kortner, O; Kortner, S; Kosek, T; Kostyukhin, V V; Kotwal, A; Koulouris, A; Kourkoumeli-Charalampidi, A; Kourkoumelis, C; Kouskoura, V; Kowalewska, A B; Kowalewski, R; Kowalski, T Z; Kozakai, C; Kozanecki, W; Kozhin, A S; Kramarenko, V A; Kramberger, G; Krasnopevtsev, D; Krasny, M W; Krasznahorkay, A; Kravchenko, A; Kretz, M; Kretzschmar, J; Kreutzfeldt, K; Krieger, P; Krizka, K; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Krumnack, N; Kruse, M C; Kruskal, M; Kubota, T; Kucuk, H; Kuday, S; Kuechler, J T; Kuehn, S; Kugel, A; Kuger, F; Kuhl, T; Kukhtin, V; Kukla, R; Kulchitsky, Y; Kuleshov, S; Kuna, M; Kunigo, T; Kupco, A; Kurashige, H; Kurchaninov, L L; Kurochkin, Y A; Kurth, M G; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; Kwan, T; Kyriazopoulos, D; La Rosa, A; La Rosa Navarro, J L; Rotonda, L La; Lacasta, C; Lacava, F; Lacey, J; Lacker, H; Lacour, D; 
Lacuesta, V R; Ladygin, E; Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Lammers, S; Lampl, W; Lançon, E; Landgraf, U; Landon, M P J; Lanfermann, M C; Lang, V S; Lange, J C; Lankford, A J; Lanni, F; Lantzsch, K; Lanza, A; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Lasagni Manghi, F; Lassnig, M; Laurelli, P; Lavrijsen, W; Law, A T; Laycock, P; Lazovich, T; Lazzaroni, M; Le, B; Le Dortz, O; Le Guirriec, E; Le Quilleuc, E P; LeBlanc, M; LeCompte, T; Ledroit-Guillon, F; Lee, C A; Lee, S C; Lee, L; Lefebvre, B; Lefebvre, G; Lefebvre, M; Legger, F; Leggett, C; Lehan, A; Lehmann Miotto, G; Lei, X; Leight, W A; Leister, A G; Leite, M A L; Leitner, R; Lellouch, D; Lemmer, B; Leney, K J C; Lenz, T; Lenzi, B; Leone, R; Leone, S; Leonidopoulos, C; Leontsinis, S; Lerner, G; Leroy, C; Lesage, A A J; Lester, C G; Levchenko, M; Levêque, J; Levin, D; Levinson, L J; Levy, M; Lewis, D; Leyton, M; Li, B; Li, C; Li, H; Li, L; Li, L; Li, Q; Li, S; Li, X; Li, Y; Liang, Z; Liberti, B; Liblong, A; Lichard, P; Lie, K; Liebal, J; Liebig, W; Limosani, A; Lin, S C; Lin, T H; Lindquist, B E; Lionti, A E; Lipeles, E; Lipniacka, A; Lisovyi, M; Liss, T M; Lister, A; Litke, A M; Liu, B; Liu, D; Liu, H; Liu, H; Liu, J; Liu, J B; Liu, K; Liu, L; Liu, M; Liu, Y L; Liu, Y; Livan, M; Lleres, A; Llorente Merino, J; Lloyd, S L; Lo Sterzo, F; Lobodzinska, E M; Loch, P; Loebinger, F K; Loew, K M; Loginov, A; Lohse, T; Lohwasser, K; Lokajicek, M; Long, B A; Long, J D; Long, R E; Longo, L; Looper, K A; Lopez Lopez, J A; Lopez Mateos, D; Lopez Paredes, B; Lopez Paz, I; Lopez Solis, A; Lorenz, J; Lorenzo Martinez, N; Losada, M; Lösel, P J; Lou, X; Lounis, A; Love, J; Love, P A; Lu, H; Lu, N; Lubatti, H J; Luci, C; Lucotte, A; Luedtke, C; Luehring, F; Lukas, W; Luminari, L; Lundberg, O; Lund-Jensen, B; Luzi, P M; Lynn, D; Lysak, R; Lytken, E; Lyubushkin, V; Ma, H; Ma, L L; Ma, Y; Maccarrone, G; Macchiolo, A; Macdonald, C M; Maček, B; Machado Miguens, J; Madaffari, D; Madar, R; Maddocks, H J; Mader, W F; Madsen, 
A; Maeda, J; Maeland, S; Maeno, T; Maevskiy, A; Magradze, E; Mahlstedt, J; Maiani, C; Maidantchik, C; Maier, A A; Maier, T; Maio, A; Majewski, S; Makida, Y; Makovec, N; Malaescu, B; Malecki, Pa; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Malone, C; Maltezos, S; Malyukov, S; Mamuzic, J; Mancini, G; Mandelli, L; Mandić, I; Maneira, J; Manhaes de Andrade Filho, L; Manjarres Ramos, J; Mann, A; Manousos, A; Mansoulie, B; Mansour, J D; Mantifel, R; Mantoani, M; Manzoni, S; Mapelli, L; Marceca, G; March, L; Marchiori, G; Marcisovsky, M; Marjanovic, M; Marley, D E; Marroquim, F; Marsden, S P; Marshall, Z; Marti-Garcia, S; Martin, B; Martin, T A; Martin, V J; Martin Dit Latour, B; Martinez, M; Martinez Outschoorn, V I; Martin-Haugh, S; Martoiu, V S; Martyniuk, A C; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massa, L; Mastrandrea, P; Mastroberardino, A; Masubuchi, T; Mättig, P; Mattmann, J; Maurer, J; Maxfield, S J; Maximov, D A; Mazini, R; Maznas, I; Mazza, S M; Mc Fadden, N C; Mc Goldrick, G; Mc Kee, S P; McCarn, A; McCarthy, R L; McCarthy, T G; McClymont, L I; McDonald, E F; Mcfayden, J A; Mchedlidze, G; McMahon, S J; McNamara, P C; McPherson, R A; Medinnis, M; Meehan, S; Mehlhase, S; Mehta, A; Meier, K; Meineck, C; Meirose, B; Melini, D; Mellado Garcia, B R; Melo, M; Meloni, F; Menary, S B; Meng, L; Meng, X T; Mengarelli, A; Menke, S; Meoni, E; Mergelmeyer, S; Mermod, P; Merola, L; Meroni, C; Merritt, F S; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Meyer Zu Theenhausen, H; Miano, F; Middleton, R P; Miglioranzi, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Milesi, M; Milic, A; Miller, D W; Mills, C; Milov, A; Milstead, D A; Minaenko, A A; Minami, Y; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Minegishi, Y; Ming, Y; Mir, L M; Mistry, K P; Mitani, T; Mitrevski, J; Mitsou, V A; Miucci, A; Miyagawa, P S; Mizukami, A; Mjörnmark, J U; Mlynarikova, M; Moa, T; Mochizuki, 
K; Mogg, P; Mohapatra, S; Molander, S; Moles-Valls, R; Monden, R; Mondragon, M C; Mönig, K; Monk, J; Monnier, E; Montalbano, A; Montejo Berlingen, J; Monticelli, F; Monzani, S; Moore, R W; Morange, N; Moreno, D; Moreno Llácer, M; Morettini, P; Morgenstern, S; Mori, D; Mori, T; Morii, M; Morinaga, M; Morisbak, V; Moritz, S; Morley, A K; Mornacchi, G; Morris, J D; Mortensen, S S; Morvaj, L; Moschovakos, P; Mosidze, M; Moss, H J; Moss, J; Motohashi, K; Mount, R; Mountricha, E; Moyse, E J W; Muanza, S; Mudd, R D; Mueller, F; Mueller, J; Mueller, R S P; Mueller, T; Muenstermann, D; Mullen, P; Mullier, G A; Munoz Sanchez, F J; Murillo Quijada, J A; Murray, W J; Musheghyan, H; Muškinja, M; Myagkov, A G; Myska, M; Nachman, B P; Nackenhorst, O; Nagai, K; Nagai, R; Nagano, K; Nagasaka, Y; Nagata, K; Nagel, M; Nagy, E; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Naranjo Garcia, R F; Narayan, R; Narrias Villar, D I; Naryshkin, I; Naumann, T; Navarro, G; Nayyar, R; Neal, H A; Nechaeva, P Yu; Neep, T J; Negri, A; Negrini, M; Nektarijevic, S; Nellist, C; Nelson, A; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neumann, M; Neves, R M; Nevski, P; Newman, P R; Nguyen, D H; Nguyen Manh, T; Nickerson, R B; Nicolaidou, R; Nielsen, J; Nikiforov, A; Nikolaenko, V; Nikolic-Audit, I; Nikolopoulos, K; Nilsen, J K; Nilsson, P; Ninomiya, Y; Nisati, A; Nisius, R; Nobe, T; Nomachi, M; Nomidis, I; Nooney, T; Norberg, S; Nordberg, M; Norjoharuddeen, N; Novgorodova, O; Nowak, S; Nozaki, M; Nozka, L; Ntekas, K; Nurse, E; Nuti, F; O'grady, F; O'Neil, D C; O'Rourke, A A; O'Shea, V; Oakham, F G; Oberlack, H; Obermann, T; Ocariz, J; Ochi, A; Ochoa, I; Ochoa-Ricoux, J P; Oda, S; Odaka, S; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohman, H; Oide, H; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Oleiro Seabra, L F; Olivares Pino, S A; Damazio, D Oliveira; Olszewski, A; Olszowska, J; Onofre, A; Onogi, K; Onyisi, P U E; Oreglia, M J; Oren, Y; Orestano, D; Orlando, N; Orr, R S; 
Osculati, B; Ospanov, R; Otero Y Garzon, G; Otono, H; Ouchrif, M; Ould-Saada, F; Ouraou, A; Oussoren, K P; Ouyang, Q; Owen, M; Owen, R E; Ozcan, V E; Ozturk, N; Pachal, K; Pacheco Pages, A; Pacheco Rodriguez, L; Padilla Aranda, C; Pagáčová, M; Pagan Griso, S; Paganini, M; Paige, F; Pais, P; Pajchel, K; Palacino, G; Palazzo, S; Palestini, S; Palka, M; Pallin, D; St Panagiotopoulou, E; Panagoulias, I; Pandini, C E; Panduro Vazquez, J G; Pani, P; Panitkin, S; Pantea, D; Paolozzi, L; Papadopoulou, Th D; Papageorgiou, K; Paramonov, A; Paredes Hernandez, D; Parker, A J; Parker, M A; Parker, K A; Parodi, F; Parsons, J A; Parzefall, U; Pascuzzi, V R; Pasqualucci, E; Passaggio, S; Pastore, Fr; Pásztor, G; Pataraia, S; Pater, J R; Pauly, T; Pearce, J; Pearson, B; Pedersen, L E; Pedersen, M; Pedraza Lopez, S; Pedro, R; Peleganchuk, S V; Penc, O; Peng, C; Peng, H; Penwell, J; Peralva, B S; Perego, M M; Perepelitsa, D V; Perez Codina, E; Perini, L; Pernegger, H; Perrella, S; Peschke, R; Peshekhonov, V D; Peters, K; Peters, R F Y; Petersen, B A; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petroff, P; Petrolo, E; Petrov, M; Petrucci, F; Pettersson, N E; Peyaud, A; Pezoa, R; Phillips, P W; Piacquadio, G; Pianori, E; Picazio, A; Piccaro, E; Piccinini, M; Pickering, M A; Piegaia, R; Pilcher, J E; Pilkington, A D; Pin, A W J; Pinamonti, M; Pinfold, J L; Pingel, A; Pires, S; Pirumov, H; Pitt, M; Plazak, L; Pleier, M-A; Pleskot, V; Plotnikova, E; Pluth, D; Poettgen, R; Poggioli, L; Pohl, D; Polesello, G; Poley, A; Policicchio, A; Polifka, R; Polini, A; Pollard, C S; Polychronakos, V; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Poppleton, A; Pospisil, S; Potamianos, K; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Pozdnyakov, V; Pozo Astigarraga, M E; Pralavorio, P; Pranko, A; Prell, S; Price, D; Price, L E; Primavera, M; Prince, S; Prokofiev, K; Prokoshin, F; Protopopescu, S; Proudfoot, J; Przybycien, M; Puddu, D; Purohit, M; Puzo, P; Qian, J; Qin, G; 
Qin, Y; Quadt, A; Quayle, W B; Queitsch-Maitland, M; Quilty, D; Raddum, S; Radeka, V; Radescu, V; Radhakrishnan, S K; Radloff, P; Rados, P; Ragusa, F; Rahal, G; Raine, J A; Rajagopalan, S; Rammensee, M; Rangel-Smith, C; Ratti, M G; Rauch, D M; Rauscher, F; Rave, S; Ravenscroft, T; Ravinovich, I; Raymond, M; Read, A L; Readioff, N P; Reale, M; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reed, R G; Reeves, K; Rehnisch, L; Reichert, J; Reiss, A; Rembser, C; Ren, H; Rescigno, M; Resconi, S; Rezanova, O L; Reznicek, P; Rezvani, R; Richter, R; Richter, S; Richter-Was, E; Ricken, O; Ridel, M; Rieck, P; Riegel, C J; Rieger, J; Rifki, O; Rijssenbeek, M; Rimoldi, A; Rimoldi, M; Rinaldi, L; Ristić, B; Ritsch, E; Riu, I; Rizatdinova, F; Rizvi, E; Rizzi, C; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robson, A; Roda, C; Rodina, Y; Rodriguez Perez, A; Rodriguez Rodriguez, D; Roe, S; Rogan, C S; Røhne, O; Roloff, J; Romaniouk, A; Romano, M; Romano Saez, S M; Romero Adam, E; Rompotis, N; Ronzani, M; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, P; Rosien, N-A; Rossetti, V; Rossi, E; Rossi, L P; Rosten, J H N; Rosten, R; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Rozanov, A; Rozen, Y; Ruan, X; Rubbo, F; Rudolph, M S; Rühr, F; Ruiz-Martinez, A; Rurikova, Z; Rusakovich, N A; Ruschke, A; Russell, H L; Rutherfoord, J P; Ruthmann, N; Ryabov, Y F; Rybar, M; Rybkin, G; Ryu, S; Ryzhov, A; Rzehorz, G F; Saavedra, A F; Sabato, G; Sacerdoti, S; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Saha, P; Sahinsoy, M; Saimpert, M; Saito, T; Sakamoto, H; Sakurai, Y; Salamanna, G; Salamon, A; Salazar Loyola, J E; Salek, D; Sales De Bruin, P H; Salihagic, D; Salnikov, A; Salt, J; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sammel, D; Sampsonidis, D; Sánchez, J; Sanchez Martinez, V; Sanchez Pineda, A; Sandaker, H; Sandbach, R L; Sandhoff, M; Sandoval, C; Sankey, D P C; Sannino, M; Sansoni, A; Santoni, C; Santonico, R; Santos, H; Santoyo Castillo, I; 
Sapp, K; Sapronov, A; Saraiva, J G; Sarrazin, B; Sasaki, O; Sato, K; Sauvan, E; Savage, G; Savard, P; Savic, N; Sawyer, C; Sawyer, L; Saxon, J; Sbarra, C; Sbrizzi, A; Scanlon, T; Scannicchio, D A; Scarcella, M; Scarfone, V; Schaarschmidt, J; Schacht, P; Schachtner, B M; Schaefer, D; Schaefer, L; Schaefer, R; Schaeffer, J; Schaepe, S; Schaetzel, S; Schäfer, U; Schaffer, A C; Schaile, D; Schamberger, R D; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Schiavi, C; Schier, S; Schillo, C; Schioppa, M; Schlenker, S; Schmidt-Sommerfeld, K R; Schmieden, K; Schmitt, C; Schmitt, S; Schmitz, S; Schneider, B; Schnoor, U; Schoeffel, L; Schoening, A; Schoenrock, B D; Schopf, E; Schott, M; Schouwenberg, J F P; Schovancova, J; Schramm, S; Schreyer, M; Schuh, N; Schulte, A; Schultens, M J; Schultz-Coulon, H-C; Schulz, H; Schumacher, M; Schumm, B A; Schune, Ph; Schwartzman, A; Schwarz, T A; Schweiger, H; Schwemling, Ph; Schwienhorst, R; Schwindling, J; Schwindt, T; Sciolla, G; Scuri, F; Scutti, F; Searcy, J; Seema, P; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Sekhon, K; Sekula, S J; Seliverstov, D M; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Sessa, M; Seuster, R; Severini, H; Sfiligoj, T; Sforza, F; Sfyrla, A; Shabalina, E; Shaikh, N W; Shan, L Y; Shang, R; Shank, J T; Shapiro, M; Shatalov, P B; Shaw, K; Shaw, S M; Shcherbakova, A; Shehu, C Y; Sherwood, P; Shi, L; Shimizu, S; Shimmin, C O; Shimojima, M; Shirabe, S; Shiyakova, M; Shmeleva, A; Shoaleh Saadi, D; Shochet, M J; Shojaii, S; Shope, D R; Shrestha, S; Shulga, E; Shupe, M A; Sicho, P; Sickles, A M; Sidebo, P E; Sideras Haddad, E; Sidiropoulou, O; Sidorov, D; Sidoti, A; Siegert, F; Sijacki, Dj; Silva, J; Silverstein, S B; Simak, V; Simic, Lj; Simion, S; Simioni, E; Simmons, B; Simon, D; Simon, M; Sinervo, P; Sinev, N B; Sioli, M; Siragusa, G; Sivoklokov, S Yu; Sjölin, J; Skinner, M B; Skottowe, H P; Skubic, P; Slater, M; Slavicek, T; Slawinska, M; Sliwa, K; Slovak, R; Smakhtin, V; 
Smart, B H; Smestad, L; Smiesko, J; Smirnov, S Yu; Smirnov, Y; Smirnova, L N; Smirnova, O; Smith, J W; Smith, M N K; Smith, R W; Smizanska, M; Smolek, K; Snesarev, A A; Snyder, I M; Snyder, S; Sobie, R; Socher, F; Soffer, A; Soh, D A; Sokhrannyi, G; Solans Sanchez, C A; Solar, M; Soldatov, E Yu; Soldevila, U; Solodkov, A A; Soloshenko, A; Solovyanov, O V; Solovyev, V; Sommer, P; Son, H; Song, H Y; Sood, A; Sopczak, A; Sopko, V; Sorin, V; Sosa, D; Sotiropoulou, C L; Soualah, R; Soukharev, A M; South, D; Sowden, B C; Spagnolo, S; Spalla, M; Spangenberg, M; Spanò, F; Sperlich, D; Spettel, F; Spieker, T M; Spighi, R; Spigo, G; Spiller, L A; Spousta, M; St Denis, R D; Stabile, A; Stamen, R; Stamm, S; Stanecka, E; Stanek, R W; Stanescu, C; Stanescu-Bellu, M; Stanitzki, M M; Stapnes, S; Starchenko, E A; Stark, G H; Stark, J; Staroba, P; Starovoitov, P; Stärz, S; Staszewski, R; Steinberg, P; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stewart, G A; Stillings, J A; Stockton, M C; Stoebe, M; Stoicea, G; Stolte, P; Stonjek, S; Stradling, A R; Straessner, A; Stramaglia, M E; Strandberg, J; Strandberg, S; Strandlie, A; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Stroynowski, R; Strubig, A; Stucci, S A; Stugu, B; Styles, N A; Su, D; Su, J; Suchek, S; Sugaya, Y; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, S; Sun, X; Sundermann, J E; Suruliz, K; Suster, C J E; Sutton, M R; Suzuki, S; Svatos, M; Swiatlowski, M; Swift, S P; Sykora, I; Sykora, T; Ta, D; Taccini, C; Tackmann, K; Taenzer, J; Taffard, A; Tafirout, R; Taiblum, N; Takai, H; Takashima, R; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A A; Tan, K G; Tanaka, J; Tanaka, M; Tanaka, R; Tanaka, S; Tanioka, R; Tannenwald, B B; Tapia Araya, S; Tapprogge, S; Tarem, S; Tartarelli, G F; Tas, P; Tasevsky, M; Tashiro, T; Tassi, E; Tavares Delgado, A; Tayalati, Y; Taylor, A C; Taylor, G N; Taylor, P T E; Taylor, W; Teischinger, F A; Teixeira-Dias, P; Temming, K K; Temple, D; Ten Kate, H; Teng, P K; Teoh, J J; 
Tepel, F; Terada, S; Terashi, K; Terron, J; Terzo, S; Testa, M; Teuscher, R J; Theveneaux-Pelzer, T; Thomas, J P; Thomas-Wilsker, J; Thompson, P D; Thompson, A S; Thomsen, L A; Thomson, E; Tibbetts, M J; Ticse Torres, R E; Tikhomirov, V O; Tikhonov, Yu A; Timoshenko, S; Tipton, P; Tisserant, S; Todome, K; Todorov, T; Todorova-Nova, S; Tojo, J; Tokár, S; Tokushuku, K; Tolley, E; Tomlinson, L; Tomoto, M; Tompkins, L; Toms, K; Tong, B; Tornambe, P; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Trefzger, T; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Tripiana, M F; Trischuk, W; Trocmé, B; Trofymov, A; Troncon, C; Trottier-McDonald, M; Trovatelli, M; Truong, L; Trzebinski, M; Trzupek, A; Tseng, J C-L; Tsiareshka, P V; Tsipolitis, G; Tsirintanis, N; Tsiskaridze, S; Tsiskaridze, V; Tskhadadze, E G; Tsui, K M; Tsukerman, I I; Tsulaia, V; Tsuno, S; Tsybychev, D; Tu, Y; Tudorache, A; Tudorache, V; Tulbure, T T; Tuna, A N; Tupputi, S A; Turchikhin, S; Turgeman, D; Turk Cakir, I; Turra, R; Tuts, P M; Ucchielli, G; Ueda, I; Ughetto, M; Ukegawa, F; Unal, G; Undrus, A; Unel, G; Ungaro, F C; Unno, Y; Unverdorben, C; Urban, J; Urquijo, P; Urrejola, P; Usai, G; Usui, J; Vacavant, L; Vacek, V; Vachon, B; Valderanis, C; Valdes Santurio, E; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Valls Ferrer, J A; Van Den Wollenberg, W; Van Der Deijl, P C; van der Graaf, H; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vanguri, R; Vaniachine, A; Vankov, P; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vasquez, J G; Vasquez, G A; Vazeille, F; Vazquez Schroeder, T; Veatch, J; Veeraraghavan, V; Veloce, L M; Veloso, F; Veneziano, S; Ventura, A; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Vigani, L; 
Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinogradov, V B; Vittori, C; Vivarelli, I; Vlachos, S; Vlasak, M; Vogel, M; Vokac, P; Volpi, G; Volpi, M; von der Schmitt, H; von Toerne, E; Vorobel, V; Vorobev, K; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vuillermet, R; Vukotic, I; Wagner, P; Wagner, W; Wahlberg, H; Wahrmund, S; Wakabayashi, J; Walder, J; Walker, R; Walkowiak, W; Wallangen, V; Wang, C; Wang, C; Wang, F; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, W; Wanotayaroj, C; Warburton, A; Ward, C P; Wardrope, D R; Washbrook, A; Watkins, P M; Watson, A T; Watson, M F; Watts, G; Watts, S; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Weber, S A; Webster, J S; Weidberg, A R; Weinert, B; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wengler, T; Wenig, S; Wermes, N; Werner, M D; Werner, P; Wessels, M; Wetter, J; Whalen, K; Whallon, N L; Wharton, A M; White, A; White, M J; White, R; Whiteson, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wiglesworth, C; Wiik-Fuchs, L A M; Wildauer, A; Wilk, F; Wilkens, H G; Williams, H H; Williams, S; Willis, C; Willocq, S; Wilson, J A; Wingerter-Seez, I; Winklmeier, F; Winston, O J; Winter, B T; Wittgen, M; Wolf, T M H; Wolff, R; Wolter, M W; Wolters, H; Worm, S D; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wu, M; Wu, M; Wu, S L; Wu, X; Wu, Y; Wyatt, T R; Wynne, B M; Xella, S; Xi, Z; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yamaguchi, D; Yamaguchi, Y; Yamamoto, A; Yamamoto, S; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, Y; Yang, Z; Yao, W-M; Yap, Y C; Yasu, Y; Yatsenko, E; Yau Wong, K H; Ye, J; Ye, S; Yeletskikh, I; Yildirim, E; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J M; Yu, J; Yuan, L; Yuen, S P Y; Yusuff, I; Zabinski, B; Zacharis, G; Zaidan, R; Zaitsev, A M; Zakharchuk, N; Zalieckas, J; Zaman, A; Zambito, S; Zanello, L; Zanzi, D; 
Zeitnitz, C; Zeman, M; Zemla, A; Zeng, J C; Zeng, Q; Zenin, O; Ženiš, T; Zerwas, D; Zhang, D; Zhang, F; Zhang, G; Zhang, H; Zhang, J; Zhang, L; Zhang, L; Zhang, M; Zhang, R; Zhang, R; Zhang, X; Zhang, Z; Zhao, X; Zhao, Y; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, C; Zhou, L; Zhou, L; Zhou, M; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zhukov, K; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, S; Zinonos, Z; Zinser, M; Ziolkowski, M; Živković, L; Zobernig, G; Zoccoli, A; Zur Nedden, M; Zwalinski, L
2017-01-01
During 2015 the ATLAS experiment recorded 3.8 fb⁻¹ of proton-proton collision data at a centre-of-mass energy of 13 TeV. The ATLAS trigger system is a crucial component of the experiment, responsible for selecting events of interest at a recording rate of approximately 1 kHz from up to 40 MHz of collisions. This paper presents a short overview of the changes to the trigger and data acquisition systems during the first long shutdown of the LHC and shows the performance of the trigger system and its components based on the 2015 proton-proton collision data.
Performance of the ATLAS trigger system in 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaboud, M.; Aad, G.; Abbott, B.
During 2015 the ATLAS experiment recorded 3.8 fb⁻¹ of proton–proton collision data at a centre-of-mass energy of 13 TeV. The ATLAS trigger system is a crucial component of the experiment, responsible for selecting events of interest at a recording rate of approximately 1 kHz from up to 40 MHz of collisions. This paper presents a short overview of the changes to the trigger and data acquisition systems during the first long shutdown of the LHC and shows the performance of the trigger system and its components based on the 2015 proton–proton collision data.
Performance of the ATLAS trigger system in 2015
Aaboud, M.; Aad, G.; Abbott, B.; ...
2017-05-18
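The rate reduction the ATLAS abstract describes (40 MHz of collisions filtered down to roughly 1 kHz of recorded events) is, in essence, a cascade of increasingly selective filters. The sketch below illustrates that idea only; the event fields, thresholds, and two-stage structure are invented placeholders, not ATLAS trigger criteria.

```python
import random

def stage1_accept(event):
    # Coarse first stage: a hypothetical hardware-level energy threshold.
    return event["energy"] > 20.0

def stage2_accept(event):
    # Refined second stage: a hypothetical software-level selection.
    return event["energy"] > 50.0 and event["quality"] > 0.8

def run_trigger(events):
    """Apply the two stages in sequence and count survivors at each step."""
    stage1_passed = [e for e in events if stage1_accept(e)]
    recorded = [e for e in stage1_passed if stage2_accept(e)]
    return len(stage1_passed), len(recorded)

random.seed(0)
events = [{"energy": random.expovariate(1 / 10.0), "quality": random.random()}
          for _ in range(100_000)]
n_s1, n_rec = run_trigger(events)
print(f"input: {len(events)}  after stage 1: {n_s1}  recorded: {n_rec}")
```

Each stage only discards events, so the recorded rate is a strictly shrinking fraction of the input rate, mirroring the 40 MHz → 1 kHz reduction described in the abstract.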
Zanos, Stavros; Richardson, Andrew G.; Shupe, Larry; Miles, Frank P.; Fetz, Eberhard E.
2011-01-01
The Neurochip-2 is a second-generation, battery-powered device for neural recording and stimulation that is small enough to be carried in a chamber on a monkey’s head. It has three recording channels, with user-adjustable gains, filters, and sampling rates, that can be optimized for recording single-unit activity, local field potentials, electrocorticography, electromyography, arm acceleration, etc. Recorded data are stored on a removable flash memory card. The Neurochip-2 also has three separate stimulation channels. Two “programmable systems-on-chip” (PSoCs) control the data acquisition and stimulus output. The PSoCs permit flexible real-time processing of the recorded data, such as digital filtering and time-amplitude window discrimination. The PSoCs can be programmed to deliver stimulation contingent on neural events or to deliver preprogrammed stimuli. Access pins to the microcontroller are also available to connect external devices, such as accelerometers. The Neurochip-2 can record and stimulate autonomously for up to several days in freely behaving monkeys, enabling a wide range of novel neurophysiological and neuroengineering experiments. PMID:21632309
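The time-amplitude window discrimination mentioned in the Neurochip-2 abstract can be illustrated with a minimal software analogue: accept a threshold crossing only if the sample a fixed delay later falls inside an amplitude window. The threshold, delay, window bounds, and trace below are hypothetical illustration values, not Neurochip-2 firmware parameters.

```python
def discriminate(samples, threshold, delay, win_lo, win_hi):
    """Return indices of upward threshold crossings whose sample `delay`
    steps later falls inside [win_lo, win_hi] (time-amplitude window test)."""
    accepted = []
    for i in range(1, len(samples) - delay):
        crossed = samples[i - 1] < threshold <= samples[i]
        if crossed and win_lo <= samples[i + delay] <= win_hi:
            accepted.append(i)
    return accepted

# A synthetic trace with one spike-like event peaking at index 3.
trace = [0.0, 0.2, 1.1, 2.0, -1.5, -0.3, 0.0, 0.1]
hits = discriminate(trace, threshold=1.0, delay=2, win_lo=-2.0, win_hi=-1.0)
print(hits)  # → [2]
```

The two-part test (a crossing plus a delayed amplitude check) is what lets a discriminator separate spikes with a characteristic shape from noise that merely crosses the threshold.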
NASA Astrophysics Data System (ADS)
Varner, R. K.; Palace, M. W.; Lennartz, J. M.; Crill, P. M.; Wik, M.; Amante, J.; Dorich, C.; Harden, J. W.; Ewing, S. A.; Turetsky, M. R.
2011-12-01
Knowledge of the magnitude and frequency of methane release through ebullition (bubbling) in water-saturated ecosystems such as bogs, fens, and lakes is important to both the atmospheric and ecosystem science communities. The controls on episodic bubble releases must be identified in order to understand the response of these ecosystems to future climate forcing. We have developed and field-tested an inexpensive array of sampling/monitoring instruments to identify the frequency and magnitude of bubbling events, which allows us to correlate bubble data with potential drivers such as changes in hydrostatic pressure, wind, and temperature. A prototype ebullition sensor has been developed and field-tested at Sallie's Fen in New Hampshire, USA. The instrument consists of a nested, inverted funnel design with a hydrophone that detects bubbles rising through the peat as they strike the microphone. The design also offers a way to sample the gases collected from the funnels to determine the concentration of CH4. Laboratory calibration of the instrument resulted in an equation that relates the frequency of bubbles hitting the microphone to bubble volume. After calibration in the laboratory, the prototype was deployed in Sallie's Fen in late August 2010. An additional four instruments were deployed the following month. Audio data were recorded continuously using a digital audio recorder attached to two ebullition sensors. Audio was recorded as an mp3 compressed audio file at a sample rate of 160 kbits/sec. Using this format and stereo input, allowing two sensors to be recorded with each device, we were able to record continuously for 20 days. Audio was converted to uncompressed audio files for speed in computation. Audio data were processed using MATLAB, searching in 0.5 second incremental sections for specific fundamental frequencies that are related to our calibrated audio events.
Time, fundamental frequency, and estimated bubble size were output to a text file for analysis in statistical software. In addition, each event was cut out of the longer audio file and placed in a directory with the ebullition event number, sensor number, and time, allowing for manual interpretation of the ebullition event. After successful laboratory and local field testing, our instruments were deployed in summer 2011 at a temperate fen (Sallie's Fen, NH, USA), a subarctic mire and lake (Stordalen, Abisko, Sweden), and two locations in subarctic Alaska (APEX Research Site, Fairbanks, AK and Innoko National Wildlife Refuge). Ebullition occurred at regular intervals. Our results indicate that this is a useful method for monitoring CH4 ebullitive flux at high temporal frequencies.
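The windowed search the ebullition abstract describes (0.5 second sections scanned for calibrated fundamental frequencies) can be sketched as follows. This is a simplified stand-in, assuming a zero-crossing frequency estimate rather than the authors' MATLAB spectral search; the sample rate, target band, and linear frequency-to-volume factor are invented placeholders, not the laboratory calibration equation.

```python
import math

def dominant_freq(window, fs):
    """Estimate the fundamental frequency of one window from its
    positive-going zero-crossing rate (a crude spectral stand-in)."""
    crossings = sum(1 for a, b in zip(window, window[1:]) if a < 0 <= b)
    return crossings * fs / len(window)

def scan(signal, fs, win_s=0.5, band=(100.0, 300.0)):
    """Step through the recording in `win_s`-second sections and log
    (time, frequency, estimated volume) for windows in the target band."""
    n = int(win_s * fs)
    found = []
    for start in range(0, len(signal) - n + 1, n):
        f = dominant_freq(signal[start:start + n], fs)
        if band[0] <= f <= band[1]:
            # Hypothetical linear calibration: frequency -> bubble volume.
            found.append((start / fs, f, 0.05 * f))
    return found

fs = 8000
tone = [math.sin(2 * math.pi * 200 * t / fs) for t in range(fs)]  # 1 s, 200 Hz
detections = scan(tone, fs)
print(detections)
```

A real deployment would replace the zero-crossing estimate with an FFT peak search and the placeholder calibration with the instrument's measured frequency-to-volume relation.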
R2R Eventlogger: Community-wide Recording of Oceanographic Cruise Science Events
NASA Astrophysics Data System (ADS)
Maffei, A. R.; Chandler, C. L.; Stolp, L.; Lerner, S.; Avery, J.; Thiel, T.
2012-12-01
Methods used by researchers to track science events during a research cruise - and to note when and where these occur - vary widely. Handwritten notebooks, printed forms, watch-keeper logbooks, data-logging software, and customized software have all been employed. The quality of scientific results is affected by the consistency and care with which such events are recorded, and the integration of multi-cruise results is hampered because recording methods vary from cruise to cruise. The Rolling Deck to Repository (R2R) program has developed an Eventlogger system that will eventually be deployed on most vessels in the academic research fleet. It is based on the open software package ELOG (http://midas.psi.ch/elog/), originally authored by Stefan Ritt and enhanced by our team. Lessons have been learned during its development and use on several research cruises. We have worked hard to find approaches that encourage cruise participants to use tools like the eventlogger. We examine these lessons and several eventlogger datasets from past cruises. We further describe how the R2R Science Eventlogger works in concert with the other R2R program elements to help organize research vessels into a coordinated mobile observing fleet. Making use of data collected on different research cruises is enabled by adopting common ways of describing science events, the science instruments employed, the data collected, etc. The use of controlled vocabularies, and the practice of mapping these local vocabularies to accepted oceanographic community vocabularies, helps to bind shipboard research events from different cruises into a more cohesive set of fleet-wide events that can be queried and examined in a cross-cruise manner. Examples of the use of the eventlogger during multi-cruise oceanographic research programs, along with examples of resultant eventlogger data, will be presented.
Additionally, we will highlight the importance of vocabulary-use strategies to the successful adoption of the Eventlogger by the research community. The R2R Science Eventlogger runs on a dedicated "pluggable" Linux computer installed on each research vessel's network. This R2R web server has been designed so that it can be extended to support future R2R services. Best-practice documents supporting increased consistency in underway instrument data collection, quality assessment of underway instrument data, and other useful capabilities made available on this common shipboard server platform will begin to provide a common set of web services and science software tools for the "fleet observatory". Screenshot: the customizable science event entry form that is part of the R2R Science Eventlogger software package; latitude and longitude are automatically added at the time the entry is made.
77 FR 74144 - Federal Motor Vehicle Safety Standards; Event Data Recorders
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... submitted to NHTSA through one of the preceding methods and a copy should also be sent to the Office of... and Crash Test Performance Requirements D. NHTSA's Validation of and Reliance on EDR Data in Its Crash... for the purpose of post-crash assessment of vehicle safety system performance.\\1\\ EDR data are used to...
Code of Federal Regulations, 2010 CFR
2010-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
SEDIMENT DATA - COMMENCEMENT BAY HYLEBOS WATERWAY - TACOMA, WA - PRE-REMEDIAL DESIGN PROGRAM
Event 1A/1B Data Files URL address: http://www.epa.gov/r10earth/datalib/superfund/hybos1ab.htm. Sediment Chemistry Data (Database Format): HYBOS1AB.EXE is a self-extracting file which expands to the single-value per record .DBF format database file HYBOS1AB.DBF. This file contai...
Code of Federal Regulations, 2011 CFR
2011-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2014 CFR
2014-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2012 CFR
2012-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Code of Federal Regulations, 2013 CFR
2013-10-01
... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...
Biodiversity: past, present, and future
NASA Technical Reports Server (NTRS)
Sepkoski, J. J., Jr. (Principal Investigator)
1997-01-01
Data from the fossil record are used to illustrate biodiversity in the past and to estimate modern biodiversity and loss. These data are used to compare current rates of extinction with past extinction events. Paleontologists are encouraged to use these data to understand the course and consequences of current losses and to share this knowledge with researchers interested in conservation and ecology.
NASA Astrophysics Data System (ADS)
Strasser, Michael; Kopf, Achim; Kanamatsu, Toshiya; Moernaut, Jasper; Ikehara, Ken; McHugh, Cecilia
2017-04-01
Our perspective of subduction zone earthquake magnitude and recurrence is limited by short historical records. Examining prehistoric extreme events preserved in the geological record is essential for understanding large earthquakes and assessing the geohazard potential associated with such rare events. The research field of "subaquatic paleoseismology" is a promising approach to investigating deposits from the deep sea, where earthquakes leave traces preserved in the stratigraphic succession. At present, however, we lack comprehensive data sets that allow conclusive distinctions between the quality and completeness of paleoseismic archives as they relate to different sediment transport, erosion, and deposition processes versus the variability of intrinsic seismogenic behavior across different segments. Building on knowledge of the sedimentary deposits generated by the 2011 magnitude 9 Tohoku-oki earthquake, the Japan Trench is a promising study area in which to investigate earthquake-triggered sediment remobilization processes and how they become embedded in the stratigraphic record. Here we present new results from the recent R/V Sonne expedition SO251, which acquired a complete high-resolution bathymetric map of the trench axis and nearly 2000 km of subbottom Parasound profiles, covering the entire along-strike extent of the Japan Trench from 36° to 40.3° N, groundtruthed by several nearly 10 m long piston cores retrieved from very deep waters (7 to 8 km below sea level). Several smaller submarine landslides (up to several hundreds of meters in lateral extent) can be identified along the trench axis in the new bathymetric data set. These features were either not yet present, or not resolved, in the lower-resolution bathymetric dataset acquired before 2011.
Sub-bottom acoustic reflection data reveal striking, up to several meter thick, acoustically transparent bodies interbedded in the otherwise parallel reflection pattern of the trench-fill basins, providing a temporal and spatial inventory of major sediment remobilization events along the Japan Trench, with potential quantitative constraints on the volumes and mass fluxes of material mobilized during each event. The cores from the southern and northern parts of the Japan Trench also confirm previous findings from the central part near the Tohoku-oki epicenter: the small deep-sea trench-fill basins, which are associated with very high sedimentation rates, contain repeated thick turbidite sequences to be further tested for correlation with historic earthquakes. Eventually, the results of Cruise SO251 will be integrated with cores and data from various other cruises to provide a solid base for later long-coring efforts and scientific drilling, as proposed within the IODP JTRACK initiative, toward potentially producing a record that unravels an earthquake history 10 to 100 times longer than currently available information.
Effects of ENSO on weather-type frequencies and properties at New Orleans, Louisiana, USA
McCabe, G.J.; Muller, R.A.
2002-01-01
Examination of historical climate records indicates a significant relation between the El Niño/Southern Oscillation (ENSO) and seasonal temperature and precipitation in Louisiana. In this study, a 40 yr record of twice-daily (06:00 and 15:00 h local time) weather types is used to study the effects of ENSO variability on the local climate at New Orleans, Louisiana. Tropical Pacific sea-surface temperatures (SSTs) for the NINO3.4 region are used to define ENSO events (i.e. El Niño and La Niña events), and daily precipitation and temperature data for New Orleans are used to define weather-type precipitation and temperature properties. Data for winters (December through February) 1962-2000 are analyzed. The 39 winters are divided into 3 categories: winters with NINO3.4 SST anomalies below -1°C (La Niña events), winters with anomalies above 1°C (El Niño events), and neutral conditions (all other years). For each category, weather-type frequencies and properties (i.e. precipitation and temperature) are determined and analyzed. Results indicate that El Niño events primarily affect precipitation characteristics of weather types at New Orleans, whereas the effects of La Niña events are most apparent in weather-type frequencies. During El Niño events, precipitation for some of the weather types is greater than during neutral and La Niña conditions and is related to increased water vapor transport from the Tropics to the Gulf of Mexico. The changes in weather-type frequencies during La Niña events are indicative of a northward shift in storm tracks and/or a decrease in storm frequency in southern Louisiana.
Characterization of the San Andreas Fault at Parkfield Using a Massive 3D VSP
NASA Astrophysics Data System (ADS)
Chavarria, J.; Goertz, A.; Karrenbach, M.; Milligan, P.; Paulsson, B.
2005-12-01
In preparation for the drilling of SAFOD's Phase II, we installed an 80-level array of 3C seismometers inside the well. The goals of the array were to refine the existing velocity model, to better locate the target events, and to monitor the local seismicity. The array, with sensors lying mostly within the deviated portion of the well, spans depths from 2.7 to 1.5 km, with levels every 15 m. It is this dense spacing that makes 3D VSP capable of bridging the gap between drill-hole observations and observations from the surface, such as 2D seismics. During April and May 2005, we recorded thirteen far-offset shots surrounding the SAFOD site and target event area. Data from these shots were simultaneously recorded by the surface networks and used for better location of the target events. In addition, a zero-offset shot at SAFOD was generated to refine the structure surrounding the well. The 1D velocity model inverted from the zero-offset shot is representative of the current geologic model at SAFOD. The complexity of the velocity model for this segment of the fault can be inferred from deviations between the zero-offset model and the shorter-wavelength model derived from well logs. In addition to strong changes in velocity, both zero-offset and far-offset shots show the presence of strong scattered phases associated with the complex geologic structure of the San Andreas Fault Zone. In addition to the active portion of the experiment, we monitored the local seismicity (i.e. aftershocks of the Parkfield 2004 event) over a period of 13 days. During this period we continuously recorded, at high sampling rates (4 kHz), a large number of events, some of which were located by the surface networks and felt onsite. The quiet environment in the borehole enabled us to record microearthquakes that were not present in the NCEDC catalog. In some cases these small events were not even recorded along the entire array.
Besides its high level of event detection, the high vector fidelity of the 3C geophones allowed for precise particle motion analysis of first arrivals to determine the location of microearthquakes recorded during this effort.
[Natural disasters and health: an analysis of the situation in Brazil].
Freitas, Carlos Machado de; Silva, Diego Ricardo Xavier; Sena, Aderita Ricarda Martins de; Silva, Eliane Lima; Sales, Luiz Belino Ferreira; Carvalho, Mauren Lopes de; Mazoto, Maíra Lopes; Barcellos, Christovam; Costa, André Monteiro; Oliveira, Mara Lúcia Carneiro; Corvalán, Carlos
2014-09-01
Natural disasters are still insufficiently studied and understood within the scope of public health in this country, with impacts in the short and long term. The scope of this article is to analyze the relationship between disasters and their impact on health based on disaster data recorded in the country. The methodology involved the systematization of data and information contained in the Brazilian Atlas of Natural Disasters 1991-2010 and directly from the National Department of Civil Defense (NSCD). Disasters were organized into four categories of events (meteorological; hydrological; climatological; geophysical/geological) and for each of the latter, the data for morbidity, mortality and exposure of those affected were examined, revealing different types of impacts. Three categories of disasters stood out: the hydrological events showed higher percentages of mortality, morbidity and exposure; climatological events had higher percentages of incidents and people affected; the geophysical/geological events had a higher average of exposure and deaths per event. Lastly, a more active participation of the health sector in the post-2015 global political agenda is proposed, particularly events related to sustainable development, climate change and disaster risk reduction.
LHCb trigger streams optimization
NASA Astrophysics Data System (ADS)
Derkach, D.; Kazeev, N.; Neychev, R.; Panin, A.; Trofimov, I.; Ustyuzhanin, A.; Vesterinen, M.
2017-10-01
The LHCb experiment stores around 10^11 collision events per year. A typical physics analysis deals with a final sample of up to 10^7 events. Event preselection algorithms (lines) are used for data reduction. Since the data are stored in a format that requires sequential access, the lines are grouped into several output file streams, in order to increase the efficiency of user analysis jobs that read these data. The scheme's efficiency heavily depends on the stream composition. By putting similar lines together and balancing the stream sizes, it is possible to reduce the overhead. We present a method for finding an optimal stream composition. The method is applied to a part of the LHCb data (Turbo stream) at the stage where it is prepared for user physics analysis. This results in an expected improvement of 15% in the speed of user analysis jobs, and will be applied to data recorded in 2017.
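The grouping idea can be illustrated with a toy sketch (this is not the LHCb optimization itself): model each preselection line as the set of events it selects, then greedily assign each line to the stream whose event union grows the least. This both clusters overlapping lines together and keeps stream sizes balanced. All names and data below are hypothetical:

```python
def compose_streams(lines, n_streams):
    """Greedy sketch of stream composition: `lines` maps line name ->
    set of selected event IDs. Each line is assigned to the stream
    whose event union grows least, so similar lines end up together
    and total stored volume (sum of union sizes) stays small."""
    streams = [set() for _ in range(n_streams)]   # event unions
    members = [[] for _ in range(n_streams)]      # line names per stream
    # Place large lines first so they seed the streams.
    for name, events in sorted(lines.items(), key=lambda kv: -len(kv[1])):
        growth = [len(events - s) for s in streams]
        # Break ties toward the currently smaller stream.
        best = min(range(n_streams), key=lambda i: (growth[i], len(streams[i])))
        streams[best] |= events
        members[best].append(name)
    return members, streams
```

A real optimizer would also weight lines by event size and access frequency; this sketch only captures the "similar lines together, balanced sizes" heuristic named in the abstract.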
Analyzing Flash Flood Data in an Ultra-Urban Region
NASA Astrophysics Data System (ADS)
Smith, B. K.; Rodriguez, S.
2016-12-01
New York City is an ultra-urban region, with combined sewers and buried stream channels. Traditional flood studies rely on the presence of stream gages to detect flood stage and discharge, but ultra-urban regions frequently lack the surface stream channels and gages necessary for this approach. In this study we aggregate multiple non-traditional data sources for detecting flash flood events, including phone call reports, city records, and, for one particular flood event, news reports and social media reports. These data are compared with high-resolution bias-corrected radar rainfall fields to study flash flood events in New York City. We seek to determine whether these non-traditional data will allow for a comprehensive study of rainfall-runoff relationships in New York City. We also seek to map warm-season rainfall heterogeneities in the city and to compare them to the spatial distribution of reported flood occurrence.
Simons-Morton, Bruce G; Bingham, C Raymond; Ouimet, Marie Claude; Pradhan, Anuj K; Chen, Rusan; Barretto, Andrea; Shope, Jean T
2013-07-01
Teenage risky driving may be due to teenagers not knowing what is risky, preferring risk, or the lack of consequences. Elevated gravitational-force (g-force) events, caused mainly by hard braking and sharp turns, provide a valid measure of risky driving and are the target of interventions using in-vehicle data recording and feedback devices. The effect of two forms of feedback about risky driving events to teenagers only or to teenagers and their parents was tested in a randomized controlled trial. Ninety parent-teen dyads were randomized to one of two groups: (1) immediate feedback to teens (Lights Only); or (2) immediate feedback to teens plus family access to event videos and ranking of the teen relative to other teenage drivers (Lights Plus). Participants' vehicles were instrumented with data recording devices and events exceeding .5 g were assessed for 2 weeks of baseline and 13 weeks of feedback. Growth curve analysis with random slopes yielded a significant decrease in event rates for the Lights Plus group (slope = -.11, p < .01), but no change for the Lights Only group (slope = .05, p = .67) across the 15 weeks. A large effect size of 1.67 favored the Lights Plus group. Provision of feedback with possible consequences associated with parents being informed reduced risky driving, whereas immediate feedback only to teenagers did not. Published by Elsevier Inc.
Cruz, Miguel A; Garcia, Stephanie; Chowdhury, Muhammad A B; Malilay, Josephine; Perea, Nancy; Williams, O Dale
Disaster shelter assessments are environmental health assessments conducted during disaster situations to evaluate the living environment of shelters for hygiene, sanitation, and safety conditions. We conducted a secondary data analysis of available shelter assessment records (n = 108) on ice storms, floods, and tornado events from 1 state jurisdiction. Descriptive statistics were used to analyze the environmental health deficiencies found in the facilities. The greatest numbers of environmental health deficiencies were associated with sanitation (26%), facility physical issues (19%), and food areas (17%). Most deficiencies were reported following ice storms, tornadoes, and flood events. This report describes the first analysis of environmental health deficiencies found in disaster shelters across a spectrum of disaster events. Although the number of records analyzed for this project was small and the results may not be generalizable, this new insight into the living environment in shelter facilities offers the first analysis of deficiencies of the shelter operation and living environment, which have great potential to affect the safety and health of shelter occupants.
NASA Astrophysics Data System (ADS)
Brázdil, R.; Chromá, K.; Řezníčková, L.; Valášek, H.; Dolák, L.; Stachoň, Z.; Soukalová, E.; Dobrovolný, P.
2014-07-01
Since the second half of the 17th century, tax relief has been available to farmers and landowners to offset flood damage to property (buildings) and land (fields, meadows, pastures, gardens) in South Moravia, Czech Republic. Historically, the written applications for this were supported by a relatively efficient bureaucratic process that left a clear data trail of documentation, preserved at several levels: in the communities affected, in regional offices, and in the Moravian Land Office, all of which are to be found in estate and family collections in the Moravian Land Archives in the city of Brno, the provincial capital. As well as detailed information about damage done and administrative responses to it, data is often preserved as to the flood event itself, the time of its occurrence and its impacts, sometimes together with causes and stages. The final flood database based on taxation records is used here to describe the temporal and spatial density of both flood events and the records themselves. The information derived is used to help create long-term flood chronologies for the Rivers Dyje, Jihlava, Svratka and Morava, combining floods interpreted from taxation records with other documentary data and floods derived from later systematic hydrological measurements (water levels, discharges). Common periods of higher flood frequency appear largely in 1821-1850 and 1921-1950, although this shifts to several other decades for individual rivers. Certain uncertainties are inseparable from flood data taxation records: their spatial and temporal incompleteness; the inevitable limitation to larger-scale damage and to the summer half-year; and the different characters of rivers, including land-use changes and channel modifications. Taxation data has great potential for extending our knowledge of past floods for the rest of the Czech Republic as well, not to mention other European countries in which records have survived.
The use of taxation records in assessing historical floods in South Moravia, Czech Republic
NASA Astrophysics Data System (ADS)
Brázdil, R.; Chromá, K.; Řezníčková, L.; Valášek, H.; Dolák, L.; Stachoň, Z.; Soukalová, E.; Dobrovolný, P.
2014-10-01
Since the second half of the 17th century, tax relief has been available to farmers and landowners to offset flood damage to property (buildings) and land (fields, meadows, pastures, gardens) in South Moravia, Czech Republic. Historically, the written applications for this were supported by a relatively efficient bureaucratic process that left a clear data trail of documentation, preserved at several levels: in the communities affected, in regional offices, and in the Moravian Land Office, all of which are to be found in estate and family collections in the Moravian Land Archives in the city of Brno, the provincial capital. As well as detailed information about damage done and administrative responses to it, data are often preserved as to the flood event itself, the time of its occurrence and its impacts, sometimes together with causes and stages. The final flood database based on taxation records is used here to describe the temporal and spatial density of both flood events and the records themselves. The information derived is used to help create long-term flood chronologies for the rivers Dyje, Jihlava, Svratka and Morava, combining floods interpreted from taxation records with other documentary data and floods derived from later systematic hydrological measurements (water levels, discharges). Common periods of higher flood frequency appear largely in the periods 1821-1850 and 1921-1950, although this shifts to several other decades for individual rivers. A number of uncertainties are inseparable from flood data taxation records: their spatial and temporal incompleteness; the inevitable limitation to larger-scale damage and restriction to the summer half-year; and the different characters of rivers, including land-use changes and channel modifications. Taxation data have considerable potential for extending our knowledge of past floods for the rest of the Czech Republic, not to mention other European countries in which records have survived.
MyShake: Smartphone-based detection and analysis of Oklahoma earthquakes
NASA Astrophysics Data System (ADS)
Kong, Q.; Allen, R. M.; Schreier, L.
2016-12-01
MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing (myshake.berkeley.edu). It uses the accelerometer data from phones to detect earthquake-like motion, and then uploads triggers and waveform data to a server for aggregation of the results. Since the public release in February 2016, more than 200,000 Android-phone owners have installed the app, and the global network has recorded more than 300 earthquakes. In Oklahoma, there are about 200 active users each day, providing enough data for the network to detect earthquakes and for us to perform analysis of the events. MyShake has recorded waveform data for M2.6 to M5.8 earthquakes in the state. For the September 3, 2016, M5.8 earthquake, 14 phones detected the event, and we can use the waveforms to determine event characteristics. MyShake data provide a location 3.95 km from the ANSS location and a magnitude of 5.7. We can also use MyShake data to estimate a stress drop of 7.4 MPa. MyShake is still a rapidly expanding network that has the ability to grow by thousands of stations/phones in a matter of hours as public interest increases. These initial results suggest that the data will be useful for a variety of scientific studies of induced seismicity phenomena in Oklahoma, as well as having the potential to provide earthquake early warning in the future.
A fixed mass method for the Kramers-Moyal expansion--application to time series with outliers.
Petelczyc, M; Żebrowski, J J; Orłowska-Baranowska, E
2015-03-01
Extraction of stochastic and deterministic components from empirical data - necessary for the reconstruction of the dynamics of the system - is discussed. We determine both components using the Kramers-Moyal expansion. In our earlier papers, we obtained large fluctuations in the magnitude of both terms for rare or extreme-valued events in the data. Calculations for such events are burdened by an unsatisfactory quality of the statistics. In general, the method is sensitive to the binning procedure applied in the construction of histograms. Instead of the commonly used constant bin width, we use here a constant number of counts for each bin. This approach - the fixed mass method - allows events that do not yield satisfactory statistics in the fixed-bin-width method to be included in the calculation. The method developed is general. To demonstrate its properties, we present the modified Kramers-Moyal expansion method and discuss its properties by applying the fixed mass method to four representative heart rate variability recordings with different numbers of ectopic beats. These beats may be rare events as well as outliers, i.e., very small or very large heart cycle lengths. The properties of ectopic beats are important not only for medical diagnostic purposes; the occurrence of ectopic beats is a general example of the kind of variability that occurs in a signal with outliers. To show that the method is general, we also present results for two examples of data from very different areas of science: daily temperatures in a large European city and recordings of traffic on a highway. Using the fixed mass method, we studied the occurrence of higher-order terms of the Kramers-Moyal expansion in the recordings to assess the dynamics leading to the outlying events. We found that the higher-order terms of the Kramers-Moyal expansion are negligible for heart rate variability.
This finding opens the possibility of applying the Langevin equation to the whole range of empirical signals containing rare or outlying events. Note, however, that the higher-order terms are non-negligible for the other data studied here, for which the Langevin equation is therefore not applicable as a model.
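The fixed mass idea itself is simple to state: instead of equal-width bins, choose bin edges at quantiles of the data, so each bin holds (approximately) the same number of observations; conditional drift and diffusion moments are then estimated per bin. A minimal NumPy sketch of the binning step, not the authors' implementation:

```python
import numpy as np

def fixed_mass_bins(data, n_bins):
    """Equal-count ('fixed mass') binning: edges are placed at quantiles
    so every bin holds roughly the same number of observations. Outlying
    values get wide bins that still contain enough points for reliable
    conditional statistics, unlike equal-width bins."""
    x = np.asarray(data, dtype=float)
    return np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
```

With these edges, rare large or small values fall into wide outer bins whose occupancy matches the well-sampled interior, which is exactly the property that stabilizes the Kramers-Moyal coefficient estimates for events with poor statistics.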
Mach-Zehnder interferometer-based recording system for WACO
NASA Astrophysics Data System (ADS)
Woerner, R.
1988-06-01
EG&G Energy Measurements, Inc., Los Alamos Operations (LAO) designed and built a Mach-Zehnder-interferometer-based recording system to record low-bandwidth pulses. This work was undertaken at the request of the Los Alamos National Laboratory P-14 Fast Transient Plasma Measurement group. The system was fielded on WACO and its performance compared with that of a conventional recording system fielded on the same event. The results of the fielding showed that for low-bandwidth applications like the WACO experiment, the M-Z-based system provides the same data quality and dynamic range as the conventional oscilloscope system, but it is far less complex and uses fewer recorders.
NASA Astrophysics Data System (ADS)
Piecuch, C. G.; Huybers, P. J.; Tingley, M.
2015-12-01
Tide gauge records of mean sea level are some of the most valuable instrumental time series of oceanic variability and change. Yet these time series sometimes have short record lengths and intermittently missing values. Such issues can limit the utility of the data, for example, precluding rigorous analyses of return periods of extreme mean sea level events and whether they are unprecedented. With a view to filling gaps in the tide gauge mean sea level time series, we describe a hierarchical Bayesian modeling approach. The model, which is predicated on the notion of conditional probabilities, comprises three levels: a process level, which casts mean sea level as a field with spatiotemporal covariance; a data level, which represents tide gauge observations as noisy, biased versions of the true process; and a prior level, which gives prior functional forms to model parameters. Using Bayes' rule, this technique gives estimates of the posterior probability of the process and the parameters given the observations. To demonstrate the approach, we apply it to 2,967 station-years of annual mean sea level observations over 1856-2013 from 70 tide gauges along the United States East Coast from Florida to Maine (i.e., 26.8% record completeness). The model overcomes the data paucity by sharing information across space and time. The result is an ensemble of realizations, each member of which is a possible history of sea level changes at these locations over this period, which is consistent with and equally likely given the tide gauge data and underlying model assumptions. Using the ensemble of histories furnished by the Bayesian model, we identify extreme events of mean sea level change in the tide gauge time series. 
Specifically, we use the model to address the particular hypothesis (with rigorous uncertainty quantification) that a recently reported interannual sea level rise during 2008-2010 was unprecedented in the instrumental record along the northeast coast of North America, and that it had a return period of 850 years. Preliminary analysis suggests that this event was likely unprecedented on the coast of Maine in the last century.
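The "sharing information across space and time" at the heart of the hierarchical model can be illustrated, in a deliberately stripped-down form, as conditioning a jointly Gaussian field on its noisy observed entries; the full model additionally infers covariance, bias, and noise parameters from the data. Everything below is a hypothetical sketch, not the authors' model:

```python
import numpy as np

def fill_gaps(obs, mask, cov, noise_var):
    """Posterior mean of a zero-mean jointly Gaussian field given noisy
    observations at the entries where `mask` is True. Missing entries
    are filled by borrowing strength from correlated observed entries:
    mean = C[:, obs] (C[obs, obs] + noise I)^-1 y_obs."""
    o = np.where(mask)[0]
    C_oo = cov[np.ix_(o, o)] + noise_var * np.eye(len(o))
    C_ao = cov[:, o]
    return C_ao @ np.linalg.solve(C_oo, obs[o])
```

With strong spatial correlation, a gap in one tide gauge record is filled toward the values of its well-observed neighbors, which is the mechanism that lets the model produce complete sea level histories from 26.8% record completeness.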
Concorde noise-induced building vibrations, John F. Kennedy International Airport
NASA Technical Reports Server (NTRS)
Mayes, W. H.; Deloach, R.; Stephens, D. G.; Cawthorn, J. M.; Holmes, H. K.; Lewis, R. B.; Holliday, B. G.; Miller, W. T.; Ward, D. W.
1978-01-01
The outdoor/indoor noise levels and associated vibration levels resulting from aircraft and nonaircraft events were recorded at eight homesites and a school. In addition, limited subjective tests were conducted to examine the human detection/annoyance thresholds for building vibration and rattle caused by aircraft noise. Presented herein are the majority of the window and wall vibration data recorded during Concorde and subsonic aircraft overflights.
Cathy Whitlock; Carl N. Skinner; Patrick J. Bartlein; Thomas Minckley; Jerry A. Mohr
2004-01-01
Fire-history reconstructions are based on tree-ring records that span the last few centuries and charcoal data from lake-sediment cores that extend back several thousand years. The two approaches have unique strengths and weaknesses in their ability to depict past fire events and fire regimes, and most comparisons of these datasets in western conifer forests have...
Earley, Amy; Lau, Joseph; Uhlig, Katrin
2013-01-18
A participant death is a serious event in a clinical trial and needs to be unambiguously and publicly reported. To examine (1) how often and how numbers of deaths are reported in ClinicalTrials.gov records; (2) how often total deaths can be determined per arm within a ClinicalTrials.gov results record and its corresponding publication and (3) whether counts may be discordant. Registry-based study of clinical trial results reporting. ClinicalTrials.gov results database searched in July 2011 and matched PubMed publications. A random sample of ClinicalTrials.gov results records. Detailed review of records with a single corresponding publication. ClinicalTrials.gov records reporting number of deaths under participant flow, primary or secondary outcome or serious adverse events. Consistency in reporting of number of deaths between ClinicalTrials.gov records and corresponding publications. In 500 randomly selected ClinicalTrials.gov records, only 123 records (25%) reported a number for deaths. Reporting of deaths across data modules for participant flow, primary or secondary outcomes and serious adverse events was variable. In a sample of 27 pairs of ClinicalTrials.gov records with number of deaths and corresponding publications, total deaths per arm could only be determined in 56% (15/27 pairs) but were discordant in 19% (5/27). In 27 pairs of ClinicalTrials.gov records without any information on number of deaths, 48% (13/27) were discordant since the publications reported absence of deaths in 33% (9/27) and positive death numbers in 15% (4/27). Deaths are variably reported in ClinicalTrials.gov records. A reliable total number of deaths per arm cannot always be determined with certainty or can be discordant with number reported in corresponding trial publications. This highlights a need for unambiguous and complete reporting of the number of deaths in trial registries and publications.
The role of citizen science in monitoring small-scale pollution events.
Hyder, Kieran; Wright, Serena; Kirby, Mark; Brant, Jan
2017-07-15
Small-scale pollution events involve the release of potentially harmful substances into the marine environment. These events can affect all levels of the ecosystem, with damage to both fauna and flora. Numerous reporting structures are currently available to document spills; however, there is a lack of information on small-scale events due to their magnitude and patchy distribution. To this end, volunteers may provide a useful tool in filling this data gap, especially for coastal environments with high usage by members of the public. The potential for citizen scientists to record small-scale pollution events is explored using the UK as an example, with a focus on highlighting methods and issues associated with using this data source. An integrated monitoring system is proposed which combines citizen science and traditional reporting approaches.
A Visual Analytics Framework for Identifying Topic Drivers in Media Events.
Lu, Yafeng; Wang, Hong; Landis, Steven; Maciejewski, Ross
2017-09-14
Media data has been the subject of large scale analysis with applications of text mining being used to provide overviews of media themes and information flows. Such information extracted from media articles has also shown its contextual value of being integrated with other data, such as criminal records and stock market pricing. In this work, we explore linking textual media data with curated secondary textual data sources through user-guided semantic lexical matching for identifying relationships and data links. In this manner, critical information can be identified and used to annotate media timelines in order to provide a more detailed overview of events that may be driving media topics and frames. These linked events are further analyzed through an application of causality modeling to model temporal drivers between the data series. Such causal links are then annotated through automatic entity extraction which enables the analyst to explore persons, locations, and organizations that may be pertinent to the media topic of interest. To demonstrate the proposed framework, two media datasets and an armed conflict event dataset are explored.
Lo Re, Vincent; Carbonari, Dena M; Saine, M Elle; Newcomb, Craig W; Roy, Jason A; Liu, Qing; Wu, Qufei; Cardillo, Serena; Haynes, Kevin; Kimmel, Stephen E; Reese, Peter P; Margolis, David J; Apter, Andrea J; Reddy, K Rajender; Hennessy, Sean; Bhullar, Harshvinder; Gallagher, Arlene M; Esposito, Daina B; Strom, Brian L
2017-01-01
To evaluate the risk of serious adverse events among patients with type 2 diabetes mellitus initiating saxagliptin compared with oral antidiabetic drugs (OADs) in classes other than dipeptidyl peptidase-4 (DPP-4) inhibitors. Cohort studies using 2009-2014 data from two UK medical record data sources (Clinical Practice Research Datalink, The Health Improvement Network) and two USA claims-based data sources (HealthCore Integrated Research Database, Medicare). All eligible adult patients newly prescribed saxagliptin (n=110 740) and random samples of up to 10 matched initiators of non-DPP-4 inhibitor OADs within each data source were selected (n=913 384). Outcomes were hospitalized major adverse cardiovascular events (MACE), acute kidney injury (AKI), acute liver failure (ALF), infections, and severe hypersensitivity events, evaluated using diagnostic coding algorithms and medical records. Cox regression was used to determine HRs with 95% CIs for each outcome. Meta-analyses across data sources were performed for each outcome as feasible. There were no increased incidence rates or risk of MACE, AKI, ALF, infection, or severe hypersensitivity reactions among saxagliptin initiators compared with other OAD initiators within any data source. Meta-analyses demonstrated a reduced risk of hospitalization/death from MACE (HR 0.91, 95% CI 0.85 to 0.97) and no increased risk of hospitalization for infection (HR 0.97, 95% CI 0.93 to 1.02) or AKI (HR 0.99, 95% CI 0.88 to 1.11) associated with saxagliptin initiation. ALF and hypersensitivity events were too rare to permit meta-analysis. Saxagliptin initiation was not associated with increased risk of MACE, infection, AKI, ALF, or severe hypersensitivity reactions in clinical practice settings. NCT01086280, NCT01086293, NCT01086319, NCT01086306, and NCT01377935.
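The pooled hazard ratios reported above come from meta-analyses across data sources. A minimal sketch of fixed-effect inverse-variance pooling of hazard ratios follows; this is one common approach, the study's exact method is not specified here, and the input numbers in the usage note are made up.

```python
import math

def pool_hazard_ratios(estimates):
    """Fixed-effect inverse-variance meta-analysis of hazard ratios.

    estimates: list of (hr, ci_low, ci_high) tuples with 95% CIs.
    Returns the pooled HR with its 95% CI.
    """
    z = 1.959964  # 97.5th percentile of the standard normal
    logs, weights = [], []
    for hr, lo, hi in estimates:
        # back out the SE of the log-HR from the CI width
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        logs.append(math.log(hr))
        weights.append(1.0 / se ** 2)
    wsum = sum(weights)
    pooled = sum(w * l for w, l in zip(weights, logs)) / wsum
    se_pooled = math.sqrt(1.0 / wsum)
    return (math.exp(pooled),
            math.exp(pooled - z * se_pooled),
            math.exp(pooled + z * se_pooled))
```

For example, `pool_hazard_ratios([(0.90, 0.80, 1.00), (0.90, 0.80, 1.00)])` returns a pooled HR of 0.90 with a CI narrower than either input, since each study contributes weight proportional to the precision of its log-HR.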
Monitoring the Earth's Atmosphere with the Global IMS Infrasound Network
NASA Astrophysics Data System (ADS)
Brachet, Nicolas; Brown, David; Mialle, Pierrick; Le Bras, Ronan; Coyne, John; Given, Jeffrey
2010-05-01
The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is tasked with monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans nuclear weapon explosions underground, in the oceans, and in the atmosphere. The verification regime includes a globally distributed network of seismic, hydroacoustic, infrasound and radionuclide stations which collect and transmit data to the International Data Centre (IDC) in Vienna, Austria, shortly after the data are recorded at each station. The infrasound network defined in the Protocol of the CTBT comprises 60 infrasound array stations. Each array is built according to the same technical specifications and is typically composed of 4 to 9 sensors with an aperture of 1 to 3 km. At the end of 2000, only one infrasound station was transmitting data to the IDC. Since then, 41 additional stations have been installed and 70% of the infrasound network is currently certified and contributing data to the IDC. This constitutes the first global infrasound network ever built with such a large and uniform distribution of stations. Infrasound data at the IDC are processed at the station level using the Progressive Multi-Channel Correlation (PMCC) method for the detection and measurement of infrasound signals. The algorithm calculates the signal correlation between sensors at an infrasound array. If the signal is sufficiently correlated and consistent over an extended period of time and frequency range, a detection is created. Groups of detections are then categorized according to their propagation and waveform features, and a phase name is assigned for infrasound, seismic or noise detections. The categorization complements the PMCC algorithm to avoid overwhelming the IDC automatic association algorithm with false-alarm infrasound events. Currently, 80 to 90% of the detections are identified as noise by the system.
Although the noise detections are not used to build events in the context of CTBT monitoring, they represent valuable data for other civil applications such as monitoring of natural hazards (volcanic activity, storm tracking) and climate change. Non-noise detections are used in network processing at the IDC along with seismic and hydroacoustic technologies. The arrival phases detected on the three waveform technologies may be combined and used for locating events in an automatically generated bulletin of events. This automatic event bulletin is routinely reviewed by analysts during the interactive review process. However, the fusion of infrasound data with the other waveform technologies has only recently (in early 2010) become part of the IDC operational system, after a software development and testing period that began in 2004. The build-up of the IMS infrasound network, the recent developments of the IDC infrasound software, and the progress accomplished during the last decade in the domain of real-time atmospheric modelling have allowed better understanding of infrasound signals and identification of a growing data set of ground-truth sources. These signals originate from natural or man-made sources. Some of the detected signals are emitted by local or regional phenomena recorded by a single IMS infrasound station: man-made cultural activity, wind farms, aircraft, artillery exercises, ocean surf, thunderstorms, rumbling volcanoes, iceberg calving, aurora, avalanches. Other signals may be recorded by several IMS infrasound stations at larger distances: ocean swell, sonic booms, and mountain-associated waves. Only a small fraction of events meet the event definition criteria, given the Treaty verification mission of the Organization. Candidate event types for the IDC Reviewed Event Bulletin include atmospheric or surface explosions, meteor explosions, rocket launches, signals from large earthquakes and explosive volcanic eruptions.
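PMCC itself works on overlapping time-frequency cells and sub-arrays and also estimates wavefront azimuth and speed; none of that is reproduced here. As a sketch of only the core idea, that a coherent signal correlates across array sensors while wind noise does not, one can threshold the mean pairwise correlation (the threshold and the signal model below are hypothetical):

```python
import math
import random

def pearson(a, b):
    """Zero-lag Pearson correlation of two equal-length traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def array_consistency_detect(channels, threshold=0.5):
    """Declare a detection when the mean pairwise sensor correlation
    exceeds the threshold (a crude stand-in for PMCC's consistency test)."""
    pairs = [(i, j) for i in range(len(channels))
             for j in range(i + 1, len(channels))]
    mean_corr = sum(pearson(channels[i], channels[j])
                    for i, j in pairs) / len(pairs)
    return mean_corr >= threshold, mean_corr
```

Four sensors seeing the same infrasonic wave plus uncorrelated noise yield a mean correlation near 1 and trigger a detection, while four independent noise traces do not. A real implementation would additionally scan time lags to account for wavefront move-out across the array.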
2014-09-18
OSHA is issuing a final rule to update the appendix to its Injury and Illness Recording and Reporting regulation. The appendix contains a list of industries that are partially exempt from requirements to keep records of work-related injuries and illnesses due to relatively low occupational injury and illness rates. The updated appendix is based on more recent injury and illness data and lists industry groups classified by the North American Industry Classification System (NAICS). The current appendix lists industries classified by Standard Industrial Classification (SIC). The final rule also revises the requirements for reporting work-related fatality, injury, and illness information to OSHA. The current regulation requires employers to report work-related fatalities and in-patient hospitalizations of three or more employees within eight hours of the event. The final rule retains the requirement for employers to report work-related fatalities to OSHA within eight hours of the event but amends the regulation to require employers to report all work-related in-patient hospitalizations, as well as amputations and losses of an eye, to OSHA within 24 hours of the event.
NASA Astrophysics Data System (ADS)
Huang, Wei; Wang, Yongjin; Cheng, Hai; Edwards, Richard Lawrence; Shen, Chuan-Chou; Liu, Dianbing; Shao, Qingfeng; Deng, Chao; Zhang, Zhenqiu; Wang, Quan
2016-07-01
We present two isotopic (δ18O and δ13C) sequences of a twin stalagmite from Zhuliuping Cave, southwestern China, with 230Th dates from 14.6 to 4.6 ka. The stalagmite δ18O record characterizes orbital- to decadal-scale variability of Asian summer monsoon (ASM) intensity, with the Holocene optimum period (HOP) between 9.8 and 6.8 ka BP, which is reinforced by the co-varying δ13C data. The large multi-decadal-scale amplitude of the cave δ18O indicates its high sensitivity to climate change. Four centennial-scale weak ASM events during the early Holocene are centered at 11.2, 10.8, 9.1 and 8.2 ka. They can be correlated to cold periods in the northern high latitudes, possibly resulting from rapid dynamics of atmospheric circulation associated with North Atlantic cooling. The 8.2 ka event has an amplitude more than two-thirds that of the Younger Dryas (YD), and is significantly stronger than in other cave records in the Asian monsoon region, likely indicating a more severe dry climate at the cave site. At the end of the YD event, the δ13C record lags the δ18O record by 300-500 yr, suggesting a multi-centennial slow response of vegetation and soil processes to monsoon enhancement.
Rogers, A.M.; Covington, P.A.; Park, R.B.; Borcherdt, R.D.; Perkins, D.M.
1980-01-01
This report presents a collection of Nevada Test Site (NTS) nuclear explosion recordings obtained at sites in the greater Los Angeles, Calif., region. The report includes ground velocity time histories as well as derived site transfer functions. These data have been collected as part of a study to evaluate the validity of using low-level ground motions to predict the frequency-dependent response of a site during an earthquake. For this study 19 nuclear events were recorded at 98 separate locations. Some of these sites recorded more than one of the nuclear explosions, and, consequently, there are a total of 159 three-component station records. The locations of all the recording sites are shown in figures 1–5; the station coordinates and abbreviations are given in table 1. The station addresses are listed in table 2, and the nuclear explosions that were recorded are listed in table 3. The recording sites were chosen on the basis of three criteria: (1) that the underlying geological conditions were representative of conditions over significant areas of the region, (2) that the site was the location of a strong-motion recording of the 1971 San Fernando earthquake, or (3) that more complete geographical coverage was required in that location.
Concorde noise-induced building vibrations for Sully Plantation, Chantilly, Virginia
NASA Technical Reports Server (NTRS)
Mayes, W. H.; Scholl, H. F.; Stephens, D. G.; Holliday, B. G.; Deloach, R.; Holmes, H. K.; Lewis, R. B.; Lynch, J. W.
1976-01-01
A study to assess the noise-induced building vibrations associated with Concorde operations is presented. The approach is to record the levels of induced vibrations and associated indoor/outdoor noise levels in selected homes, historic and other buildings near Dulles and Kennedy International Airports. Presented is a small, representative sample of data recorded at Sully Plantation, Chantilly, Virginia during the period of May 20 through May 28, 1976. Recorded data provide relationships between the vibration levels of walls, floors, windows, and the noise associated with Concorde operations (2 landings and 3 takeoffs), other aircraft, nonaircraft sources, and normal household activities. Results suggest that building vibrations resulting from aircraft operations were proportional to the overall sound pressure levels and relatively insensitive to spectral differences associated with the different types of aircraft. Furthermore, the maximum levels of vibratory response resulting from Concorde operations were higher than those associated with conventional aircraft. The vibrations of nonaircraft events were observed in some cases to exceed the levels resulting from aircraft operations. These nonaircraft events are currently being analyzed in greater detail.
Capturing commemoration: Using mobile recordings within memory research
Birdsall, Carolyn; Drozdzewski, Danielle
2017-01-01
This paper details the contribution of mobile devices to capturing commemoration in action. It investigates the incorporation of audio and sound recording devices, observation, and note-taking into a mobile (auto)ethnographic research methodology, to research a large-scale commemorative event in Amsterdam, the Netherlands. On May 4, 2016, the sounds of a Silent March—through the streets of Amsterdam to Dam Square—were recorded and complemented by video grabs of the march’s participants and onlookers. We discuss how the mixed method enabled a multilevel analysis across visual, textual, and aural layers of the commemorative atmosphere. Our visual data aided in our evaluation of the construction of collective spectacle, while the audio data necessitated that we venture into new analytic territory. Using Sonic Visualiser, we uncovered alternative methods of “reading” landscape by identifying different sound signatures in the acoustic environment. Together, this aural and visual representation of the May 4 events enabled the identification of spatial markers and the temporal unfolding of the Silent March and the national 2 minutes’ silence in Amsterdam’s Dam Square. PMID:29780585
Power Plant Model Validation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables automation of the power plant model validation process using disturbance recordings. The tool uses PMU and SCADA measurements as input information, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; databases of projects (model validation studies), historic events, and power plants; advanced visualization capabilities; and automatic report generation.
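The pass/fail step of such playback validation, comparing a plant's recorded disturbance response with the model's simulated response, can be sketched as below. This is a generic illustration, not the PPMV's actual EPCL/PSLF workflow, and the tolerance is hypothetical.

```python
import math

def rmse(measured, simulated):
    """Root-mean-square error between recorded and simulated responses."""
    if len(measured) != len(simulated):
        raise ValueError("time series must be aligned and of equal length")
    return math.sqrt(sum((m - s) ** 2
                         for m, s in zip(measured, simulated)) / len(measured))

def model_is_valid(measured, simulated, tolerance):
    """Accept the plant model when playback error stays within tolerance."""
    return rmse(measured, simulated) <= tolerance
```

In the real tool the "measured" series would be PMU active/reactive power from a historic grid event and the "simulated" series the GE PSLF play-in result; refined comparisons weight specific oscillation modes rather than a single RMSE number.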
NASA Astrophysics Data System (ADS)
Pavlov, V.; Shatsillo, A.; Kouznetsov, N.; Gazieva, E.
2017-12-01
There is a range of evidence, mainly from sedimentary and volcanic rocks of the Laurentia and Baltica cratons, that argue for the anomalous character of the Ediacaran-Early Cambrian paleomagnetic record. This feature could be linked either to some peculiarities of the paleomagnetic record itself or to some unusual geophysical event that would have taken place around the Proterozoic-Phanerozoic boundary (e.g., true polar wander or nonuniformitarian geomagnetic field behavior). In the latter case, the traces of this event should be observed in Ediacaran-Early Cambrian rocks anywhere there is a possibility to observe a primary paleomagnetic signal. In previous work, we reported results that suggested an anomalous paleomagnetic record in Siberian Ediacaran-Lower Cambrian rocks. Here we present new Siberian data that indicate a very high geomagnetic reversal frequency during this period and the coexistence of two very different paleomagnetic directions. We speculate that these features could be due either to a near-equatorial geomagnetic dipole during the polarity transitions or to alternation between axial and near equatorial dipoles not directly linked with polarity reversals.
Major earthquakes recorded by Speleothems in Midwestern U.S. caves
Panno, S.V.; Lundstrom, C.C.; Hackley, Keith C.; Curry, B. Brandon; Fouke, B.W.; Zhang, Z.
2009-01-01
Historic earthquakes generated by the New Madrid seismic zone represent some of the largest recorded in the United States, yet prehistoric events are recognized only through deformation in late-Wisconsin to Holocene-age, near-surface sediments (liquefaction, monoclinal folding, and changes in river meanders). In this article, we show that speleothems in caves of southwestern Illinois and southeastern Missouri may constitute a previously unrecognized recorder of large earthquakes in the U.S. midcontinent region. The timing of the initiation and regrowth of stalagmites in southwestern Illinois and southeastern Missouri caves is consistent with the historic and prehistoric record of several known seismic events in the U.S. midcontinent region. We conclude that dating the initiation of original stalagmite growth and later postearthquake rejuvenation constitutes a new paleoseismic method that has the potential for being applied to any region around the world in the vicinity of major seismic zones where caves exist. Use of this technique could expand the geographical distribution of paleoseismic data, document prehistoric earthquakes, and help improve interpretations of paleoearthquakes.
Use of Narrative Nursing Records for Nursing Research
Park, Hyeoun-Ae; Cho, InSook; Ahn, Hee-Jung
2012-01-01
To explore the usefulness of narrative nursing records documented using a standardized terminology-based electronic nursing records system, we conducted three different studies on (1) the gaps between the required nursing care time and the actual nursing care time, (2) the practice variations in pressure ulcer care, and (3) the surveillance of adverse drug events. The narrative nursing notes, documented at the point of care using standardized nursing statements, were extracted from the clinical data repository at a teaching hospital in Korea and analyzed. Our findings were: the pediatric and geriatric units showed relatively high staffing needs; the overall incidence rate of pressure ulcers among the intensive-care patients was 15.0%, and the nursing interventions provided for pressure-ulcer care varied depending on nursing units; and at least one adverse drug event was noted in 53.0% of the cancer patients who were treated with cisplatin. A standardized nursing terminology-based electronic nursing record system allowed us to explore answers to various research questions. PMID:24199111
Rupture directivity of microseismic events recorded during hydraulic fracture stimulations.
NASA Astrophysics Data System (ADS)
Urbancic, T.; Smith-Boughner, L.; Baig, A.; Viegas, G.
2016-12-01
We model the dynamics of a complex rupture sequence with four sub-events. These events were recorded during hydraulic fracture stimulations in a gas-bearing shale formation. With force-balance accelerometers and 4.5 Hz and 15 Hz instruments recording the failure history, we study the directivity of the entire rupture sequence and each sub-event. Two models are considered: unilateral and bilateral failures of penny-shaped cracks. From the seismic moment tensors of these sub-events, we consider different potential failure planes and rupture directions. Using numerical wave-propagation codes, we generate synthetic rupture sequences with both unilateral and bilateral ruptures. These are compared to the four sub-events to determine the directionality of the observed failures and the sensitivity of our recording bandwidth and geometry to distinguishing between different rupture processes. The frequency of unilateral and bilateral rupture processes throughout the fracture stimulation is estimated by comparing the directivity characteristics of the modeled sub-events to other high-quality microseismic events recorded during the same stimulation program. Understanding the failure processes of these microseismic events can provide great insight into the changes in the rock mass responsible for these complex rupture processes.
Gaudet, Daniel; Stroes, Erik S; Méthot, Julie; Brisson, Diane; Tremblay, Karine; Bernelot Moens, Sophie J; Iotti, Giorgio; Rastelletti, Irene; Ardigo, Diego; Corzo, Deyanira; Meyer, Christian; Andersen, Marc; Ruszniewski, Philippe; Deakin, Mark; Bruno, Marco J
2016-11-01
Alipogene tiparvovec (Glybera) is a gene therapy product approved in Europe under the "exceptional circumstances" pathway as a treatment for lipoprotein lipase deficiency (LPLD), a rare genetic disease resulting in chylomicronemia and a concomitantly increased risk of acute and recurrent pancreatitis, with potentially lethal outcome. This retrospective study analyzed the frequency and severity of pancreatitis in 19 patients with LPLD up to 6 years after a single treatment with alipogene tiparvovec. An independent adjudication board of three pancreas experts, blinded to patient identification and to pre- or post-gene therapy period, performed a retrospective review of data extracted from the patients' medical records and categorized LPLD-related acute abdominal pain events requiring hospital visits and/or hospitalizations based on the adapted 2012 Atlanta diagnostic criteria for pancreatitis. Both entire disease time period data and data from an equal time period before and after gene therapy were analyzed. Events with available medical record information meeting the Atlanta diagnostic criteria were categorized as definite pancreatitis; events treated as pancreatitis but with variable levels of laboratory and imaging data were categorized as probable pancreatitis or acute abdominal pain events. A reduction of approximately 50% was observed in all three categories of the adjudicated post-gene therapy events. Notably, no severe pancreatitis and only one intensive care unit admission was observed in the post-alipogene tiparvovec period. However, important inter- and intraindividual variations in the pre- and post-gene therapy incidence of events were observed. There was no relationship between the posttreatment incidence of events and the number of LPL gene copies injected, the administration of immunosuppressive regimen or the percent triglyceride decrease achieved at 12 weeks (primary end point in the prospective clinical studies). 
Although a causal relationship cannot be established and despite the limited number of individuals evaluated, results from this long-term analysis suggest that alipogene tiparvovec was associated with a lower frequency and severity of pancreatitis events, and a consequent overall reduction in health care resource use up to 6 years posttreatment.
Combining Small-Vertebrate, Marine and Stable-Isotope Data to Reconstruct Past Environments
Rofes, Juan; Garcia-Ibaibarriaga, Naroa; Aguirre, Mikel; Martínez-García, Blanca; Ortega, Luis; Zuluaga, María Cruz; Bailon, Salvador; Alonso-Olazabal, Ainhoa; Castaños, Jone; Murelaga, Xabier
2015-01-01
Three very different records are combined here to reconstruct the evolution of environments in the Cantabrian Region during the Upper Pleistocene, covering ~35,000 years. Two of these records come from Antoliñako Koba (Bizkaia, Spain), an exceptional prehistoric deposit comprising 9 chrono-cultural units (Aurignacian to Epipaleolithic). The palaeoecological signal of small-vertebrate communities and red deer stable-isotope data (δ13C and δ15N) from this mainland site are contrasted with marine microfaunal evidence (planktonic and benthic foraminifers, ostracods and δ18O data) gathered in the southern Bay of Biscay. Numerous radiocarbon dates for the Antoliña sequence made it possible to compare the different proxies with each other and with other well-known North Atlantic records. Cooling and warming events recorded regionally mostly coincide with the climatic evolution of the Upper Pleistocene in the Northern Hemisphere. PMID:26391668
Gift-giving and network structure in rural China: utilizing long-term spontaneous gift records.
Chen, Xi
2014-01-01
The tradition of keeping written records of gifts received during household ceremonies in many countries offers researchers an underutilized means of data collection for social network analysis. This paper first summarizes unique features of gift record data that circumvent five prevailing sampling and measurement issues in the literature, and we discuss their advantages over existing studies at both the individual level and the dyadic link level using previous data sources. We then document our research project in rural China that implements a multiple-wave census-type household survey and a long-term gift record collection. The pattern of gift-giving in major household social events and its recent escalation is analyzed. There are significantly positive correlations between gift network centrality and various forms of informal insurance. Finally, economic inequality and a competitive marriage market are among the main demographic and socioeconomic determinants of the observed gift network structure.
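From rows of such gift records one can build the gift network and compute, for example, the degree centrality that the paper correlates with informal insurance. A minimal sketch with made-up household IDs and amounts (the paper's own measures may differ):

```python
from collections import defaultdict

# Hypothetical gift-record rows: (giver, recipient, amount) for illustration only.
gift_records = [
    ("hh_01", "hh_02", 100), ("hh_03", "hh_02", 80),
    ("hh_02", "hh_01", 60),  ("hh_04", "hh_02", 120),
    ("hh_01", "hh_03", 50),
]

def degree_centrality(records):
    """Normalized degree centrality on the undirected gift network:
    number of distinct gift partners divided by (n - 1)."""
    neighbors = defaultdict(set)
    for giver, recipient, _amount in records:
        neighbors[giver].add(recipient)
        neighbors[recipient].add(giver)
    n = len(neighbors)
    return {hh: len(partners) / (n - 1) for hh, partners in neighbors.items()}
```

Here household hh_02, which exchanges gifts with all three other households, gets centrality 1.0; such scores could then be related to measures of informal insurance at the household level.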
Late Eocene impact events recorded in deep-sea sediments
NASA Technical Reports Server (NTRS)
Glass, B. P.
1988-01-01
Raup and Sepkoski proposed that mass extinctions have occurred every 26 Myr during the last 250 Myr. In order to explain this 26 Myr periodicity, it was proposed that the mass extinctions were caused by periodic increases in cometary impacts. One method to test this hypothesis is to determine if there were periodic increases in impact events (based on crater ages) that correlate with mass extinctions. A way to test the hypothesis that mass extinctions were caused by periodic increases in impact cratering is to look for evidence of impact events in deep-sea deposits. This method allows direct observation of the temporal relationship between impact events and extinctions as recorded in the sedimentary record. There is evidence in the deep-sea record for two (possibly three) impact events in the late Eocene. The younger event, represented by the North American microtektite layer, is not associated with an Ir anomaly. The older event, defined by the cpx spherule layer, is associated with an Ir anomaly. However, neither of the two impact events recorded in late Eocene deposits appears to be associated with an unusual number of extinctions. Thus there is little evidence in the deep-sea record for an impact-related mass extinction in the late Eocene.
Annual Hanford Seismic Report for Fiscal Year 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.
2009-12-31
The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. During FY 2009, the Hanford Seismic Network recorded nearly 3000 triggers on the seismometer system, which included over 1700 seismic events in the southeast Washington area and an additional 370 regional and teleseismic events. There were 1648 events determined to be local earthquakes relevant to the Hanford Site. Nearly all of these earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. Recording of the Wooded Island events began in January with over 250 events per month through June 2009. The frequency of events decreased starting in July 2009 to approximately 10-15 events per month through September 2009. Most of the events were considered minor (coda-length magnitude [Mc] less than 1.0) with 47 events in the 2.0-3.0 range. The estimated depths of the Wooded Island events are shallow (averaging less than 1.0 km deep) with a maximum depth estimated at 2.3 km. This places the Wooded Island events within the Columbia River Basalt Group (CRBG).
The highest-magnitude event (Mc 3.0) occurred on May 13, 2009 within the Wooded Island swarm at a depth of 1.8 km. With regard to the depth distribution, 1613 earthquakes were located at shallow depths (less than 4 km, most likely in the Columbia River basalts), 18 earthquakes were located at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and 17 earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, 1630 earthquakes were located in swarm areas and 18 earthquakes were classified as random events. The low magnitude of the Wooded Island events has made them undetectable to all but local area residents. However, some Hanford employees working within a few miles of the area of highest activity and individuals living in homes directly across the Columbia River from the swarm center have reported feeling many of the larger magnitude events. The Hanford Strong Motion Accelerometer (SMA) network was triggered numerous times by the Wooded Island swarm events. The maximum acceleration value recorded by the SMA network was approximately 3 times lower than the reportable action level for Hanford facilities (2% g) and no action was required. The swarming is likely due to pressure that has built up, cracking the brittle basalt layers within the Columbia River Basalt Group (CRBG). Similar earthquake “swarms” have been recorded near this same location in 1970, 1975 and 1988. Prior to the 1970s, swarming may have occurred, but equipment was not in place to record those events. Quakes of this limited magnitude do not pose a risk to Hanford cleanup efforts or waste storage facilities. Since swarms of the past did not intensify in magnitude, seismologists do not expect that these events will increase in intensity. However, Pacific Northwest National Laboratory (PNNL) will continue to monitor the activity.
NASA Astrophysics Data System (ADS)
Brendryen, J.; Hannisdal, B.; Haaga, K. A.; Haflidason, H.; Castro, D. D.; Grasmo, K. J.; Sejrup, H. P.; Edwards, R. L.; Cheng, H.; Kelly, M. J.; Lu, Y.
2016-12-01
Abrupt millennial-scale climatic events known as Dansgaard-Oeschger events are a defining feature of Quaternary climate system dynamics in the North Atlantic and beyond. We present a high-resolution multi-proxy record of ocean-ice sheet interactions in the Norwegian Sea spanning the interval between 50 and 150 ka BP. A comparison with low-latitude records indicates a very close connection between high-northern-latitude ocean-ice sheet interactions and large-scale changes in low-latitude atmospheric circulation and hydrology, even on sub-millennial scales. The records are placed on a common, precise radiometric chronology based on correlations to U/Th-dated speleothem records from China and the Alps. This enables a comparison of the records to orbital and other climatically important parameters such as U/Th-dated sea-level data from corals and speleothems. We explore the drive-response relationships in these coupled systems with the information transfer (IT) and convergent cross mapping (CCM) analytical techniques. These methods employ conceptually different approaches to detect the relative strength and directionality of coupling in potentially chaotic and nonlinearly coupled systems. IT is a non-parametric measure of information transfer between data records based on transfer entropy, while CCM relies on delay reconstructions using Takens' theorem. This approach enables us to address how the climate system processes interact and how this interaction is affected by external forcing from, for example, greenhouse gases and orbital variability.
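The transfer entropy underlying the IT measure can be illustrated with a minimal plug-in estimator for binary sequences. This is a hedged sketch, not the authors' implementation: the binarized toy series, the one-step history, and the estimator details are all assumptions.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) for equal-length discrete sequences.

    TE = sum over (y_next, y_now, x_now) of
         p(y_next, y_now, x_now) * log2[ p(y_next | y_now, x_now) / p(y_next | y_now) ]
    """
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles_y = Counter(y[:-1])                     # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n                             # p(y_{t+1}, y_t, x_t)
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y_{t+1} | y_t, x_t)
        p_cond_self = pairs_yy[(y1, y0)] / singles_y[y0]   # p(y_{t+1} | y_t)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# Toy example: y copies x with a one-step lag, so information flows x -> y only.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                      # y_t = x_{t-1}
print(transfer_entropy(x, y))         # close to 1 bit
print(transfer_entropy(y, x))         # close to 0 bits
```

The asymmetry of the two directions is the point of the measure: the estimate is large in the direction of the actual coupling and near zero in the reverse direction.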
The Rurrand Fault, Germany: A Holocene surface rupture and new slip rate estimates
NASA Astrophysics Data System (ADS)
Grützner, Christoph; Fischer, Peter; Reicherter, Klaus
2016-04-01
Very low deformation rates in continental interiors are a challenge for research on active tectonics and seismic hazard. Faults tend to have very long earthquake recurrence intervals, and morphological evidence of surface faulting is often obliterated by erosion and sedimentation. The Lower Rhine Graben in Central Europe is characterized by slow active faults with individual slip rates of well less than 0.1 mm/a. As a consequence, most geodetic techniques fail to record tectonic motions and the morphological expression of the faults is subtle. Although damaging events are known from this region, e.g. the 1755/56 Düren earthquake series, there is no account of surface rupturing events in instrumental and historical records. Owing to the short temporal coverage with respect to the fault recurrence intervals, these records probably fail to depict the maximum possible magnitudes. In this study we used morphological evidence from a 1 m airborne LiDAR survey, near-surface geophysics, and paleoseismological trenching to identify surface rupturing earthquakes at the Rurrand Fault between Cologne and Aachen in western Germany. The LiDAR data allowed us to identify a young fault strand parallel to the already known main fault, with the subtle morphological expression of recent surface faulting. In the paleoseismological trenches we found evidence for two surface rupturing earthquakes. The most recent event occurred in the Holocene, and a previous earthquake probably happened in the last 150 ka. Geophysical data allowed us to estimate a minimum slip rate of 0.03 mm/a from an offset gravel horizon. We estimate paleomagnitudes of MW 5.9-6.8 based on the observed offsets in the trench (<0.5 m per event) and fault scaling relationships. Our data imply that the Rurrand Fault did not creep during the last 150 ka, but rather failed in large earthquakes. These events were much stronger than those known from historical sources.
We are able to show that the Rurrand Fault did not rupture the surface during the Düren 1755/56 seismic crisis and conclude that these events likely occurred on another nearby fault system or did not rupture the surface at all. The very long recurrence interval of 25-65 ka for surface rupturing events illustrates the problems of assessing earthquake hazard in such slowly deforming regions. We emphasize that geological data must be included in seismic hazard and surface rupture hazard assessments in order to obtain a complete picture of a region's seismic potential.
A System for Traffic Violation Detection
Aliane, Nourdine; Fernandez, Javier; Mata, Mario; Bemposta, Sergio
2014-01-01
This paper describes the framework and components of an experimental platform for an advanced driver assistance system (ADAS) aimed at providing drivers with feedback about traffic violations they have committed during their driving. The system is able to detect some specific traffic violations, record data associated with these faults in a local database, and also allow visualization of the spatial and temporal information of these traffic violations on a geographical map using the standard Google Earth tool. The test-bed is mainly composed of two parts: a computer vision subsystem for traffic sign detection and recognition which operates during both day and nighttime, and an event data recorder (EDR) for recording data related to some specific traffic violations. The paper first describes the hardware architecture and then presents the policies used for handling traffic violations. PMID:25421737
New data of the Gakkel Ridge seismicity
NASA Astrophysics Data System (ADS)
Antonovskaya, Galina; Basakina, Irina; Kremenetskaya, Elena
2016-04-01
250 earthquakes were recorded in the Gakkel Ridge area during the period 2012-2014 by the Arkhangelsk seismic network. The magnitude Ml of these earthquakes is 1.5-5.7, and 70% of them have Ml up to 3.0. Seismic events are arranged along a narrow central line of the Mid-Arctic Ridge, and most of the earthquakes are confined to the southern flank of the ridge, presumably reflecting spreading processes. Zones of high seismic activity, which we associate with volcano-tectonic processes, have been identified. Up to 13 events per day were recorded in the Western Volcanic Zone. The largest number of events (75%) is confined to the Sparsely Magmatic Zone. About 30% of all recorded earthquakes with magnitudes above 2.9 have a T-phase. Using spectral-time analysis, we divided the Gakkel Ridge earthquakes into two groups. In the first group, the maximum energy of the earthquake is observed from 1.5 to 10 Hz, with magnitudes Ml 2.50-5.29; these earthquakes are distributed along the Gakkel Ridge. In the second group, the maximum energy is observed from 1.5 to 20 Hz with a clearly expressed high-frequency component, and magnitudes Ml 2.3-3.4; earthquakes of this group occur only in the Sparsely Magmatic Zone. The new seismic data provide unique information about geodynamic processes of the Gakkel Ridge.
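Grouping events by frequency content, as described above, can be sketched with a naive DFT band-energy classifier. This is an illustration only: the 10 Hz split, the 0.2 energy-ratio threshold, and the synthetic signals are assumptions, not the authors' criteria.

```python
import cmath, math

def band_energy(signal, fs, f_lo, f_hi):
    """Energy of `signal` in the band [f_lo, f_hi] Hz via a naive DFT."""
    n = len(signal)
    energy = 0.0
    for k in range(n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            energy += abs(coeff) ** 2
    return energy

def classify(signal, fs=100.0):
    """Group 1: energy concentrated below 10 Hz; group 2: a clear
    high-frequency component above 10 Hz (illustrative thresholds)."""
    low = band_energy(signal, fs, 1.5, 10.0)
    high = band_energy(signal, fs, 10.0, 20.0)
    return 2 if high > 0.2 * low else 1

fs, n = 100.0, 256
t = [i / fs for i in range(n)]
quake1 = [math.sin(2 * math.pi * 5.0 * x) for x in t]              # 5 Hz only
quake2 = [math.sin(2 * math.pi * 5.0 * x)
          + math.sin(2 * math.pi * 15.0 * x) for x in t]           # adds 15 Hz
print(classify(quake1), classify(quake2))
```

In practice a windowed spectrogram (spectral-time analysis) would be used instead of a single whole-record DFT, but the band-energy comparison is the same idea.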
DAS Microseismic and Strain Monitoring During Hydraulic Fracturing
NASA Astrophysics Data System (ADS)
Kahn, D.; Karrenbach, M. H.; Cole, S.; Boone, K.; Ridge, A.; Rich, J.; Langton, D.; Silver, K.
2017-12-01
Hydraulic fracturing operations in unconventional subsurface reservoirs are typically monitored using geophones located either at the surface or in adjacent wellbores. A novel approach to recording hydraulic stimulations utilizes fiber-optic Distributed Acoustic Sensing (DAS). A fiber-optic cable was installed in a treatment well in a subsurface reservoir (Meramec formation). DAS data were recorded during fluid injection into the same fiber-equipped well and also during injection into a nearby treatment well at a distance of 350 m. For both scenarios the DAS sensing array consisted of approximately 1000 channels with fine spatial and temporal sampling and a large sensing aperture. Thus, the full strain wave field is measured along the borehole over its entire length. A variety of physical effects, such as temperature, low-frequency strain, and microseismicity, were measured and correlated with the treatment program during hydraulic fracturing of the wells. These physical effects occur at various frequency scales and produce complementary measurements. Microseismic events in the magnitude range of -2.0 to -0.5 at a maximum distance of 500 m were observed and analyzed for recordings from the fiber-equipped treatment well and also from the neighboring treatment well. The analysis of this DAS data set demonstrates that current fiber-optic sensing technology can provide enough sensitivity to detect a significant number of microseismic events, and that these events can be integrated with temperature and strain measurements for an improved subsurface reservoir description.
Analysis of strong scintillation events by using GPS data at low latitudes
NASA Astrophysics Data System (ADS)
Forte, Biagio; Jakowski, Norbert; Wilken, Volker
2010-05-01
Drifting structures characterised by inhomogeneities in the spatial electron density distribution at ionospheric heights cause scintillation of radio waves propagating through them. The fractional electron density fluctuations and the corresponding scintillation levels may reach extreme values at low latitudes during high solar activity. Strong scintillation events have disruptive effects on a number of technological applications. In particular, operations and services based on GPS signals and receivers may experience severe disruption due to a significant degradation of the signal-to-noise ratio, eventually leading to signal loss of lock. Experimental scintillation data collected in the Asian sector at low latitudes by means of a GPS dual-frequency receiver under moderate solar activity (2006) have been analysed. The GPS receiver was specially modified in firmware in order to record power estimates on the C/A code as well as on the carriers L1 and L2. Strong scintillation activity was recorded in the post-sunset period (saturating S4 and SI as high as 20 dB). An overview of these events is presented, taking into account the scintillation impact on signal intensity, phase, and dynamics. In particular, an interpretation of these events based on a refined scattering theory is provided, with possible consequences for standard scintillation models.
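The S4 index mentioned here has a standard definition: the normalized standard deviation of received signal intensity. A minimal sketch follows; the "quiet" and "strong" synthetic intensity series are illustrative assumptions (an exponential intensity distribution corresponds to Rayleigh fading, for which S4 saturates near 1).

```python
import math, random

def s4_index(intensity):
    """Amplitude scintillation index:
    S4 = sqrt((<I^2> - <I>^2) / <I>^2), with I the detrended signal intensity."""
    n = len(intensity)
    mean_i = sum(intensity) / n
    mean_i2 = sum(i * i for i in intensity) / n
    return math.sqrt((mean_i2 - mean_i ** 2) / mean_i ** 2)

random.seed(1)
quiet = [1.0 + random.gauss(0, 0.05) for _ in range(6000)]     # weak fluctuations
strong = [random.expovariate(1.0) for _ in range(6000)]        # Rayleigh-fading-like
print(round(s4_index(quiet), 2))    # small, around 0.05
print(round(s4_index(strong), 2))   # near 1: saturated scintillation
```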
On decomposing stimulus and response waveforms in event-related potentials recordings.
Yin, Gang; Zhang, Jun
2011-06-01
Event-related potentials (ERPs) reflect the brain activities related to specific behavioral events, and are obtained by averaging across many trial repetitions with individual trials aligned to the onset of a specific event, e.g., the onset of the stimulus (s-aligned) or the onset of the behavioral response (r-aligned). However, the s-aligned and r-aligned ERP waveforms do not purely reflect, respectively, the underlying stimulus (S-) or response (R-) component waveform, due to their cross-contamination in the recorded ERP waveforms. Zhang [J. Neurosci. Methods, 80, pp. 49-63, 1998] proposed an algorithm to recover the pure S-component waveform and the pure R-component waveform from the s-aligned and r-aligned ERP average waveforms; however, due to the nature of this inverse problem, a direct solution is sensitive to noise that disproportionately affects low-frequency components, hindering the practical implementation of this algorithm. Here, we apply the Wiener deconvolution technique to deal with noise in input data, and investigate a Tikhonov regularization approach to obtain a stable solution that is robust against variance in the sampling of the reaction-time distribution (when the number of trials is low). Our method is demonstrated using data from a Go/NoGo experiment on image classification and recognition.
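Zhang's full two-channel decomposition is more involved, but the Tikhonov-regularized deconvolution step can be shown in a simplified single-channel sketch: recover a component waveform that has been smeared by a reaction-time distribution. The circular convolution, the flat 5-sample kernel, and the regularization weight are assumptions for illustration, not the paper's actual pipeline.

```python
import cmath, math, random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def tikhonov_deconvolve(y, h, lam):
    """Recover x from y = h (*) x + noise (circular convolution), using
    X = conj(H) * Y / (|H|^2 + lam); lam trades noise amplification for bias."""
    Y, H = dft(y), dft(h)
    X = [H[k].conjugate() * Y[k] / (abs(H[k]) ** 2 + lam) for k in range(len(y))]
    return idft(X)

n = 64
x_true = [0.0] * n
x_true[10] = 1.0                       # a single sharp component peak
h = [0.0] * n
for j in range(5):
    h[j] = 0.2                         # flat 5-sample "reaction time" kernel
random.seed(2)
y = [sum(h[j] * x_true[(t - j) % n] for j in range(5)) + random.gauss(0, 0.005)
     for t in range(n)]
x_hat = tikhonov_deconvolve(y, h, lam=1e-3)
peak = max(range(n), key=lambda t: x_hat[t])
print(peak)   # the smeared peak is recovered at (or next to) index 10
```

Without the `lam` term the division blows up at frequencies where the kernel spectrum is near zero, which is exactly the noise sensitivity the abstract describes.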
Sundvall, Erik; Nyström, Mikael; Forss, Mattias; Chen, Rong; Petersson, Håkan; Ahlfeldt, Hans
2007-01-01
This paper describes selected earlier approaches to graphically relating events to each other and to time; some new combinations are also suggested. These are then combined into a unified prototyping environment for visualization and navigation of electronic health records. Google Earth (GE) is used for handling display and interaction of clinical information stored using openEHR data structures and 'archetypes'. The strength of the approach comes from GE's sophisticated handling of detail levels, from coarse overviews to fine-grained details, which has been combined with linear, polar and region-based views of clinical events related to time. The system should be easy to learn since all the visualization styles can use the same navigation. The structured and multifaceted approach to handling time that is possible with archetyped openEHR data lends itself well to visualization, and integration with openEHR components is provided in the environment.
Infrasonic emissions from local meteorological events: A summary of data taken throughout 1984
NASA Technical Reports Server (NTRS)
Zuckerwar, A. J.
1986-01-01
Records of infrasonic signals, propagating through the Earth's atmosphere in the frequency band 2 to 16 Hz, were gathered on a three-microphone array at Langley Research Center throughout the year 1984. Digital processing of these records fulfilled three functions: time delay estimation, based on an adaptive filter; source location, determined from the time delay estimates; and source identification, based on spectral analysis. Meteorological support was provided by significant meteorological advisories, lightning locator plots, and daily reports from the Air Weather Service. The infrasonic data are organized into four characteristic signatures, one of which is believed to contain emissions from local meteorological sources. This class of signature prevailed only on those days when major global meteorological events appeared in or near the eastern United States. Eleven case histories are examined. Practical application of the infrasonic array in a low-level wind shear alert system is discussed.
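The report used an adaptive filter for time delay estimation; a common alternative, shown here as a hedged sketch, is plain cross-correlation between two microphone channels (the synthetic Gaussian pulses are illustrative). The estimated lag, together with the array geometry and sound speed, is what feeds the source-location step.

```python
import math

def time_delay(a, b):
    """Lag (in samples) at which b best matches a, via cross-correlation;
    a positive result means b lags a."""
    n = len(a)
    best_lag, best_corr = 0, float("-inf")
    for lag in range(-n // 2, n // 2):
        corr = sum(a[t] * b[t + lag] for t in range(n) if 0 <= t + lag < n)
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag

# Synthetic infrasonic pulse arriving 7 samples later at the second microphone.
pulse = [math.exp(-((t - 30) ** 2) / 20.0) for t in range(100)]
delayed = [math.exp(-((t - 37) ** 2) / 20.0) for t in range(100)]
print(time_delay(pulse, delayed))   # 7
```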
Sources of Infrasound events listed in IDC Reviewed Event Bulletin
NASA Astrophysics Data System (ADS)
Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick
2017-04-01
Until 2003, two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. In the beginning of 2010 the infrasound technology was reintroduced to IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Example sources of REB events detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile 2015) and large surface explosions (e.g. Tianjin, China 2015). Quarry blasts (e.g. Zheleznogorsk) and large earthquakes (e.g. Italy 2016) belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations to quarry blast events may help to obtain a better source location. The role of IDC analysts is to verify and improve the location of events detected by the automatic system and to add events which were missed in the automatic process. Open source materials may help to identify the nature of some events. Well recorded examples may be added to the Reference Infrasound Event Database to help in the analysis process. This presentation will provide examples of events generated by different sources which were included in the IDC bulletins.
Strong-Motion Program report, January-December 1985
Porcella, R. L.
1989-01-01
This Program Report contains preliminary information on the nature and availability of strong-motion data recorded by the U.S. Geological Survey (USGS). The Strong-Motion Program is operated by the USGS in cooperation with numerous Federal, State, and local agencies and private organizations. Major objectives of this program are to record both strong ground motion and the response of various types of engineered structures during earthquakes, and to disseminate this information and data to the international earthquake-engineering research and design community. This volume contains a summary of the accelerograms recovered from the USGS National Strong-Motion Instrumentation Network during 1985, summaries of recent strong-motion publications, notes on the availability of digitized data, and general information related to the USGS and other strong-motion programs. The data summary in table 1 contains information on all USGS accelerograms recovered (though not necessarily recorded) during 1985; event data are taken from "Preliminary Determination of Epicenters," published by the USGS.
Matsumura, Chikako; Chisaki, Yugo; Sakimoto, Satoko; Sakae, Honoka; Yano, Yoshitaka
2018-01-01
Purpose We aimed to examine the risk factors, time of onset, incidence rates, and outcomes of thromboembolic events induced by bevacizumab in patients with cancer using the Japanese Adverse Drug Event Report (JADER) database of the Pharmaceuticals and Medical Devices Agency. Methods Adverse event data recorded in the JADER database between January 2004 and January 2015 were used. After screening the data using the generic drug name bevacizumab, patient data were classified into two groups by age and five groups by cancer type. Histories of disorders were also categorized. Outcomes of arterial and venous thromboembolic events were classified as "favorable" or "unfavorable". Results In total, 6076 patients were reported to have developed adverse events during the sample period, of whom 233 and 453 developed arterial and venous thromboembolic events, respectively. Logistic analysis suggested that the presence of cancer was a significant risk factor for both arterial and venous thromboembolic events. Age (≥70 years) and a history of either hypertension or diabetes mellitus were also risk factors for arterial thromboembolic events. Median cumulative times of onset for arterial and venous thromboembolic events were 60 and 80 days, respectively, and were not significantly different by the log-rank test. By the chi-square test, the rate of unfavorable outcomes was found to be higher after arterial thromboembolic events than after venous thromboembolic events. Conclusion Thromboembolism is a leading cause of mortality in patients with cancer. Patients should be monitored for symptoms of thromboembolic events right from the initial stages of bevacizumab treatment.
Longitudinal medical records as a complement to routine drug safety signal analysis†
Watson, Sarah; Sandberg, Lovisa; Johansson, Jeanette; Edwards, I. Ralph
2015-01-01
Purpose To explore whether and how longitudinal medical records could be used as a source of reference in the early phases of signal detection and analysis of novel adverse drug reactions (ADRs) in a global pharmacovigilance database. Methods Drug and ADR combinations from the routine signal detection process of VigiBase® in 2011 were matched to combinations in The Health Improvement Network (THIN). The number and type of drugs and ADRs from the data sets were investigated. For unlabelled combinations, a graphical display of longitudinal event patterns (chronographs) in THIN was inspected to determine if the pattern supported the VigiBase combination. Results Of 458 combinations in the VigiBase data set, 190 matched corresponding combinations in THIN (after excluding drugs with fewer than 100 prescriptions in THIN). Eighteen percent of the VigiBase and 9% of the matched THIN combinations referred to new drugs reported with serious reactions. Of the 112 unlabelled combinations matched to THIN, 52 chronographs were inconclusive, mainly because of lack of data; 34 lacked any outstanding pattern around the time of prescription; 24 had an elevation of events in the pre-prescription period, hence weakening the suspicion of a drug relationship; two had an elevated pattern of events exclusively in the post-prescription period that, after review of individual patient histories, did not support an association. Conclusions Longitudinal medical records were useful in understanding the clinical context around a drug and suspected ADR combination and the probability of a causal relationship. A drawback was the paucity of data for newly marketed drugs with serious reactions. © 2015 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons, Ltd. PMID:25623045
Correlation of lithologic and sonic logs from the COST No. B-2 well with seismic reflection data
King, K.C.
1979-01-01
The purpose of this study was to correlate events recorded on seismic records with changes in lithology recorded from sample descriptions from the Continental Offshore Stratigraphic Test (COST) No. B-2 well. The well is located on the U.S. mid-Atlantic Outer Continental Shelf about 146 km east of Atlantic City, N.J. (see location map). Lithologic data are summarized from the sample descriptions of Smith and others (1976). Sonic travel times were read at 0.15 m intervals in the well using a long-space sonic logging tool. Interval velocities, reflection coefficients and a synthetic seismogram were calculated from the sonic log.
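The processing chain described here (sonic travel times → interval velocities → reflection coefficients → synthetic seismogram) can be sketched in a few lines. This is a minimal illustration under stated assumptions: constant density, a hypothetical two-layer velocity step, and a Ricker wavelet, not the COST No. B-2 processing itself.

```python
import math

def reflection_coefficients(velocities, density=2400.0):
    """Normal-incidence reflection coefficient at each interface:
    R = (Z2 - Z1) / (Z2 + Z1), with acoustic impedance Z = density * velocity."""
    z = [density * v for v in velocities]
    return [(z[i + 1] - z[i]) / (z[i + 1] + z[i]) for i in range(len(z) - 1)]

def ricker(freq, dt, half_width):
    """Samples of a Ricker wavelet, a common choice of source wavelet."""
    out = []
    for i in range(-half_width, half_width + 1):
        t = i * dt
        a = (math.pi * freq * t) ** 2
        out.append((1 - 2 * a) * math.exp(-a))
    return out

def synthetic_seismogram(velocities, freq=30.0, dt=0.002, half_width=25):
    rc = reflection_coefficients(velocities)
    w = ricker(freq, dt, half_width)
    trace = [0.0] * (len(rc) + len(w) - 1)
    for i, r in enumerate(rc):            # convolve reflectivity with wavelet
        for j, wj in enumerate(w):
            trace[i + j] += r * wj
    return trace

# Interval velocities (m/s) from a hypothetical sonic log: one lithology step.
vels = [2000.0] * 40 + [3000.0] * 40
trace = synthetic_seismogram(vels)
print(max(trace))   # 0.2: the reflection at the velocity step
```

The single spike in the reflectivity series (R = 0.2 at the 2000 → 3000 m/s interface) is what would be correlated against an event on the field seismic record.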
NASA Astrophysics Data System (ADS)
Wadey, M. P.; Brown, J. M.; Haigh, I. D.; Dolphin, T.; Wisse, P.
2015-10-01
The extreme sea levels and waves experienced around the UK's coast during the 2013/14 winter caused extensive coastal flooding and damage. Coastal managers seek to place such extremes in relation to the anticipated standards of flood protection, and the long-term recovery of the natural system. In this context, return periods are often used as a form of guidance. This paper provides these levels for the winter storms, and discusses their application to the given data sets for two UK case study sites: Sefton, northwest England, and Suffolk, east England. Tide gauge records and wave buoy data were used to compare the 2013/14 storms with return periods from a national data set, and also joint probabilities of sea level and wave heights were generated, incorporating the recent events. The 2013/14 high waters and waves were extreme due to the number of events, as well as the extremity of the 5 December 2013 "Xaver" storm, which had a high return period at both case study sites. The national-scale impact of this event was due to its coincidence with spring high tide at multiple locations. Given that this event is such an outlier in the joint probability analyses of these observed data sets, and that the season saw several events in close succession, coastal defences appear to have provided a good level of protection. This type of assessment could in the future be recorded alongside defence performance and upgrade. Ideally other variables (e.g. river levels at estuarine locations) would also be included, and with appropriate offsetting for local trends (e.g. mean sea-level rise) so that the storm-driven component of coastal flood events can be determined. This could allow long-term comparison of storm severity, and an assessment of how sea-level rise influences return levels over time, which is important for consideration of coastal resilience in strategic management plans.
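Empirical return periods of the kind used as guidance here are commonly estimated by ranking annual maxima with a plotting-position formula. A minimal sketch follows; the Weibull plotting position T = (n + 1) / rank and the sea-level values are illustrative assumptions, not the paper's national data set.

```python
def return_periods(annual_maxima):
    """Empirical return period (years) for each observed annual maximum,
    using the Weibull plotting position T = (n + 1) / rank."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    return {value: (n + 1) / (rank + 1) for rank, value in enumerate(ranked)}

# Hypothetical annual-maximum sea levels (m) from a 9-year tide gauge record.
levels = [2.1, 2.4, 2.0, 2.8, 2.2, 2.5, 2.3, 2.6, 3.1]
rp = return_periods(levels)
print(rp[3.1])   # 10.0: the record's largest event
print(rp[2.1])   # 1.25
```

A short record caps the largest estimable return period at about n + 1 years, which is why outliers like the 5 December 2013 event are so hard to place and why joint-probability methods and longer syntheses are brought in.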
Merino-Sáinz, Izaskun; Torralba-Burrial, Antonio; Anadón, Araceli
2014-01-01
In this study, we analyse the relevance of harvestmen distribution data derived from opportunistic, unplanned, and non-standardised collection events in an area in the north of the Iberian Peninsula. Using specimens deposited in the BOS Arthropod Collection at the University of Oviedo, we compared these data with data from planned, standardised, and periodic collections with pitfall traps in several locations in the same area. The Arthropod Collection, begun in 1977, includes specimens derived from both sampling types, and its recent digitisation allows for this type of comparative analysis. Therefore, this is the first data-paper employing a hybrid approach, wherein subset metadata are described alongside a comparative analysis. The full dataset can be accessed through Spanish GBIF IPT at http://www.gbif.es:8080/ipt/archive.do?r=Bos-Opi, and the metadata of the unplanned collection events at http://www.gbif.es:8080/ipt/resource.do?r=bos-opi_unplanned_collection_events. We have mapped the data on the 18 harvestmen species included in the unplanned collections and provided records for some species in six provinces for the first time. We have also provided the locations of Phalangium opilio in eight provinces without published records. These results highlight the importance of digitising data from unplanned biodiversity collections, as well as those derived from planned collections, especially in scarcely studied groups and areas.
Szőllősi, Ágnes; Keresztes, Attila; Conway, Martin A; Racsmány, Mihály
2015-01-01
Recording the events of a day in a diary may help improve their later accessibility. An interesting question is whether improvements in long-term accessibility will be greater if the diary is completed at the end of the day or, after a period of sleep, the following morning. We investigated this question using an internet-based diary method. On each of five days, participants (n = 109) recorded autobiographical memories for that day or for the previous day. Recording took place either in the morning or in the evening. Following a 30-day retention interval, the diary events were freely recalled. We found that participants who recorded their memories in the evening before sleep had the best memory performance. These results suggest that the time of reactivation and recording of recent autobiographical events has a significant effect on the later accessibility of those diary events. We discuss our results in the light of related findings that show a beneficial effect of reduced interference during sleep on memory consolidation and reconsolidation.
The fossil record of evolution: Data on diversification and extinction
NASA Technical Reports Server (NTRS)
Sepkoski, J. J., Jr.
1986-01-01
Synoptic studies of the fossil record of complex life on Earth indicate increasingly that extinction, and especially mass extinction, were extremely important driving forces in the history of life. Analysis of a new compilation of geologic ranges for 25,000 genera of marine animals suggests that extinction events were much more frequent in occurrence and variable in magnitude than previously suspected. At least 30 well-documented and potential mass extinctions were identified in the dataset. The most recent events, distributed over the interval from 260 Ma to the present, exhibit a stationary periodicity of 26.1 ± 1 Ma, implicating a cosmological forcing mechanism. Earlier events, especially in the 575 to 450 Ma interval, are more frequent, possibly indicating either a breakdown of periodicity in the more distant past, an as yet undemonstrated diminution of the period length, or frequent aperiodic terrestrial perturbations of a less stable biota superimposed upon the cosmological periodicity.
Neurological and cardiac complications in a cohort of children with end-stage renal disease.
Albaramki, Jumana H; Al-Ammouri, Iyad A; Akl, Kamal F
2016-05-01
Adult patients with chronic kidney disease are at risk of major neurologic and cardiac complications. The purpose of this study is to review the neurological and cardiac complications in children with end-stage renal disease (ESRD). A retrospective review of medical records of children with ESRD at Jordan University Hospital was performed. All neurological and cardiac events were recorded and analyzed. Data of a total of 68 children with ESRD presenting between 2002 and 2013 were reviewed. Neurological complications occurred in 32.4%; seizures were the most common event. Uncontrolled hypertension was the leading cause of neurological events. Cardiac complications occurred in 39.7%, the most common being pericardial effusion. Mortality from neurological complications was 45%. Neurological and cardiac complications occurred in around a third of children with ESRD with a high mortality rate. More effective control of hypertension, anemia, and intensive and gentle dialysis are needed.
NASA Astrophysics Data System (ADS)
Green, D. N.; Neuberg, J.
2005-04-01
In March 2004, during a period of no magma extrusion at Soufrière Hills volcano, Montserrat, an explosive event occurred with little precursory activity. Recorded broadband seismic signals ranged from an ultra-long-period signal with a dominant period of 120 s to impulsive, short-duration events containing frequencies up to 30 Hz. Synthetic displacement functions were fit to the long-period data after application of the seismometer response. These indicate a shallow collapse of the volcanic edifice occurred, initiated ~300 m below the surface, lasting ~100 s. Infrasonic tremor and pulses were also recorded in the 1-20 Hz range. The high-frequency seismicity and infrasound are interpreted as the subsequent collapse of a gravitationally unstable buttress of remnant dome material which impacted upon the edifice surface. This unique dataset demonstrates the benefits of deploying multi-parameter stations equipped with broadband instruments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.
The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 23 local earthquakes during the third quarter of FY 2010. Sixteen earthquakes were located at shallow depths (less than 4 km), five earthquakes at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and two earthquakes were located at depths greater than 9 km, within the basement. Geographically, twelve earthquakes were located in known swarm areas, three earthquakes occurred near a geologic structure (the Saddle Mountain anticline), and eight earthquakes were classified as random events. The highest magnitude event (3.0 Mc) was recorded on May 8, 2010 at a depth of 3.0 km, with its epicenter located near the Saddle Mountain anticline. Later in the quarter (May 24 and June 28) two additional earthquakes were recorded at nearly the same location. These events are not considered unusual in that earthquakes have been previously recorded at this location, for example, in October 2006 (Rohay et al., 2007).
Six earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this quarter were a continuation of the swarm events observed during the 2009 and 2010 fiscal years and reported in previous quarterly and annual reports (Rohay et al., 2009a, 2009b, 2009c, 2010a, and 2010b). All events were considered minor (coda-length magnitude [Mc] less than 1.0), with a maximum depth estimated at 1.7 km. Based upon this quarter's activity, it is likely that the Wooded Island swarm has subsided. Pacific Northwest National Laboratory (PNNL) will continue to monitor for activity at this location.
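The depth bins used in the report above (shallow below 4 km, intermediate 4-9 km in the pre-basalt sediments, deep below 9 km within the basement) amount to a simple classification rule. The sketch below is illustrative only and is not part of the HSAP software; the depth values are made up.

```python
from collections import Counter

def classify_depth(depth_km: float) -> str:
    """Classify a hypocentre by depth, following the report's bins:
    shallow (< 4 km), intermediate (4-9 km, pre-basalt sediments),
    deep (> 9 km, within the basement)."""
    if depth_km < 4.0:
        return "shallow"
    if depth_km <= 9.0:
        return "intermediate"
    return "deep"

# Tally a quarter's catalogue from a list of depths (illustrative values)
depths = [1.7, 3.0, 2.5, 5.5, 10.2]
counts = Counter(classify_depth(d) for d in depths)
```

A real catalogue would carry magnitude and location alongside depth, but the binning logic is the same.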
Adverse events in British hospitals: preliminary retrospective record review
Vincent, Charles; Neale, Graham; Woloshynowych, Maria
2001-01-01
Objectives To examine the feasibility of detecting adverse events through record review in British hospitals and to make preliminary estimates of the incidence and costs of adverse events. Design Retrospective review of 1014 medical and nursing records. Setting Two acute hospitals in Greater London area. Main outcome measure Number of adverse events. Results 110 (10.8%) patients experienced an adverse event, with an overall rate of adverse events of 11.7% when multiple adverse events were included. About half of these events were judged preventable with ordinary standards of care. A third of adverse events led to moderate or greater disability or death. Conclusions These results suggest that adverse events are a serious source of harm to patients and a large drain on NHS resources. Some are major events; others are frequent, minor events that go unnoticed in routine clinical care but together have massive economic consequences. PMID:11230064
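The headline figures in the record-review abstract above (110 of 1014 patients, i.e. 10.8%) are simple proportions; a minimal sketch of the rate calculation, with the helper name chosen for illustration:

```python
def incidence_percent(events: int, records: int) -> float:
    """Percentage of reviewed records affected by at least one event."""
    return 100.0 * events / records

patients_with_event = 110
total_records = 1014
rate = incidence_percent(patients_with_event, total_records)  # ~10.8%
```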
NASA Astrophysics Data System (ADS)
Dostal, P.; Seidel, J.; Imbery, F.
2010-09-01
A 500-year climate reconstruction of Southwest Germany based on documentary and direct data, with a special focus on highly resolved reconstructed extreme rain events. Against the background of an increasing world population and the changes this is causing to the Earth, including increasing industrialisation and rising greenhouse gas emissions, it is indispensable to differentiate between natural and anthropogenic climate changes. This applies equally to global and regional climates. Because the weather measurement series in the upper Rhine valley go back at most 150 years, these data cannot capture long-term climate fluctuations; the current climate, for example, is embedded in large-scale climate cycles lasting thousands of years. To describe these changes accurately, it is necessary to reconstruct the climate beyond the instrumental record. With the application of direct and indirect data (proxy data), a climate reconstruction is attempted for the TriRhena region. Documentary data make it possible to reconstruct the climate prior to instrumental measurements. These historical records comprise, for example, weather descriptions and information about the wine harvest and other agricultural products, as well as their price fluctuations. Using these data it is possible to derive meteorological parameters, creating an index of air temperature and precipitation values. Climate is an integration of weather, and it is therefore worthwhile to focus also on single weather events of interest. Extreme events in particular can contribute to the thesis "learning from the past for a better future". The aim of the research is to identify extreme flood events of the past 500 years and apply them as a basis for further analysis, for example as a contribution to improving current flood hazard maps.
The data which will be presented were extracted from historical records such as local annuals and chronologies from 1500-1900 and supplemented by instrumental observations since 1755.
Flexible data-management system
NASA Technical Reports Server (NTRS)
Pelouch, J. J., Jr.
1977-01-01
Combined ASRDI Data-Management and Analysis Technique (CADMAT) is a system of computer programs and procedures that can be used to conduct data-management tasks. The system was developed specifically for use by scientists and engineers who are confronted with the management and analysis of large quantities of data organized into records of events and parametric fields. CADMAT is particularly useful when data are continually accumulated, such as when the need for retrieval and analysis is ongoing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... dynamic time-series data during the time period just prior to a crash event (e.g., vehicle speed vs. time... EDR data in a temporary, volatile storage medium where it is continuously updated at regular time..., along the lateral axis, starting from crash time zero and ending at 0.25 seconds, recorded every 0.01...
Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording
ERIC Educational Resources Information Center
Mayer, Kimberly L.; DiGennaro Reed, Florence D.
2013-01-01
Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…
14 CFR 221.300 - Discontinuation of electronic tariff system.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Discontinuation of electronic tariff system... of electronic tariff system. In the event that the electronic tariff system is discontinued, or the source of the data is changed, or a filer discontinues its business, all electronic data records prior to...
14 CFR 221.300 - Discontinuation of electronic tariff system.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Discontinuation of electronic tariff system... of electronic tariff system. In the event that the electronic tariff system is discontinued, or the source of the data is changed, or a filer discontinues its business, all electronic data records prior to...
14 CFR 221.300 - Discontinuation of electronic tariff system.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Discontinuation of electronic tariff system... of electronic tariff system. In the event that the electronic tariff system is discontinued, or the source of the data is changed, or a filer discontinues its business, all electronic data records prior to...
14 CFR 221.300 - Discontinuation of electronic tariff system.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Discontinuation of electronic tariff system... of electronic tariff system. In the event that the electronic tariff system is discontinued, or the source of the data is changed, or a filer discontinues its business, all electronic data records prior to...
Inversion of Orkney M5.5 earthquake South Africa using strain meters at very close distances
NASA Astrophysics Data System (ADS)
Yasutomi, T.; Mori, J. J.; Yamada, M.; Ogasawara, H.; Okubo, M.; Ogasawara, H.; Ishida, A.
2017-12-01
The largest event recorded in a South African gold mining region, a M5.5 earthquake, took place near Orkney on 5 August 2014. The mainshock and aftershocks were recorded by 46 geophones at 2-3 km depth, 3 Ishii borehole strainmeters at about 2.9 km depth, and 17 surface strong-motion instruments at close distances. The upper edge of the planar distribution of aftershock activity dips almost vertically and was only several hundred meters below the sites where the strainmeters were installed. In addition to the seismic data, drilling across this fault is now in progress (June 2017 to December 2017) and will contribute valuable geological and stress information. Although the geophone data were saturated during the mainshock, the strainmeters recorded clear near-field waveforms. We model the source of the M5.5 mainshock using the near-field strainmeter data. Two of the strainmeters are co-located at 2.8 km depth; the remaining one is at 2.9 km depth, only 150 m away. The instrument at 2.9 km depth recorded a large static strain, whereas those at 2.8 km depth recorded a static strain three to four times smaller, indicating that the distance between the M5.5 fault and the 2.9 km strainmeter is of the order of a few hundred meters. The strain Green's functions were calculated assuming an infinite medium and using a finite-difference method. We use small aftershocks to verify the Green's functions; matching the waveforms for the small events validates the Green's functions used for the mainshock inversion. We present a model of the source rupture using these strain data. The near-field data provide good resolution of the nearby earthquake rupture. There are two large subevents, one near the hypocenter and a second several hundred meters to the west.
The Absence of Remotely Triggered Seismicity in Japan from 1997 to 2002
NASA Astrophysics Data System (ADS)
Wakefield, R. H.; Brodsky, E. E.
2003-12-01
Observations of increased seismicity following the Landers, Hector Mine, Izmit, and Denali earthquakes suggest that remote seismic triggering occurs in geothermal locations at distances as great as 3150 km. This study attempts to determine whether the same effects occur in Japan, a geothermal region of high seismicity. For the period 1997 to 2002, we searched for significant increases in seismicity levels following earthquakes with Mw >= 6.5 at distances larger than conventionally associated with aftershocks. Additionally, we examined available waveform data in order to detect uncataloged events hidden in the coda of the mainshock. Five events had associated waveform data: the March 24, 2001 Geiyo, Mw = 6.8; March 28, 2000 Volcano Islands, Mw = 7.6; July 30, 2000 Honshu, Mw = 6.5; October 6, 2000 Tottori, Mw = 6.7; and January 28, 1999 Kuril Islands, Mw = 6.8 earthquakes. Located 260 km from the Geiyo epicenter, station TKO recorded one possible triggered event within 65 km during the hour following the mainshock. However, the TKO data contain many anomalous spikes, and we are not confident the record is clear enough to differentiate small local events from noise. An ambiguous, two-day, regional seismicity increase followed the Volcano Islands event. We interpret the swarm associated with the signal as coincidental because no similar swarms occurred at the same location following Tottori or Geiyo, both of which produced an order of magnitude larger shaking. Both waveforms and cataloged events indicate no triggering occurred following the Honshu, Tottori and Kuril Islands mainshocks. We do not interpret the one indefinite local event recorded by TKO as evidence for mid-range dynamic triggering, implying that the 2.5 cm/s shaking at TKO did not exceed the local triggering threshold.
Additionally, the lack of triggering following Honshu, Tottori, and Kuril Islands suggests that the 1, 2.5 and 2.6 cm/s shaking at distances of 182, 238, and 267 km, respectively, creates lower bounds for the dynamic triggering thresholds at the respective locations. This assumes the bound is frequency independent. In none of the cases were thresholds exceeded over a large enough region or by a large enough amplitude to produce a statistically significant increase in the cataloged rate of seismicity during the period from 1997 to 2002. All previously documented examples of triggering have occurred following shallow earthquakes with Mw > 7. With the exception of Volcano Islands, all of the events of this study have Mw < 7 and have no triggering associated with them. This suggests two possibilities: either events with Mw > 7 are required to produce sufficient shaking to trigger seismicity, or Japan is less susceptible to triggering than the western US or Greece. We assume that the depth of the Volcano Islands earthquake prevented any substantial surface shaking. We conclude that more data from shallow crustal events with Mw > 7 are required to determine whether or not Japan is susceptible to regional triggering.
Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.
Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V
2017-08-01
Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports, but there is a paucity of evidence verifying the head impact events these sensors record. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664 ± 619 rad/s).
The current data indicate that existing wearable sensor technologies may substantially overestimate head impact events. Further, while the wearable sensors always estimated a head impact location, only 48% of the impacts were a result of direct contact to the head as characterized on video. Using wearable sensors and video to verify head impacts may decrease the inclusion of false-positive impacts during game activity in the analysis.
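The study's four verification criteria form a conjunctive filter over sensor records. A minimal sketch follows; the dictionary field names are invented for illustration and do not come from the GFT or X-Patch data formats.

```python
def is_verified(impact: dict) -> bool:
    """Apply the study's four video-verification criteria to one
    sensor-recorded impact. Field names are illustrative."""
    return (
        impact["linear_acceleration_g"] >= 20.0  # (1) acceleration >= 20 g
        and impact["player_identified"]          # (2) player identified on the field
        and impact["in_camera_view"]             # (3) player in camera view
        and impact["mechanism_clear"]            # (4) impact mechanism identifiable
    )

impacts = [
    {"linear_acceleration_g": 46.0, "player_identified": True,
     "in_camera_view": True, "mechanism_clear": True},
    {"linear_acceleration_g": 25.0, "player_identified": True,
     "in_camera_view": False, "mechanism_clear": False},
]
verified = [i for i in impacts if is_verified(i)]  # keeps only the first record
```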
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.
Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images, which it compresses into an MPEG-2 video using the open source "avconv" utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
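The detection step described above, flagging frames whose compression statistics deviate sharply from the background, can be sketched as a simple outlier test. This is not the authors' software: the per-frame statistic is assumed to have already been extracted from the encoder log (whose exact format varies between avconv versions), and the z-score rule is one plausible choice among many.

```python
import statistics

def detect_events(tex_bits, z_thresh=3.0):
    """Flag frame indices whose per-frame texture bit count deviates
    from the series mean by more than z_thresh standard deviations --
    a crude proxy for a sudden change in the image sequence."""
    mean = statistics.mean(tex_bits)
    sd = statistics.pstdev(tex_bits) or 1.0  # guard against a constant series
    return [i for i, t in enumerate(tex_bits)
            if abs(t - mean) / sd > z_thresh]

# Usage sketch: a steady background with one abrupt change at frame 10
bits = [1000] * 10 + [9000] + [1000] * 10
events = detect_events(bits)
```

Combining several of the 15 per-frame statistics, as the abstract suggests, would replace the single series here with a weighted score over multiple series.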
Effective Sharing of Health Records, Maintaining Privacy: A Practical Schema
Neame, Roderick
2013-01-01
A principal goal of computerisation of medical records is to join up care services for patients, so that their records can follow them wherever they go and thereby reduce delays, duplications, risks and errors, and costs. Healthcare records are increasingly being stored electronically, which has created the necessary conditions for them to be readily sharable. However simply driving the implementation of electronic medical records is not sufficient, as recent developments have demonstrated (1): there remain significant obstacles. The three main obstacles relate to (a) record accessibility (knowing where event records are and being able to access them), (b) maintaining privacy (ensuring that only those authorised by the patient can access and extract meaning from the records) and (c) assuring the functionality of the shared information (ensuring that the records can be shared non-proprietorially across platforms without loss of meaning, and that their authenticity and trustworthiness are demonstrable). These constitute a set of issues that need new thinking, since existing systems are struggling to deliver them. The solution to this puzzle lies in three main parts. Clearly there is only one environment suited to such widespread sharing, which is the World Wide Web, so this is the communications basis. Part one requires that a sharable synoptic record is created for each care event and stored in standard web-format and in readily accessible locations, on ‘the web’ or in ‘the cloud’. To maintain privacy these publicly-accessible records must be suitably protected either stripped of identifiers (names, addresses, dates, places etc.) and/or encrypted: either way the record must be tagged with a tag that means nothing to anyone, but serves to identify and authenticate a specific record when retrieved. 
For ease of retrieval patients must hold an index of care events, records and web locations (plus any associated information for each such as encryption keys, context etc.). For added security, as well as for trustworthiness, a method of verifying authenticity, integrity and authorship is required, which can be provided using a public key infrastructure (PKI) for cryptography (2). The second part of the solution is to give control over record access and sharing to the patient (or their identified representative), enabling them to authorise access by providing the index and access keys to their records. This can be done using a token (e.g. smart card) or a secure online index which holds these details: this serves to relieve the formal record keeper of responsibility for external access control and privacy (internal access control and privacy can remain an institutional responsibility). The third part of the solution is to process the content of the stored records such that there is a ‘plain English’ copy, as well as an electronic copy which is coded and marked up using XML tags for each data element to signify ‘type’ (e.g. administrative, financial, operational, clinical etc.) and sub-types (e.g. diagnosis, medication, procedure, investigation result etc.). This ensures that the recipient can always read the data using a basic browser, but can readily manipulate and re-arrange the data for display and storage if they have a more sophisticated installation. PMID:23923101
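The schema's "part one", stripping identifiers before public storage and tagging each record with a token that means nothing to a third party, can be sketched as follows. The field names and the keyed-hash tag construction are assumptions for illustration; the paper itself leaves the tagging mechanism open.

```python
import hashlib
import secrets

# Direct identifiers to strip before public storage (illustrative list)
IDENTIFIER_FIELDS = {"name", "address", "date_of_birth", "place"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed,
    as required before placing it in publicly accessible storage."""
    return {k: v for k, v in record.items() if k not in IDENTIFIER_FIELDS}

def make_tag(patient_secret: bytes, event_id: str) -> str:
    """Derive an opaque tag: meaningless to anyone else, but it lets
    the patient (who holds the secret in their index) re-identify and
    authenticate a specific record on retrieval."""
    return hashlib.sha256(patient_secret + event_id.encode()).hexdigest()

secret = secrets.token_bytes(32)       # kept in the patient's index, not published
record = {"name": "Jane Doe", "diagnosis": "hypertension"}
public_record = deidentify(record)
public_record["tag"] = make_tag(secret, "event-0001")
```

A production design would use PKI signatures for the authenticity and authorship checks the schema calls for, rather than a bare hash.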
NASA Astrophysics Data System (ADS)
Brázdil, R.; Chromá, K.; Valášek, H.; Dolák, L.
2012-03-01
Historical written records associated with tax relief at ten estates located in south-eastern Moravia (Czech Republic) are used for the study of hydrometeorological extremes and their impacts during the period 1751-1900 AD. At the time, the taxation system in Moravia allowed farmers to request tax relief if their crop yields had been negatively affected by hydrological and meteorological extremes. The documentation involved contains information about the type of extreme event and the date of its occurrence, while the impact on crops may often be derived. A total of 175 extreme events resulting in some kind of damage are documented for 1751-1900, with the highest concentration between 1811 and 1860 (74.9% of all events analysed). The nature of events leading to damage (of a possible 272 types) include hailstorm (25.7%), torrential rain (21.7%), flood (21.0%), followed by thunderstorm, flash flood, late frost and windstorm. The four most outstanding events, affecting the highest number of settlements, were thunderstorms with hailstorms (25 June 1825, 20 May 1847 and 29 June 1890) and flooding of the River Morava (mid-June 1847). Hydrometeorological extremes in the 1816-1855 period are compared with those occurring during the recent 1961-2000 period. The results obtained are inevitably influenced by uncertainties related to taxation records, such as their temporal and spatial incompleteness, the limits of the period of outside agricultural work (i.e. mainly May-August) and the purpose for which they were originally collected (primarily tax alleviation, i.e. information about hydrometeorological extremes was of secondary importance). Taxation records constitute an important source of data for historical climatology and historical hydrology and have a great potential for use in many European countries.
Second Quarter Hanford Seismic Report for Fiscal Year 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.
2009-07-31
The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded over 800 local earthquakes during the second quarter of FY 2009. Nearly all of these earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. Most of the events were considered minor (magnitude (Mc) less than 1.0), with 19 events in the 2.0-2.9 range. The estimated depths of the Wooded Island events are shallow (averaging less than 1.0 km deep), with a maximum depth estimated at 1.9 km. This places the Wooded Island events within the Columbia River Basalt Group (CRBG). The low magnitude and shallowness of the Wooded Island events have made them undetectable to most area residents. However, some Hanford employees working within a few miles of the area of highest activity, and individuals living in homes directly across the Columbia River from the swarm center, have reported feeling some movement. The Hanford SMA network was triggered numerous times by the Wooded Island swarm events.
The maximum acceleration values recorded by the SMA network were approximately 2-3 times lower than the reportable action level for Hanford facilities (2% g) and no action was required. The swarming is likely due to pressures that have built up, cracking the brittle basalt layers within the Columbia River Basalt Group (CRBG). Similar earthquake “swarms” have been recorded near this same location in 1970, 1975 and 1988. Prior to the 1970s, swarming may have occurred, but equipment was not in place to record those events. Quakes of this limited magnitude do not pose a risk to Hanford cleanup efforts or waste storage facilities. Since swarms of the past did not intensify in magnitude, seismologists do not expect these events to increase in intensity. However, PNNL will continue to monitor the activity. Outside of the Wooded Island swarm, four earthquakes were recorded. Three were classified as minor and one registered 2.3 Mc. One earthquake was located at intermediate depth (between 4 and 9 km, most likely in the pre-basalt sediments) and three at depths greater than 9 km, within the basement. Geographically, two earthquakes were located in known swarm areas and two were classified as random events.
The ADE scorecards: a tool for adverse drug event detection in electronic health records.
Chazard, Emmanuel; Băceanu, Adrian; Ferret, Laurie; Ficheur, Grégoire
2011-01-01
Although several methods exist for detecting adverse drug events (ADEs) in past hospitalizations, no tool yet exists that can display those ADEs to physicians. This article presents the ADE Scorecards, a Web tool for screening past hospitalizations extracted from electronic health records (EHRs) using a set of ADE detection rules, presently rules discovered by data mining. The tool enables physicians to (1) obtain contextualized statistics about the ADEs that occur in their medical department, (2) see the rules that are useful in their department, i.e. the rules that could have helped prevent those ADEs, and (3) review ADE cases in detail through a comprehensive interface displaying diagnoses, procedures, lab results, administered drugs and anonymized records. The article demonstrates the tool through a use case.
NASA Astrophysics Data System (ADS)
Black, D. E.; Rahman, S.; Wurtzel, J.; Thunell, R.; Mauer, B.; Tappa, E. J.
2009-12-01
The Cariaco Basin, Venezuela is well-positioned to record a detailed history of surface ocean changes along the southern margin of the Caribbean and the tropical Atlantic. Varved, high deposition rate sediments deposited under anoxic conditions and an abundance of well-preserved microfossils result in one of the few marine records capable of preserving evidence of interannual- to decadal-scale climate variability in the tropical Atlantic. Boreal winter/spring sea surface temperatures (SST) spanning the last eight centuries have previously been reconstructed using Mg/Ca measurements on the planktic foraminifer Globigerina bulloides. Here we present the complementary record using Globigerinoides ruber (pink), a summer/fall indicator. Globigerinoides ruber Mg/Ca values are generally greater than those of G. bulloides from the same sample, reflecting warmer calcification temperatures. Both species’ records display similar long-term trends, yet there are some distinctive differences. The Medieval Warm Period (MWP) and Little Ice Age (LIA) as distinctly separate climate events are more apparent in the G. ruber record than that of G. bulloides. Additionally, greater variability in the G. ruber data may indicate a stronger than expected bias from productivity during the local upwelling season. As G. bulloides and pink G. ruber are thought to be winter/spring and summer/fall SST indicators, respectively (albeit with the potential upwelling season bias), the intersample differences between the two records can potentially be interpreted as a record of seasonality. Our seasonality reconstruction shows a distinctive oscillation of 4 °C with a period of approximately 200 years. The proxy seasonality is slightly less than what has been instrumentally measured (5 to 6 °C) over the last 15 years, and does not appear related to or affected by the MWP or LIA events.
Chase, C R; Ashikaga, T; Mazuzan, J E
1994-07-01
The objective of our study was to assess the acceptability of a proposed user interface for a visually interfaced computer-assisted anesthesia record (VISI-CAARE) before application programming was begun. The user interface was defined as the user display and its user orientation methods. We designed methods to measure user performance and attitude toward two different anesthesia record procedures: (1) the traditional pen-and-paper anesthetic record procedure of our hospital, and (2) VISI-CAARE. Performance measurements included reaction speed (identifying the type and time of an event) and completion speed (describing the event). Performance also included accuracy of the recorded time of the event and accuracy of the description. User attitude was measured by (1) the physician's rating, on a scale of 0 to 9, of the potential usefulness of computers in anesthesia care; (2) willingness to use the future application in the clinical environment; and (3) user suggestions for change. These measurements were used in a randomized trial of 21 physicians, of whom data from 20 were available. After exposure to VISI-CAARE, the experimental subjects' rating of computer usefulness in anesthesia care improved significantly (4.2 +/- 1.1 to 7.6 +/- 1.5, p = 0.0001), as did the controls' (5.2 +/- 2.6 to 8 +/- 1.5, p = 0.0019). All the volunteers were willing to try the proposed prototype clinically when it was ready. In an artificial mock setting, VISI-CAARE exposure was associated with faster and more accurate reaction to events than the traditional pen-and-paper method, and with slower but more accurate description of events. VISI-CAARE 1.1 demonstrated significant improvements in both reaction speed and completion speed over VISI-CAARE 1.0, after changes were made to the user display and orientation methods. With graphic user interface prototyping environments, one can obtain preliminary user attitude and performance data even before application programming is begun. This may be helpful in revising initial display and orientation methods, while obtaining user interest and commitment before actual programming and clinical testing.
Data collection of patients with diabetes in family medicine: a study in north-eastern Italy.
Vaona, Alberto; Del Zotti, Franco; Girotto, Sandro; Marafetti, Claudio; Rigon, Giulio; Marcon, Alessandro
2017-08-16
Studies on data collection and quality of care in Italian family medicine are lacking. The aim of this study was to assess the completeness of data collection of patients with diabetes in a large sample of family physicians in the province of Verona, Veneto region, a benchmark for the Italian National Health System. We extracted the data on all the patients with diabetes from the electronic health records of 270 family physicians in 2006 and 2009. We reported the percentage of patients with data recorded for 12 indicators of performance derived from the National Institute for Clinical Excellence diabetes guidelines. Secondarily, we assessed quality of care using the Q-score (the lower the score, the greater the risk of cardiovascular events). Patients with diabetes were 18,507 in 2006 and 20,744 in 2009, and the percentage of patients registered as having diabetes was 4.9% and 5.4% of the total population, respectively (p < 0.001). Data collection improved for all the indicators between 2006 and 2009 but the performance was still low at the end of the study period: patients with no data recorded were 42% in 2006 and 32% in 2009, while patients with data recorded for ≥5 indicators were 9% in 2006 and 17% in 2009. The Q-score improved (mean ± SD, 20.7 ± 3.0 in 2006 vs 21.3 ± 3.6 in 2009, p < 0.001) but most patients were at increased risk of cardiovascular events in both years (Q-score ≤ 20). We documented an improvement in data collection and quality of care for patients with diabetes during the study period. Nonetheless, data collection was still unsatisfactory in comparison with international benchmarks in 2009. Structural interventions in the organization of family medicine, which have not been implemented since the study period, should be prioritised in Italy.
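The study's completeness measures (share of patients with no data recorded, and with data for at least 5 of the 12 guideline-derived indicators) amount to a simple per-patient count. A minimal sketch follows; the patient records are hypothetical and the indicator names are placeholders, not the actual NICE indicator set:

```python
# Share of patients with no indicators recorded, and with >= 5 of the
# 12 indicators recorded, mirroring the study's completeness measures.
def completeness_shares(patients, threshold=5):
    """Return (fraction with no indicators, fraction with >= threshold)."""
    n = len(patients)
    none = sum(1 for p in patients if len(p) == 0)
    enough = sum(1 for p in patients if len(p) >= threshold)
    return none / n, enough / n

# Hypothetical records: each dict maps recorded indicators to values.
patients = [
    {},                                            # nothing recorded
    {"hba1c": 7.1, "bmi": 29.0},                   # 2 indicators
    {"hba1c": 6.8, "bmi": 31.2, "bp": "140/85",
     "cholesterol": 5.2, "creatinine": 80.0},      # 5 indicators
]
no_data, five_plus = completeness_shares(patients)
```

With real extractions, the same counting over all patients yields the reported percentages (e.g. 32% with no data and 17% with ≥5 indicators in 2009).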
Misclassification of OSA severity with automated scoring of home sleep recordings.
Aurora, R Nisha; Swartz, Rachel; Punjabi, Naresh M
2015-03-01
The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. Automated scoring consistently underscored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov.
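The Bland-Altman comparison used above reduces to the mean manual-minus-automated AHI difference and its limits of agreement. A minimal sketch, with illustrative AHI values rather than study data:

```python
# Bland-Altman style comparison of manual vs. automated AHI scoring.
# A positive bias means automated scoring under-counts events.
import math

def bland_altman(manual, automated):
    """Mean difference (bias) and 95% limits of agreement."""
    diffs = [m - a for m, a in zip(manual, automated)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative per-subject AHI values (events/h), not data from the study.
manual_ahi = [12.0, 30.5, 7.2, 18.9, 25.1]
auto_ahi = [8.1, 24.3, 5.0, 14.2, 20.0]
bias, low, high = bland_altman(manual_ahi, auto_ahi)
```

In the study the analogous bias was 4.6-8.4 events/h depending on device and desaturation threshold, always in the direction of automated under-scoring.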
Heinrich 0 at the Younger Dryas Termination Offshore Newfoundland
NASA Astrophysics Data System (ADS)
Pearce, C.; Andrews, J. T.; Jennings, A. E.; Bouloubassi, I.; Seidenkrantz, M. S.; Kuijpers, A.; Hillaire-Marcel, C.
2014-12-01
The last deglaciation was marked by intervals of rapid climatic fluctuations accompanied by glacial advances and retreats along the eastern edge of the Laurentide ice sheet. The most severe of these events, the Younger Dryas cold reversal, was accompanied by the major detrital carbonate (DC) event generally referred to as "Heinrich event 0" (H0) in the westernmost and southern Labrador Sea. A detrital carbonate layer was observed in a high resolution marine sediment record from southern Newfoundland and the onset of the event was dated to 11,600 ± 70 cal. yrs. BP (local ΔR = 140 yrs.). A variety of different proxies was applied to investigate the transport mechanisms for deposition of the layer and provenance of the carbonates. Elevated concentrations of dolomite and calcite based on quantitative X-ray diffraction measurements, combined with the presence of several mature petrogenic biomarkers limit the source of the H0 detrital input to Palaeozoic carbonate outcrops in north-eastern Canada. The event is attributed to the rapid ice retreat from the Hudson Strait directly following the warming at the onset of the Holocene. Based on additional proxy data published earlier from the same record, the event succeeded the early Holocene resumption of the Atlantic Meridional Overturning Circulation (AMOC), indicating that the Hudson Strait meltwater event had probably no significant impact on the AMOC. The detrital carbonate layer can be found in other marine sediment records along the Labrador Current pathway, from Hudson Strait to the Grand Banks and the southern Newfoundland slope. By using the onset of deposition of the carbonates as a time-synchronous marker, the DC layer has great potential for improving marine chronologies of late glacial age in the region and evaluating spatial variations in ΔR values.
NASA Astrophysics Data System (ADS)
Hubert-Ferrari, Aurélia; El-Ouahabi, Meriam; Garcia-Moreno, David; Avsar, Ulas; Altinok, Sevgi; Schmidt, Sabine; Cagatay, Namik
2016-04-01
Deltas contain a sedimentary record primarily indicative of water level changes, but one that is particularly sensitive to earthquake shaking, which generally results in soft-sediment-deformation structures. The Kürk Delta, adjacent to a major strike-slip fault, displays this type of deformation (Hempton and Dewey, 1983) as well as other types of earthquake fingerprints that are specifically investigated here. This lacustrine delta stands at the south-western extremity of Lake Hazar and is bounded by the East Anatolian Fault (EAF), which has generated earthquakes of magnitude 7 in eastern Turkey. Water level changes and earthquake shaking affecting the Kürk Delta have been re-evaluated by combining geophysical data (seismic-reflection profiles and side-scan sonar), remote sensing images, historical data, onland outcrops and offshore coring. The history of water level changes provides a temporal framework for the sedimentological record. In addition to the soft-sediment deformation previously documented, the onland outcrops reveal a record of deformation (faults and clastic dykes) linked to large earthquake-induced liquefaction. The recurrent liquefaction structures can be used to obtain a paleoseismological record. Five event horizons were identified that could be linked to historical earthquakes occurring in the last 1000 years along the EAF. Sedimentary cores sampling the most recent subaqueous sedimentation revealed another type of earthquake fingerprint. Based on radionuclide dating (137Cs and 210Pb), two major sedimentary events were attributed to the 1874-1875 earthquake sequence along the EAF. Their sedimentological characteristics were inferred from X-ray imagery, XRD, LOI, grain-size distribution and geophysical measurements. The events are interpreted as hyperpycnal deposits linked to post-seismic reworking of sediment from earthquake-triggered landslides. A time constraint on this sediment remobilization process could be achieved because the two studied sedimentary events are separated by less than one year.
NASA Astrophysics Data System (ADS)
Ettinger, N. P.; Martindale, R. C.; Kosir, A.; Thibodeau, A. M.
2016-12-01
Oceanic anoxic events (OAEs) have been shown to have an intimate influence on source rock deposition, marine extinctions, and the reorganization of carbonate factories throughout geologic time. Today, the possibility of environmental deterioration such as warming, acidification, and decreased oxygenation in modern oceans has increased the importance of ancient analogues. Therefore, studies of ancient rapid environmental change, such as the Toarcian Oceanic Anoxic Event, can inform our understanding of how marine ecosystems will respond to similar stresses in the future. The Toarcian OAE coincides with a marine mass extinction and the deposition of deep-water black shales; the putative cause of the OAE is the emplacement of the Karoo-Ferrar-Chon Aike Large Igneous Province. Although black shales are the hallmark of oceanic anoxic events, the contemporaneous shallow marine response to anoxia and other stresses is subtler and poorly documented by comparison. We will present a record of Pliensbachian-Toarcian aged shallow-water carbonates from the Dinaric Carbonate Platform in Slovenia. This platform provides a key record of the Toarcian OAE, as it is one of the few platforms from the Tethys Ocean that experienced nearly continuous sedimentation throughout the Pliensbachian and Toarcian as a result of tectonic quiescence. Sedimentological, geochemical, and paleontological data from two sections of the Trnovski Gozd karst plateau are used to assess the timing of volcanism and the response of biotic and abiotic carbonates to environmental changes associated with the OAE. Benthic forams, dasycladacean algae, and oncolitic packstones dominate diverse skeletal assemblages in the Pliensbachian record. The stage boundary coincides with anomalies in redox-sensitive elements, a hiatus in carbonate production represented by marine firmgrounds, and an anomalous increase in mercury content. 
The early Toarcian record is dominated by crinoidal-oolitic packstones and grainstones, with rare low-diversity skeletal assemblages. Sedimentological changes in oolite-dominated facies in the early Toarcian, along with geochemical and paleontological data, represent an observable response in the shallow-water carbonates of the Dinaric Carbonate Platform to Karoo-Ferrar-Chon Aike volcanism.
Late accretion to the Moon recorded in zircon (U-Th)/He thermochronometry
NASA Astrophysics Data System (ADS)
Kelly, Nigel M.; Flowers, Rebecca M.; Metcalf, James R.; Mojzsis, Stephen J.
2018-01-01
We conducted zircon (U-Th)/He (ZHe) analysis of lunar impact-melt breccia 14311 with the aim of leveraging radiation damage accumulated in zircon over extended intervals to detect low-temperature or short-lived impact events that have previously eluded traditional isotopic dating techniques. Our ZHe data record a coherent date vs. effective uranium concentration (eU) trend characterized by >3500 Ma dates from low-eU (≤75 ppm) zircon grains, and ca. 110 Ma dates for high-eU (≥100 ppm) grains. A progression between these date populations is apparent for intermediate-eU (75-100 ppm) grains. Thermal history modeling constrains permissible temperatures and cooling rates during and following impacts. Modeling shows that the data are most simply explained by impact events at ca. 3950 Ma and ca. 110 Ma, and limits the allowable temperatures of heating events between 3950 and 110 Ma. Modeling of solar cycling thermal effects at the lunar surface precludes this as the explanation for the ca. 110 Ma ZHe dates. We propose a sample history characterized by zircon resetting during the ca. 3950 Ma Imbrium impact event, with subsequent heating during an impact at ca. 110 Ma that ejected the sample to the vicinity of its collection site. Our data show that zircon has the potential to retain 4He over immense timescales (≥3950 Myr), thus providing a valuable new thermochronometer for probing the impact histories of lunar samples and of martian or asteroidal meteorites.
Observation of Celestial Phenomena in Ancient China
NASA Astrophysics Data System (ADS)
Sun, Xiaochun
Because of the need for calendar-making and portent astrology, the Chinese were diligent and meticulous observers of celestial phenomena. China has maintained the longest continuous historical records of celestial phenomena in the world. Extraordinary or abnormal celestial events were particularly noted because of their astrological significance. The historical records cover various types of celestial phenomena, which include solar and lunar eclipses, sunspots, "guest stars" (novae or supernovae as we understand today), comets and meteors, and all kinds of planetary phenomena. These records provide valuable historical data for astronomical studies today.
The ISC Seismic Event Bibliography
NASA Astrophysics Data System (ADS)
Di Giacomo, Domenico; Storchak, Dmitry
2015-04-01
The International Seismological Centre (ISC) is a not-for-profit organization that has operated in the UK for the last 50 years, producing the ISC Bulletin - the definitive worldwide summary of seismic events, both natural and anthropogenic - starting from the beginning of the 20th century. Researchers often need to gather information related to specific seismic events. To facilitate this task, in 2012 we set up a new database linking earthquakes and other seismic events in the ISC Bulletin to bibliographic records of scientific articles (mostly in peer-reviewed journals) that describe those events. This association allows users of the ISC Event Bibliography (www.isc.ac.uk/event_bibliography/index.php) to search for publications via a map-based web interface and, optionally, to select scientific publications related to either specific events or events in an area of interest. Some of the greatest earthquakes were described in several hundred articles published over a period of a few years. The journals included in our database are not limited to seismology but bring together a variety of fields in the geosciences (e.g., engineering seismology, geodesy and remote sensing, tectonophysics, monitoring research, tsunami, geology, geochemistry, hydrogeology, atmospheric sciences, etc.), making this service useful in multidisciplinary studies. Papers dealing with large data sets (e.g., papers describing a seismic catalogue) are usually not included. Currently the ISC Event Bibliography includes over 17,000 individual publications from about 500 titles, related to over 14,000 events that occurred in the last 100+ years. The bibliographic records in the Event Bibliography start in the 1950s, and the database is updated as new publications become available.
Northern Cascadia Subduction Zone Earthquake Records from Onshore and Offshore Core Data
NASA Astrophysics Data System (ADS)
Hausmann, R. B.; Goldfinger, C.; Black, B.; Romsos, C. G.; Galer, S.; Collins, T.
2016-12-01
We are investigating the paleoseismic record at Bull Run Lake, at the latitude of Portland, Oregon, on the central Cascadia margin. Bull Run is a landslide-dammed lake in a cirque basin on the western flanks of Mt. Hood, 65 km east of Portland, and is the City of Portland's primary water supply. We collected full-coverage high-resolution multibeam and backscatter data, high-resolution CHIRP sub-bottom profiles, and seven sediment cores which contain a correlative turbidite sequence of post-Mazama beds. The continuity of the turbidite record shows little or no relationship to the minor stream inlets, suggesting the disturbance beds are not likely to be storm related. CT and physical property data were used to separate major visible beds and background sedimentation, which also contain thin laminae. The XRF element Compton scattering may show grading due to mineralogical variation and a change in wave profile, commonly found at bed boundaries. We have identified 27 post-Mazama event beds and 5 ashes in the lake, and constructed an OxCal age model anchored by radiocarbon ages, the Mazama ash, and the twin Timberline ash beds. The radiocarbon ages, age model results, and electron microprobe (EMP) data clearly identify the Mazama ash at the base of our cores. Two closely spaced ash beds in our cores likely correlate to the Timberline eruptive period at 1.5 ka. The number, timing and sequence of the event beds, the physical property log correlation, and key bed characteristics closely match offshore turbidite sequences off northern Oregon. For example, key regional bed T11, observed as a thick two-pulse bed in all offshore cores, also anchors the Bull Run sequence. One difference is that the twin Timberline ash occupies the stratigraphic position of regional offshore paleoseismic bed T4, which is also a two-pulse event at this latitude. The cores also contain many faint laminae that may record storms; however, the identification of small beds is complicated by the low sedimentation rate and low resolution of the Bull Run cores. The watershed and lake may also contain evidence of crustal faulting, though the event sequence appears to be primarily that of the Cascadia subduction zone earthquake sequence. See also Goldfinger et al. for investigation of slope stability and ground motions at Bull Run and other Cascadia lakes.
The use of historical information for regional frequency analysis of extreme skew surge
NASA Astrophysics Data System (ADS)
Frau, Roberto; Andreewsky, Marc; Bernardara, Pietro
2018-03-01
The design of effective coastal protections requires an adequate estimation of the annual occurrence probability of rare events associated with return periods of up to 10³ years. Regional frequency analysis (RFA) has been proven to be an applicable way to estimate extreme events by sorting regional data into large, spatially distributed datasets. Nowadays, historical data are available to provide new insight into past events. The use of historical information would increase the precision and reliability of regional extreme quantile estimation. However, historical data come from significant extreme events that were not recorded by tide gauges. They are typically isolated observations, unlike the continuous data from systematic tide-gauge measurements. This makes it difficult to define the duration of the observation period, yet that duration is crucial for estimating the frequency of extreme occurrences. For this reason, we introduce here the concept of credible duration. The proposed RFA method (hereinafter referenced as FAB, from the names of the authors) allows the use of historical data together with systematic data as a result of the use of the credible duration concept.
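The role the observation duration plays in empirical frequency estimation can be illustrated with Weibull plotting positions. The sketch below simply pools systematic and historical maxima under an assumed effective record length; it illustrates why the duration assigned to historical data matters, and is not the FAB method itself. All values are invented:

```python
# Empirical return periods from pooled systematic + historical skew surges,
# using Weibull plotting positions T = (N + 1) / rank, where N is the
# effective record length in years. Assigning the historical events a
# "credible duration" of coverage is the idea being illustrated.
def return_periods(surges_m, record_years):
    """Return (surge, empirical return period in years), largest first."""
    ranked = sorted(surges_m, reverse=True)
    return [(s, (record_years + 1) / rank)
            for rank, s in enumerate(ranked, start=1)]

systematic = [0.8, 1.1, 0.9, 1.3]   # tide-gauge annual maxima, 4 years
historical = [1.6]                  # one documented extreme surge
credible_duration = 150             # assumed years covered by the archives
pooled = return_periods(systematic + historical,
                        record_years=4 + credible_duration)
```

Note how the assumed 150-year coverage, not the count of five observations, controls the return period assigned to the largest surge; a poor choice of duration would bias the quantile estimates accordingly.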
Mach-Zehnder interferometer-based recording system for WACO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woerner, R.
1988-06-01
EG&G Energy Measurements, Inc., Los Alamos Operations (LAO) designed and built a Mach-Zehnder-interferometer-based recording system to record low-bandwidth pulses. This work was undertaken at the request of the Los Alamos National Laboratory P-14 Fast Transient Plasma Measurement group. The system was fielded on WACO, and its performance was compared with that of a conventional recording system fielded on the same event. The results showed that for low-bandwidth applications like the WACO experiment, the M-Z-based system provides the same data quality and dynamic range as the conventional oscilloscope system, but is far less complex and uses fewer recorders. 4 figs.
Mevik, Kjersti; Griffin, Frances A; Hansen, Tonje E; Deilkås, Ellen T; Vonen, Barthold
2016-04-25
To investigate the impact of increasing the sample of records reviewed bi-weekly with the Global Trigger Tool method to identify adverse events in hospitalised patients. Retrospective observational study. A Norwegian 524-bed general hospital trust. 1920 medical records selected from 1 January to 31 December 2010. Rate, type and severity of adverse events identified in two different sample sizes of records, selected bi-weekly as 10 and 70 records. In the large sample, 1.45 (95% CI 1.07 to 1.97) times more adverse events per 1000 patient-days were identified (39.3 adverse events/1000 patient-days) than in the small sample (27.2 adverse events/1000 patient-days). Hospital-acquired infections were the most common category of adverse events in both samples, and the distributions of the other categories of adverse events did not differ significantly between the samples. The distribution of severity levels of adverse events also did not differ between the samples. The findings suggest that while the distribution of categories and severity does not depend on the sample size, the rate of adverse events does. Further studies are needed to determine whether the optimal sample size should be adjusted to hospital size in order to detect a more accurate rate of adverse events.
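The headline rate ratio can be checked directly from the two reported rates. The sketch below uses only the per-1000-patient-day figures given in the abstract; the paper's confidence interval is derived from the underlying event counts, which are not reproduced here, so the simple ratio (1.44) differs slightly from the reported point estimate of 1.45.

```python
# Rough check of the rate comparison, using only the rates quoted
# in the abstract (adverse events per 1000 patient-days).
large_sample_rate = 39.3  # 70 records reviewed bi-weekly
small_sample_rate = 27.2  # 10 records reviewed bi-weekly

ratio = large_sample_rate / small_sample_rate
print(round(ratio, 2))  # 1.44 -- close to the reported 1.45
```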
Global Tsunami Database: Adding Geologic Deposits, Proxies, and Tools
NASA Astrophysics Data System (ADS)
Brocko, V. R.; Varner, J.
2007-12-01
A result of collaboration between NOAA's National Geophysical Data Center (NGDC) and the Cooperative Institute for Research in the Environmental Sciences (CIRES), the Global Tsunami Database includes instrumental records, human observations, and now, information inferred from the geologic record. Deep Ocean Assessment and Reporting of Tsunamis (DART) data, historical reports, and information gleaned from published tsunami deposit research build a multi-faceted view of tsunami hazards and their history around the world. Tsunami history provides clues to what might happen in the future, including frequency of occurrence and maximum wave heights. However, instrumental and written records commonly span too little time to reveal the full range of a region's tsunami hazard. The sedimentary deposits of tsunamis, identified with the aid of modern analogs, increasingly complement instrumental and human observations. By adding the component of tsunamis inferred from the geologic record, the Global Tsunami Database extends the record of tsunamis backward in time. Deposit locations, their estimated ages, and descriptions of the deposits themselves fill in the tsunami record. Tsunamis inferred from proxies, such as evidence for coseismic subsidence, are included to estimate recurrence intervals, but are flagged to highlight the absence of a physical deposit. Authors may submit their own descriptions and upload digital versions of publications. Users may sort by any populated field, including event, location, region, age of deposit, author, publication type (e.g., restricting results to peer-reviewed publications), grain size, composition, and presence/absence of plant material. Users may find tsunami deposit references for a given location, event or author; search for particular properties of tsunami deposits; and even identify potential collaborators. Users may also download public-domain documents.
Data and information may be viewed using tools designed to extract and display data from the Oracle database (selection forms, Web Map Services, and Web Feature Services). In addition, the historic tsunami archive (along with related earthquakes and volcanic eruptions) is available in KML (Keyhole Markup Language) format for use with Google Earth and similar geo-viewers.
Golder, Su; Norman, Gill; Loke, Yoon K
2015-01-01
Aim: The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media.
Methods: A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed.
Results: From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites, with discussion forums (71%) being the most popular. The overall prevalence of adverse event reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for 'symptom'-related and 'mild' adverse events. The adverse events that were under-represented in social media were laboratory-based and serious adverse events.
Conclusions: Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. PMID:26271492