Sample records for accurate event locations

  1. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinates and replaces them with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than existing cloaking algorithms, do not require all users to report their locations all the time, and can generate smaller ASRs. PMID:24605060
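
    To make the grid-ID idea concrete, here is a minimal sketch (not the paper's algorithm) assuming a square grid, users reporting only their cell IDs, and a simple expanding-block search until at least K users are covered; all names and the expansion rule are illustrative.

    ```python
    # Hypothetical grid-ID based K-anonymity cloaking sketch: the anonymizer sees only cell IDs,
    # never raw coordinates, and grows a block of cells around the querying user's cell until
    # it contains at least K reporting users.
    from collections import Counter

    def cloak(user_cell, reported_cells, k, grid_size):
        """Return the list of cell IDs forming the anonymous spatial region (ASR)."""
        counts = Counter(reported_cells)
        radius = 0
        while True:
            r0, r1 = max(0, user_cell[0] - radius), min(grid_size - 1, user_cell[0] + radius)
            c0, c1 = max(0, user_cell[1] - radius), min(grid_size - 1, user_cell[1] + radius)
            block = [(r, c) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]
            if sum(counts[cell] for cell in block) >= k:
                return block          # ASR built from cell IDs only
            if (r0, c0) == (0, 0) and (r1, c1) == (grid_size - 1, grid_size - 1):
                return block          # whole grid reached; fewer than k users available
            radius += 1

    # Example: 3-anonymity on an 8x8 grid
    asr = cloak((4, 4), [(4, 4), (4, 5), (3, 4), (7, 7)], k=3, grid_size=8)
    ```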

  2. Location of long-period events below Kilauea Volcano using seismic amplitudes and accurate relative relocation

    USGS Publications Warehouse

    Battaglia, J.; Got, J.-L.; Okubo, P.

    2003-01-01

    We present methods for improving the location of long-period (LP) events, deep and shallow, recorded below Kilauea Volcano by the permanent seismic network. LP events are of particular interest for understanding eruptive processes, as their source mechanism is assumed to directly involve fluid transport. However, it is usually difficult or impossible to locate their source using traditional arrival time methods because of emergent wave arrivals. At Kilauea, similar LP waveform signatures suggest the existence of LP multiplets. The waveform similarity suggests spatially close sources, while catalog solutions using arrival time estimates are widely scattered beneath Kilauea's summit caldera. In order to improve estimates of absolute LP location, we use the distribution of seismic amplitudes corrected for station site effects. The decay of amplitude as a function of hypocentral distance is used to infer LP location. In a second stage, we use the similarity of the events to calculate their relative positions. The analysis of the entire LP seismicity recorded between January 1997 and December 1999 suggests that a very large part of the LP event population, both deep and shallow, is generated by a small number of compact sources. Deep events are systematically composed of a weak high-frequency onset followed by a low-frequency wave train. Aligning the low-frequency wave trains does not align the onsets, indicating that the two parts of the signal are dissociated. This observation favors an interpretation in terms of triggering and resonance of a magmatic conduit. Instead of defining fault planes, the precise relocation of similar LP events, based on the alignment of the high-energy low-frequency wave trains, defines volumes of limited size. Copyright 2003 by the American Geophysical Union.
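
    As an illustration of locating a source from site-corrected amplitudes, the following sketch assumes a simple body-wave decay A(r) = A0·exp(-αr)/r and a grid search over candidate hypocentres; the decay law, coefficient, and function names are assumptions, not the authors' implementation.

    ```python
    # Illustrative amplitude-decay location: grid-search the hypocentre that best fits
    # site-corrected amplitudes in a least-squares sense, after removing the unknown
    # source amplitude A0 by demeaning the log-residuals.
    import numpy as np

    def amplitude_location(stations, amplitudes, grid, alpha=0.005):
        """stations: (N,3) station coords (km); amplitudes: site-corrected amplitudes (>0);
        grid: (M,3) candidate hypocentres (km); alpha: assumed attenuation coefficient (1/km)."""
        best, best_misfit = None, np.inf
        log_a = np.log(amplitudes)
        for x in grid:
            r = np.linalg.norm(stations - x, axis=1)     # hypocentral distances (assumed > 0)
            pred = -alpha * r - np.log(r)                 # log of exp(-alpha*r)/r, up to log(A0)
            resid = log_a - pred
            resid -= resid.mean()                         # remove the unknown source term
            misfit = np.sum(resid ** 2)
            if misfit < best_misfit:
                best, best_misfit = x, misfit
        return best, best_misfit
    ```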

  3. Accurate relative location estimates for the North Korean nuclear tests using empirical slowness corrections

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna, T.; Mykkeltveit, S.

    2017-01-01

    velocity gradients reduce the residuals, the relative location uncertainties and the sensitivity to the combination of stations used. The traveltime gradients appear to be overestimated for the regional phases, and teleseismic relative location estimates are likely to be more accurate despite an apparent lower precision. Calibrations for regional phases are essential given that smaller magnitude events are likely not to be recorded teleseismically. We discuss the implications for the absolute event locations. Placing the 2006 event under a local maximum of overburden at 41.293°N, 129.105°E would imply a location of 41.299°N, 129.075°E for the January 2016 event, providing almost optimal overburden for the later four events.

  4. Moment tensor and location of seismic events in the 2017 DPRK test

    NASA Astrophysics Data System (ADS)

    Wei, S.; Shi, Q.; Chen, Q. F.; Wang, T.

    2017-12-01

    The main seismic event in the 2017 DPRK test was followed by a secondary event about eight minutes later. We conducted waveform analysis on the regional broadband waveform data to better constrain the moment tensor and location of these two events, to further understand their relations. First, we applied the generalized Cut-And-Paste (gCAP) method to the regional data to invert for the full moment tensor solutions of the two events. Our long-period (0.02-0.08 Hz for Pnl, 0.02-0.055 Hz for surface waves) inversions show that the main event was composed of a large positive ISO component (~90% of the total moment) and has a moment magnitude of 5.4. In contrast, the second event shows a large negative ISO component (~50% of the total moment) with a moment magnitude of 4.5. Although there are trade-offs between the CLVD and the ISO component for the second event, chiefly caused by the coda waves from the first event, the result is more robust if we force a small CLVD component in the inversion. We also relocated the epicenter of the second event using P-wave first arrival picks, relative to the location of the first event, which has been accurately determined from high-resolution geodetic data. The calibration from the first event allows us to precisely locate the second event, which shows an almost identical location to the first event. After a polarity correction, their high-frequency (~0.25-0.9 Hz) regional surface waves also display high similarity, supporting the similar location but opposite ISO polarity of the two events. Our results suggest that the second event was likely caused by collapse following the main event, in agreement with the surface displacement derived from geodetic observations and modeling results.
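
    For readers unfamiliar with ISO percentages, the short sketch below shows one common way to split a moment tensor into isotropic and deviatoric parts and report a signed ISO fraction; this is a generic decomposition convention, not the gCAP code used in the study.

    ```python
    # Minimal moment-tensor decomposition sketch (one common convention):
    # isotropic part = trace/3, deviatoric part = remainder; the signed ISO fraction
    # compares the isotropic moment with the largest deviatoric eigenvalue magnitude.
    import numpy as np

    def iso_fraction(M):
        """M: 3x3 symmetric moment tensor. Returns the signed ISO fraction in percent."""
        m_iso = np.trace(M) / 3.0                     # isotropic (volume-change) part
        M_dev = M - m_iso * np.eye(3)                 # deviatoric remainder
        m_dev = np.max(np.abs(np.linalg.eigvalsh(M_dev)))
        total = abs(m_iso) + m_dev
        return 100.0 * m_iso / total if total > 0 else 0.0

    # A dominantly explosive source gives a large positive fraction; a collapse gives a negative one.
    M_explosion = np.diag([1.1, 0.95, 0.95])
    print(iso_fraction(M_explosion))   # roughly +91%
    ```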

  5. Fine-Scale Event Location and Error Analysis in NET-VISA

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.

    2016-12-01

    NET-VISA is a generative probabilistic model for the occurrence of seismic, hydro, and atmospheric events, and the propagation of energy from these events through various mediums and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and mis-detections when forming events, and this allows it to make more accurate event hypotheses. It has been continuously evaluated since 2012, and each year it has achieved roughly a 60% reduction in the number of missed events without increasing the false event rate as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analysis of recent important events.

  6. Lithospheric Models of the Middle East to Improve Seismic Source Parameter Determination/Event Location Accuracy

    DTIC Science & Technology

    2012-09-01

    State Award Nos. DE-AC52-07NA27344/24.2.3.2 and DOS_SIAA-11-AVC/NMA-1. The Middle East is a tectonically complex and seismically...active region. The ability to accurately locate earthquakes and other seismic events in this region is complicated by tectonics, the uneven...and seismic source parameters show that this activity comes from tectonic events. This work is informed by continuous or event-based regional

  7. Station Correction Uncertainty in Multiple Event Location Algorithms and the Effect on Error Ellipses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Jason P.; Carlson, Deborah K.; Ortiz, Anne

    Accurate location of seismic events is crucial for nuclear explosion monitoring. There are several sources of error in seismic location that must be taken into account to obtain high confidence results. Most location techniques account for uncertainties in the phase arrival times (measurement error) and the bias of the velocity model (model error), but they do not account for the uncertainty of the velocity model bias. By determining and incorporating this uncertainty in the location algorithm we seek to improve the accuracy of the calculated locations and uncertainty ellipses. In order to correct for deficiencies in the velocity model, it is necessary to apply station-specific corrections to the predicted arrival times. Both master event and multiple event location techniques assume that the station corrections are known perfectly, when in reality there is an uncertainty associated with these corrections. For multiple event location algorithms that calculate station corrections as part of the inversion, it is possible to determine the variance of the corrections. The variance can then be used to weight the arrivals associated with each station, thereby giving more influence to stations with consistent corrections. We have modified an existing multiple event location program (based on PMEL; Pavlis and Booker, 1983). We are exploring weighting arrivals with the inverse of the station correction standard deviation as well as using the conditional probability of the calculated station corrections. This is in addition to the weighting already given to the measurement and modeling error terms. We relocate a group of mining explosions that occurred at Black Thunder, Wyoming, and compare the results to those generated without accounting for station correction uncertainty.
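
    A minimal sketch of the weighting idea follows, assuming Gaussian, independent error terms: each arrival weight combines pick, model, and station-correction variances before a standard weighted least-squares relocation step. The function names and the Gauss-Newton step are illustrative; PMEL itself is not reproduced.

    ```python
    # Illustration only: arrivals are down-weighted by the combined variance of measurement,
    # model, and station-correction uncertainties, then used in one weighted least-squares step.
    import numpy as np

    def combined_weights(sigma_pick, sigma_model, sigma_corr):
        """Return least-squares weights 1/sigma_total for each arrival (arrays of equal length)."""
        var_total = sigma_pick ** 2 + sigma_model ** 2 + sigma_corr ** 2
        return 1.0 / np.sqrt(var_total)

    def weighted_relocation_step(G, residuals, weights):
        """One Gauss-Newton step: solve (W G) dm = W r for the hypocentre perturbation dm."""
        W = np.diag(weights)
        dm, *_ = np.linalg.lstsq(W @ G, W @ residuals, rcond=None)
        return dm
    ```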

  8. Ground truth seismic events and location capability at Degelen mountain, Kazakhstan

    NASA Astrophysics Data System (ADS)

    Trabant, Chad; Thurber, Clifford; Leith, William

    2002-07-01

    We utilized nuclear explosions from the Degelen Mountain sub-region of the Semipalatinsk Test Site (STS), Kazakhstan, to assess seismic location capability directly. Excellent ground truth information for these events was either known or was estimated from maps of the Degelen Mountain adit complex. Origin times were refined for events for which absolute origin time information was unknown using catalog arrival times, our ground truth location estimates, and a time baseline provided by fixing known origin times during a joint hypocenter determination (JHD). Precise arrival time picks were determined using a waveform cross-correlation process applied to the available digital data. These data were used in a JHD analysis. We found that very accurate locations were possible when high precision, waveform cross-correlation arrival times were combined with JHD. Relocation with our full digital data set resulted in a mean mislocation of 2 km and a mean 95% confidence ellipse (CE) area of 6.6 km2 (90% CE: 5.1 km2); however, only 5 of the 18 computed error ellipses actually covered the associated ground truth location estimate. To test a more realistic nuclear test monitoring scenario, we applied our JHD analysis to a set of seven events (one fixed) using data only from seismic stations within 40° epicentral distance. Relocation with these data resulted in a mean mislocation of 7.4 km, with four of the 95% error ellipses covering less than 570 km2 (90% CE: 438 km2), and the other two covering 1730 and 8869 km2 (90% CE: 1331 and 6822 km2). Location uncertainties calculated using JHD often underestimated the true error, but a circular region with a radius equal to the mislocation covered less than 1000 km2 for all events having more than three observations.

  9. Ground truth seismic events and location capability at Degelen mountain, Kazakhstan

    USGS Publications Warehouse

    Trabant, C.; Thurber, C.; Leith, W.

    2002-01-01

    We utilized nuclear explosions from the Degelen Mountain sub-region of the Semipalatinsk Test Site (STS), Kazakhstan, to assess seismic location capability directly. Excellent ground truth information for these events was either known or was estimated from maps of the Degelen Mountain adit complex. Origin times were refined for events for which absolute origin time information was unknown using catalog arrival times, our ground truth location estimates, and a time baseline provided by fixing known origin times during a joint hypocenter determination (JHD). Precise arrival time picks were determined using a waveform cross-correlation process applied to the available digital data. These data were used in a JHD analysis. We found that very accurate locations were possible when high precision, waveform cross-correlation arrival times were combined with JHD. Relocation with our full digital data set resulted in a mean mislocation of 2 km and a mean 95% confidence ellipse (CE) area of 6.6 km2 (90% CE: 5.1 km2); however, only 5 of the 18 computed error ellipses actually covered the associated ground truth location estimate. To test a more realistic nuclear test monitoring scenario, we applied our JHD analysis to a set of seven events (one fixed) using data only from seismic stations within 40° epicentral distance. Relocation with these data resulted in a mean mislocation of 7.4 km, with four of the 95% error ellipses covering less than 570 km2 (90% CE: 438 km2), and the other two covering 1730 and 8869 km2 (90% CE: 1331 and 6822 km2). Location uncertainties calculated using JHD often underestimated the true error, but a circular region with a radius equal to the mislocation covered less than 1000 km2 for all events having more than three observations. © 2002 Elsevier Science B.V. All rights reserved.

  10. Processing ser and estar to locate objects and events

    PubMed Central

    Dussias, Paola E.; Contemori, Carla; Román, Patricia

    2016-01-01

    In Spanish locative constructions, a different form of the copula is selected in relation to the semantic properties of the grammatical subject: sentences that locate objects require estar while those that locate events require ser (both translated in English as ‘to be’). In an ERP study, we examined whether second language (L2) speakers of Spanish are sensitive to the selectional restrictions that the different types of subjects impose on the choice of the two copulas. Twenty-four native speakers of Spanish and two groups of L2 Spanish speakers (24 beginners and 18 advanced speakers) were recruited to investigate the processing of ‘object/event + estar/ser’ permutations. Participants provided grammaticality judgments on correct (object + estar; event + ser) and incorrect (object + ser; event + estar) sentences while their brain activity was recorded. In line with previous studies (Leone-Fernández, Molinaro, Carreiras, & Barber, 2012; Sera, Gathje, & Pintado, 1999), the results of the grammaticality judgment for the native speakers showed that participants correctly accepted object + estar and event + ser constructions. In addition, while ‘object + ser’ constructions were considered grossly ungrammatical, ‘event + estar’ combinations were perceived as unacceptable to a lesser degree. For these same participants, ERP recording time-locked to the onset of the critical word ‘en’ showed a larger P600 for the ser predicates when the subject was an object than when it was an event (*La silla es en la cocina vs. La fiesta es en la cocina). This P600 effect is consistent with syntactic repair of the defining predicate when it does not fit with the adequate semantic properties of the subject. For estar predicates (La silla está en la cocina vs. *La fiesta está en la cocina), the findings showed a central-frontal negativity between 500–700 ms. Grammaticality judgment data for the L2 speakers of Spanish showed that beginners were significantly less

  11. Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion

    NASA Astrophysics Data System (ADS)

    Witten, B.; Shragge, J. C.

    2016-12-01

    The recent increased focus on small-scale seismicity (Mw < 4) has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations, primarily wastewater disposal, hydraulic fracturing for oil and gas recovery, and geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for an increased focus on small-magnitude seismicity: a greater number of seismometers are being deployed in large-N arrays. The greater number of sensors decreases the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a sparse number of stations. However, with large-N arrays we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.

  12. Microseismic event location by master-event waveform stacking

    NASA Astrophysics Data System (ADS)

    Grigoli, F.; Cesca, S.; Dahm, T.

    2016-12-01

    Waveform stacking location methods are nowadays extensively used to monitor induced seismicity associated with several underground industrial activities, such as mining, oil and gas production, and geothermal energy exploitation. In the last decade a significant effort has been spent to develop or improve methodologies able to perform automated seismological analysis for weak events at a local scale. This effort was accompanied by the improvement of monitoring systems, resulting in an increasing number of large microseismicity catalogs. The analysis of microseismicity is challenging because of the large number of recorded events, often characterized by a low signal-to-noise ratio. A significant limitation of traditional location approaches is that automated picking is often done on each seismogram individually, making little or no use of the coherency information between stations. In order to improve the performance of traditional location methods, alternative approaches have been proposed in recent years. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Their main advantage is their robustness even when the recorded waveforms are very noisy. On the other hand, like any other location method, the location performance strongly depends on the accuracy of the available velocity model. When dealing with inaccurate velocity models, in fact, location results can be affected by large errors. Here we will introduce a new automated waveform stacking location method which is less dependent on the knowledge of the velocity model and presents several benefits, which improve the location accuracy: 1) it accounts for phase delays due to local site effects, e.g. surface topography or variable sediment thickness; 2) theoretical velocity models are only used to estimate travel times within the source volume, and not along the whole source-sensor path. We
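
    The core stacking operation can be sketched as follows, under simplifying assumptions (characteristic functions already computed, a user-supplied travel-time function, a fixed search grid); this is a generic delay-and-stack locator, not the new master-event method introduced in the abstract.

    ```python
    # Generic delay-and-stack location sketch: for each grid node, shift each station's
    # characteristic function by the predicted travel time and stack; the node and origin-time
    # sample with the largest stack value give the location estimate.
    import numpy as np

    def stack_locate(cf, dt, stations, grid, travel_time):
        """cf: (N, L) characteristic functions; dt: sample interval (s); stations: (N, 3) coords;
        grid: (M, 3) candidate sources; travel_time(src, sta): predicted travel time (s)."""
        n_sta, n_samp = cf.shape
        best_node, best_val, best_t0 = None, -np.inf, None
        for src in grid:
            shifts = np.array([int(round(travel_time(src, s) / dt)) for s in stations])
            usable = n_samp - shifts.max()            # assumes usable > 0
            stack = np.zeros(usable)
            for i in range(n_sta):
                stack += cf[i, shifts[i]:shifts[i] + usable]
            k = int(np.argmax(stack))                 # candidate origin-time sample
            if stack[k] > best_val:
                best_node, best_val, best_t0 = src, stack[k], k * dt
        return best_node, best_t0, best_val
    ```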

  13. Hydrogen atoms can be located accurately and precisely by x-ray crystallography.

    PubMed

    Woińska, Magdalena; Grabowsky, Simon; Dominiak, Paulina M; Woźniak, Krzysztof; Jayatilaka, Dylan

    2016-05-01

    Precise and accurate structural information on hydrogen atoms is crucial to the study of energies of interactions important for crystal engineering, materials science, medicine, and pharmacy, and to the estimation of physical and chemical properties in solids. However, hydrogen atoms only scatter x-radiation weakly, so x-rays have not been used routinely to locate them accurately. Textbooks and teaching classes still emphasize that hydrogen atoms cannot be located with x-rays close to heavy elements; instead, neutron diffraction is needed. We show that, contrary to widespread expectation, hydrogen atoms can be located very accurately using x-ray diffraction, yielding bond lengths involving hydrogen atoms (A-H) that are in agreement with results from neutron diffraction mostly within a single standard deviation. The precision of the determination is also comparable between x-ray and neutron diffraction results. This has been achieved at resolutions as low as 0.8 Å using Hirshfeld atom refinement (HAR). We have applied HAR to 81 crystal structures of organic molecules and compared the A-H bond lengths with those from neutron measurements for A-H bonds sorted into bonds of the same class. We further show in a selection of inorganic compounds that hydrogen atoms can be located in bridging positions and close to heavy transition metals accurately and precisely. We anticipate that, in the future, conventional x-radiation sources at in-house diffractometers can be used routinely for locating hydrogen atoms in small molecules accurately instead of large-scale facilities such as spallation sources or nuclear reactors.

  14. Hydrogen atoms can be located accurately and precisely by x-ray crystallography

    PubMed Central

    Woińska, Magdalena; Grabowsky, Simon; Dominiak, Paulina M.; Woźniak, Krzysztof; Jayatilaka, Dylan

    2016-01-01

    Precise and accurate structural information on hydrogen atoms is crucial to the study of energies of interactions important for crystal engineering, materials science, medicine, and pharmacy, and to the estimation of physical and chemical properties in solids. However, hydrogen atoms only scatter x-radiation weakly, so x-rays have not been used routinely to locate them accurately. Textbooks and teaching classes still emphasize that hydrogen atoms cannot be located with x-rays close to heavy elements; instead, neutron diffraction is needed. We show that, contrary to widespread expectation, hydrogen atoms can be located very accurately using x-ray diffraction, yielding bond lengths involving hydrogen atoms (A–H) that are in agreement with results from neutron diffraction mostly within a single standard deviation. The precision of the determination is also comparable between x-ray and neutron diffraction results. This has been achieved at resolutions as low as 0.8 Å using Hirshfeld atom refinement (HAR). We have applied HAR to 81 crystal structures of organic molecules and compared the A–H bond lengths with those from neutron measurements for A–H bonds sorted into bonds of the same class. We further show in a selection of inorganic compounds that hydrogen atoms can be located in bridging positions and close to heavy transition metals accurately and precisely. We anticipate that, in the future, conventional x-radiation sources at in-house diffractometers can be used routinely for locating hydrogen atoms in small molecules accurately instead of large-scale facilities such as spallation sources or nuclear reactors. PMID:27386545

  15. It's All about Location, Location, Location: Children's Memory for the "Where'' of Personally Experienced Events

    ERIC Educational Resources Information Center

    Bauer, Patricia J.; Doydum, Ayzit O.; Pathman, Thanujeni; Larkina, Marina; Guler, O. Evren; Burch, Melissa

    2012-01-01

    Episodic memory is defined as the ability to recall specific past events located in a particular time and place. Over the preschool and into the school years, there are clear developmental changes in memory for when events took place. In contrast, little is known about developmental changes in memory for where events were experienced. In the…

  16. Improving Infrasound Signal Detection and Event Location in the Western US Using Atmospheric Modeling

    NASA Astrophysics Data System (ADS)

    Dannemann, F. K.; Park, J.; Marcillo, O. E.; Blom, P. S.; Stump, B. W.; Hayward, C.

    2016-12-01

    Data from five infrasound arrays in the western US jointly operated by the University of Utah Seismograph Stations and Southern Methodist University are used to test a database-centric processing pipeline, InfraPy, for automated event detection, association and location. Infrasonic array data from a one-year time period (January 1, 2012 to December 31, 2012) are used. This study focuses on the identification and location of 53 ground-truth verified events produced from near-surface military explosions at the Utah Test and Training Range (UTTR). Signals are detected using an adaptive F-detector, which accounts for correlated and uncorrelated time-varying noise in order to reduce false detections due to the presence of coherent noise. Variations in detection azimuth and correlation are found to be consistent with seasonal changes in atmospheric winds. The Bayesian infrasonic source location (BISL) method is used to produce source location and time credibility contours based on posterior probability density functions. Updates to the previous BISL methodology include the application of celerity-range and azimuth-deviation distributions in order to accurately account for the spatial and temporal variability of infrasound propagation through the atmosphere. These priors are estimated by ray tracing through Ground-to-Space (G2S) atmospheric models as a function of season and time of day using historic atmospheric characterizations from 2007 to 2013. Out of the 53 events, 31 are successfully located using the InfraPy pipeline. Confidence contour areas for maximum a posteriori event locations produce error estimates that are reduced by a maximum of 98% and an average of 25% relative to location estimates utilizing a simple time-independent uniform atmosphere. We compare real-time ray tracing results with the statistical atmospheric priors used in this study to examine large time differences between known origin times and estimated origin times that might be due to the misidentification of
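
    A toy version of the Bayesian location idea is sketched below, assuming a single infrasonic phase and a Gaussian celerity prior (mean 0.30 km/s, standard deviation 0.04 km/s); the values and function names are illustrative and this is not the InfraPy/BISL implementation.

    ```python
    # Toy Bayesian infrasound locator: score each candidate (epicentre, origin time) by how well
    # the implied celerities (range / travel time) match an assumed Gaussian celerity prior.
    import numpy as np

    def infrasound_posterior(arrays, t_arr, grid, t0_grid, cel_mean=0.30, cel_std=0.04):
        """arrays: (N,2) array coordinates (km); t_arr: (N,) arrival times (s);
        grid: (M,2) candidate epicentres (km); t0_grid: candidate origin times (s)."""
        post = np.zeros((len(grid), len(t0_grid)))
        for i, src in enumerate(grid):
            rng = np.linalg.norm(arrays - src, axis=1)        # range to each array (km)
            for j, t0 in enumerate(t0_grid):
                cel = rng / np.maximum(t_arr - t0, 1e-3)      # implied celerities (km/s)
                logp = -0.5 * np.sum(((cel - cel_mean) / cel_std) ** 2)
                post[i, j] = np.exp(logp)
        return post / post.sum()                              # unnormalised map -> normalised posterior
    ```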

  17. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; ...

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  18. Multiple-Event Seismic Location Using the Markov-Chain Monte Carlo Technique

    NASA Astrophysics Data System (ADS)

    Myers, S. C.; Johannesson, G.; Hanley, W.

    2005-12-01

    We develop a new multiple-event location algorithm (MCMCloc) that utilizes the Markov-Chain Monte Carlo (MCMC) method. Unlike most inverse methods, the MCMC approach produces a suite of solutions, each of which is consistent with observations and prior estimates of data and model uncertainties. Model parameters in MCMCloc consist of event hypocenters and travel-time predictions. Data are arrival time measurements and phase assignments. A posteriori estimates of event locations, path corrections, pick errors, and phase assignments are made through analysis of the a posteriori suite of acceptable solutions. Prior uncertainty estimates include correlations between travel-time predictions, correlations between measurement errors, the probability of misidentifying one phase for another, and the probability of spurious data. Inclusion of prior constraints on location accuracy allows direct utilization of ground-truth locations or well-constrained location parameters (e.g. from InSAR) that aid in the accuracy of the solution. Implementation of a correlation structure for travel-time predictions allows MCMCloc to operate over arbitrarily large geographic areas. Transition in behavior between a multiple-event locator for tightly clustered events and a single-event locator for solitary events is controlled by the spatial correlation of travel-time predictions. We test the MCMC locator on a regional data set of Nevada Test Site nuclear explosions. Event locations and origin times are known for these events, allowing us to test the features of MCMCloc using a high-quality ground truth data set. Preliminary tests suggest that MCMCloc provides excellent relative locations, often outperforming traditional multiple-event location algorithms, and excellent absolute locations are attained when constraints from one or more ground truth events are included. When phase assignments are switched, we find that MCMCloc properly corrects the error when predicted arrival times are separated by
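
    As a conceptual stand-in for the multiple-event sampler, the sketch below runs a single-event Metropolis random walk over (x, y, z, origin time) assuming Gaussian arrival-time errors; the step sizes, starting model, and travel-time function are assumptions and greatly simplified relative to MCMCloc.

    ```python
    # Conceptual single-event Metropolis sampler: the output is a suite of hypocentre solutions
    # distributed according to the posterior, from which location estimates and uncertainties follow.
    import numpy as np

    def mcmc_locate(stations, t_obs, travel_time, sigma=0.2, n_iter=20000, step=1.0, seed=0):
        """stations: iterable of station coords; t_obs: observed arrival times (s);
        travel_time(xyz, sta): predicted travel time (s); sigma: assumed pick error (s)."""
        rng = np.random.default_rng(seed)
        m = np.array([0.0, 0.0, 10.0, 0.0])                 # starting model: x, y, z (km), t0 (s)

        def log_like(model):
            pred = np.array([model[3] + travel_time(model[:3], s) for s in stations])
            return -0.5 * np.sum(((t_obs - pred) / sigma) ** 2)

        samples, ll = [], log_like(m)
        for _ in range(n_iter):
            cand = m + rng.normal(scale=step, size=4)        # random-walk proposal
            cand_ll = log_like(cand)
            if np.log(rng.random()) < cand_ll - ll:          # Metropolis acceptance rule
                m, ll = cand, cand_ll
            samples.append(m.copy())
        return np.array(samples)                             # posterior suite of acceptable solutions
    ```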

  19. A precise and accurate acupoint location obtained on the face using consistency matrix pointwise fusion method.

    PubMed

    Yanq, Xuming; Ye, Yijun; Xia, Yong; Wei, Xuanzhong; Wang, Zheyu; Ni, Hongmei; Zhu, Ying; Xu, Lingyu

    2015-02-01

    To develop a more precise and accurate method, and to identify a procedure for measuring whether an acupoint has been correctly located. On the face, we used acupoint locations from different acupuncture experts and obtained the most precise and accurate acupoint location values based on a consistency information fusion algorithm, through a virtual simulation of the facial orientation coordinate system. Because of inconsistencies in each acupuncture expert's original data, the systematic error enters the general weight calculation. First, we corrected each expert's own systematic error in acupoint location, to obtain a rational quantification of each expert's consistency support degree for acupoint location and pointwise variable-precision fusion results, raising every expert's acupoint location fusion error to pointwise variable precision. Then, we made more effective use of the measured characteristics of the different experts' acupoint locations, to improve the utilization efficiency of the measurement information and the precision and accuracy of acupoint location. Based on applying the consistency matrix pointwise fusion method to the acupuncture experts' acupoint location values, each expert's acupoint location information could be calculated, and the most precise and accurate values of each expert's acupoint location could be obtained.

  20. Detecting and Locating Seismic Events Without Phase Picks or Velocity Models

    NASA Astrophysics Data System (ADS)

    Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.

    2015-12-01

    The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.

  1. Optimizing the real-time automatic location of the events produced in Romania using an advanced processing system

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Grecu, Bogdan; Manea, Liviu

    2016-04-01

    The National Institute for Earth Physics (NIEP) operates a real-time seismic network designed to monitor the seismic activity on Romanian territory, which is dominated by the intermediate-depth earthquakes (60-200 km) of the Vrancea area. The ability to reduce the impact of earthquakes on society depends on the availability of a large volume of high-quality observational data. The development of the network in recent years and advanced seismic acquisition are crucial to achieving this objective. The software package used to perform the automatic real-time locations is Seiscomp3. An accurate choice of the Seiscomp3 setting parameters is necessary to ensure the best performance of the real-time system, i.e., the most accurate locations for the earthquakes while avoiding any false events. The aim of this study is to optimize the algorithms of the real-time system that detect and locate the earthquakes in the monitored area. This goal is pursued by testing different parameters (e.g., STA/LTA, filters applied to the waveforms) on a data set of representative earthquakes of the local seismicity. The results are compared with the locations from the Romanian catalogue ROMPLUS.
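
    Since the abstract mentions tuning STA/LTA parameters, a minimal STA/LTA trigger sketch is included below for orientation; it uses simple centered moving averages and illustrative thresholds, and is not the Seiscomp3 picker used operationally.

    ```python
    # Minimal STA/LTA sketch: ratio of short-term to long-term average signal energy,
    # with a simple on/off threshold trigger. Window lengths and thresholds are illustrative.
    import numpy as np

    def sta_lta(trace, dt, sta_win=1.0, lta_win=30.0):
        """Return the STA/LTA ratio for one trace (centered moving averages; a causal
        recursive average would be used in a real-time picker)."""
        sta_n, lta_n = int(sta_win / dt), int(lta_win / dt)
        energy = np.asarray(trace, dtype=float) ** 2
        sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
        lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
        return sta / np.maximum(lta, 1e-12)

    def triggers(ratio, on=3.0, off=1.5):
        """Return sample indices where the ratio first exceeds the 'on' threshold."""
        picks, armed = [], True
        for i, r in enumerate(ratio):
            if armed and r > on:
                picks.append(i)
                armed = False
            elif not armed and r < off:
                armed = True
        return picks
    ```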

  2. Multi-Detection Events, Probability Density Functions, and Reduced Location Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Schrom, Brian T.

    2016-03-01

    Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach rather than a simple dilution field of regard approach to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.

  3. Locations and focal mechanisms of deep long period events beneath Aleutian Arc volcanoes using back projection methods

    NASA Astrophysics Data System (ADS)

    Lough, A. C.; Roman, D. C.; Haney, M. M.

    2015-12-01

    Deep long period (DLP) earthquakes are commonly observed in volcanic settings such as the Aleutian Arc in Alaska. DLPs are poorly understood but are thought to be associated with movements of fluids, such as magma or hydrothermal fluids, deep in the volcanic plumbing system. These events have been recognized for several decades, but few studies have gone beyond their identification and location. All long-period events are more difficult to identify and locate than volcano-tectonic (VT) earthquakes because traditional detection schemes focus on high-frequency (short-period) energy. In addition, DLPs present analytical challenges because they tend to be emergent, and so it is difficult to accurately pick the onset of arriving body waves. We now expect to find DLPs at most volcanic centers; the challenge lies in identification and location. We aim to reduce the element of human error in location by applying back projection to better constrain the depth and horizontal position of these events. Power et al. (2004) provided the first compilation of DLP activity in the Aleutian Arc. This study focuses on the reanalysis of 162 cataloged DLPs beneath 11 volcanoes in the Aleutian arc (we expect to ultimately identify and reanalyze more DLPs). We are currently adapting the approach of Haney (2014) for volcanic tremor to use back projection over a 4D grid to determine the position and origin time of DLPs. This method holds great potential in that it will allow automated, high-accuracy picking of arrival times and could reduce the number of arrival time picks necessary for traditional location schemes to constrain event origins well. Back projection can also calculate a relative focal mechanism (difficult with traditional methods due to the emergent nature of DLPs), allowing the first in-depth analysis of source properties. Our event catalog (spanning over 25 years and 11 volcanoes) is one of the longest and largest and enables us to investigate spatial and temporal variation in DLPs.

  4. Joint probabilistic determination of earthquake location and velocity structure: application to local and regional events

    NASA Astrophysics Data System (ADS)

    Beucler, E.; Haugmard, M.; Mocquet, A.

    2016-12-01

    The most widely used inversion schemes for locating earthquakes are based on iterative linearized least-squares algorithms and use a priori knowledge of the propagation medium. When only a small number of observations is available, for instance for moderate events, these methods may lead to large trade-offs between outputs and both the velocity model and the initial set of hypocentral parameters. We present a joint structure-source determination approach using Bayesian inference. Monte-Carlo continuous sampling, using Markov chains, generates models within a broad range of parameters, distributed according to the unknown posterior distributions. The non-linear exploration of both the seismic structure (velocity and thickness) and the source parameters relies on a fast forward problem using 1-D travel time computations. The a posteriori covariances between parameters (hypocentre depth, origin time and seismic structure among others) are computed and explicitly documented. This method manages to decrease the influence of the surrounding seismic network geometry (sparse and/or azimuthally inhomogeneous) and of an overly constrained velocity structure by inferring realistic distributions on hypocentral parameters. Our algorithm is successfully used to accurately locate events of the Armorican Massif (western France), which is characterized by moderate and apparently diffuse local seismicity.

  5. Classification of event location using matched filters via on-floor accelerometers

    NASA Astrophysics Data System (ADS)

    Woolard, Americo G.; Malladi, V. V. N. Sriram; Alajlouni, Sa'ed; Tarazaga, Pablo A.

    2017-04-01

    Recent years have shown prolific advancements in smart infrastructure, allowing buildings of the modern world to interact with their occupants. One of the sought-after attributes of smart buildings is the ability to provide unobtrusive, indoor localization of occupants. The ability to locate occupants indoors can provide a broad range of benefits in areas such as security, emergency response, and resource management. Recent research has shown promising results in occupant building localization, although there is still significant room for improvement. This study presents a passive, small-scale localization system using accelerometers placed around the edges of a small area in an active building environment. The area is discretized into a grid of small squares, and vibration measurements are processed using a pattern-matching approach that estimates the location of the source. Vibration measurements are produced with ball-drops, hammer-strikes, and footsteps as the sources of the floor excitation. The developed approach uses matched filters based on a reference data set, and the location is classified using a nearest-neighbor search. This approach identifies the correct location of impact-like sources, i.e., the ball-drops and hammer-strikes, with 100% accuracy. However, this accuracy drops to 56% for footsteps, with the average localization results being within 0.6 m (α = 0.05) of the true source location. While requiring a reference data set can make this method difficult to implement on a large scale, it may be used to provide accurate localization abilities in areas where training data is readily obtainable. This exploratory work seeks to examine the feasibility of the matched filter and nearest-neighbor search approach for footstep and event localization in a small, instrumented area within a multi-story building.
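
    The matched-filter / nearest-neighbor classification can be sketched as follows, assuming one reference (template) recording per sensor and grid cell; the normalization and scoring choices are illustrative, not the authors' processing chain.

    ```python
    # Sketch: score a new event against per-cell templates with normalised cross-correlation,
    # sum the peak scores over sensors, and assign the cell with the highest total (nearest neighbor).
    import numpy as np

    def ncc_peak(signal, template):
        """Peak normalised cross-correlation between a signal and a (shorter) template."""
        sig = signal - signal.mean()
        tmp = template - template.mean()
        corr = np.correlate(sig, tmp, mode="valid")
        norm = np.linalg.norm(tmp) * np.sqrt(
            np.convolve(sig ** 2, np.ones(len(tmp)), mode="valid"))
        return np.max(corr / np.maximum(norm, 1e-12))

    def classify_location(event, templates):
        """event: (n_sensors, L) measurement; templates: dict {cell_id: (n_sensors, L_t) reference}.
        Returns the cell_id with the highest summed matched-filter score."""
        scores = {cell: sum(ncc_peak(event[i], ref[i]) for i in range(event.shape[0]))
                  for cell, ref in templates.items()}
        return max(scores, key=scores.get)
    ```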

  6. Considerations in Phase Estimation and Event Location Using Small-aperture Regional Seismic Arrays

    NASA Astrophysics Data System (ADS)

    Gibbons, Steven J.; Kværna, Tormod; Ringdal, Frode

    2010-05-01

    The global monitoring of earthquakes and explosions at decreasing magnitudes necessitates the fully automatic detection, location and classification of an ever-increasing number of seismic events. Many seismic stations of the International Monitoring System are small-aperture arrays designed to optimize the detection and measurement of regional phases. Collaboration with operators of mines within regional distances of the ARCES array, together with waveform correlation techniques, has provided an unparalleled opportunity to assess the ability of a small-aperture array to provide robust and accurate direction and slowness estimates for phase arrivals resulting from well-constrained events at sites of repeating seismicity. A significant reason for the inaccuracy of current fully-automatic event location estimates is the use of f-k slowness estimates measured in variable frequency bands. The variability of slowness and azimuth measurements for a given phase from a given source region is reduced by the application of almost any constant frequency band. However, the frequency band resulting in the most stable estimates varies greatly from site to site. Situations are observed in which regional P-arrivals from two sites, far closer than the theoretical resolution of the array, result in highly distinct populations in slowness space. This means that the f-k estimates, even at relatively low frequencies, can be sensitive to source- and path-specific characteristics of the wavefield and should be treated with caution when inferring a geographical backazimuth under the assumption of a planar wavefront arriving along the great-circle path. Moreover, different frequency bands are associated with different biases, meaning that slowness and azimuth station corrections (commonly denoted SASCs) cannot be calibrated, and should not be used, without reference to the frequency band employed. We demonstrate an example where fully-automatic locations based on a source-region specific
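
    For context, a simplified broadband f-k (beam-power) estimator is sketched below, assuming plane-wave propagation across the array, station offsets given as (east, north) in km, and traces already filtered to the band of interest; the slowness sign and backazimuth conventions are stated in the comments and may differ from the authors' processing.

    ```python
    # Simplified frequency-domain beamforming over a slowness grid: the grid point with maximum
    # beam power gives the apparent slowness and backazimuth of the arrival.
    import numpy as np

    def fk_slowness(traces, dt, offsets, s_max=0.5, n_s=61):
        """traces: (N, L) band-filtered array waveforms; dt: sample interval (s);
        offsets: (N, 2) station coordinates (east, north) relative to the array centre (km);
        s_max: maximum slowness searched (s/km)."""
        n_sta, n_samp = traces.shape
        freqs = np.fft.rfftfreq(n_samp, dt)
        spec = np.fft.rfft(traces, axis=1)
        s_axis = np.linspace(-s_max, s_max, n_s)
        power = np.zeros((n_s, n_s))
        for i, sx in enumerate(s_axis):
            for j, sy in enumerate(s_axis):
                # slowness vector (sx, sy) points in the propagation direction; plane-wave delays s.r
                delays = offsets @ np.array([sx, sy])
                phase = np.exp(2j * np.pi * np.outer(delays, freqs))
                beam = np.sum(spec * phase, axis=0)
                power[i, j] = np.sum(np.abs(beam) ** 2)
        k = np.unravel_index(np.argmax(power), power.shape)
        sx, sy = s_axis[k[0]], s_axis[k[1]]
        baz = (np.degrees(np.arctan2(-sx, -sy)) + 360.0) % 360.0   # backazimuth points back to the source
        return np.hypot(sx, sy), baz, power
    ```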

  7. Distributed fiber sensing system with wide frequency response and accurate location

    NASA Astrophysics Data System (ADS)

    Shi, Yi; Feng, Hao; Zeng, Zhoumo

    2016-02-01

    A distributed fiber sensing system merging a Mach-Zehnder interferometer and a phase-sensitive optical time domain reflectometer (Φ-OTDR) is demonstrated for vibration measurement, which requires wide frequency response and accurate location. Two narrow-linewidth lasers with slightly different wavelengths are used to constitute the interferometer and reflectometer, respectively. A narrow-band fiber Bragg grating is responsible for separating the two wavelengths. In addition, heterodyne detection is applied to maintain the signal-to-noise ratio of the locating signal. Experimental results show that the novel system has a wide frequency response from 1 Hz to 50 MHz, limited by the sampling frequency of the data acquisition card, and a spatial resolution of 20 m, corresponding to the 200 ns pulse width, along a 2.5 km fiber link.

  8. Dynamic sensing model for accurate detectability of environmental phenomena using event wireless sensor network

    NASA Astrophysics Data System (ADS)

    Missif, Lial Raja; Kadhum, Mohammad M.

    2017-09-01

    Wireless sensor networks (WSNs) have been widely used for monitoring, where sensors are deployed to operate independently to sense abnormal phenomena. Most of the proposed environmental monitoring systems are designed based on a predetermined sensing range that does not reflect sensor reliability, event characteristics, and environmental conditions. Measuring the capability of a sensor node to accurately detect an event within a sensing field is of great importance for monitoring applications. This paper presents an efficient mechanism for event detection based on a probabilistic sensing model. Different models are presented theoretically in this paper to examine their adaptability and applicability to real environment applications. The numerical results of the experimental evaluation have shown that the probabilistic sensing model provides accurate observation and detectability of an event, and that it can be utilized for different environment scenarios.

  9. Development of an accurate transmission line fault locator using the global positioning system satellites

    NASA Technical Reports Server (NTRS)

    Lee, Harry

    1994-01-01

    A highly accurate transmission line fault locator based on the traveling-wave principle was developed and successfully operated within B.C. Hydro. A transmission line fault produces a fast-risetime traveling wave at the fault point which propagates along the transmission line. This fault locator system consists of traveling wave detectors located at key substations which detect and time-tag the leading edge of the fault-generated traveling wave as it passes through. A master station gathers the time-tagged information from the remote detectors and determines the location of the fault. Precise time is a key element to the success of this system. This fault locator system derives its timing from the Global Positioning System (GPS) satellites. System tests confirmed the accuracy of locating faults to within the design objective of +/-300 meters.
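
    The two-ended travelling-wave principle reduces to a one-line formula; the sketch below uses assumed values for the line length and propagation speed to show how the GPS-time-tagged arrivals at the two terminals give the fault position.

    ```python
    # Worked example of two-ended travelling-wave fault location. With arrivals t_a, t_b at
    # terminals A and B of a line of length L and wave speed v, the fault lies at
    # x = (L + v*(t_a - t_b)) / 2 from terminal A. All numbers below are assumed, not from the paper.

    def fault_distance(line_length_km, t_a, t_b, wave_speed_km_s=2.95e5):
        """Distance of the fault from terminal A (km), given GPS-time-tagged arrivals (s)."""
        return 0.5 * (line_length_km + wave_speed_km_s * (t_a - t_b))

    # Example: 240 km line, wavefront reaches A 100 microseconds after B
    print(fault_distance(240.0, t_a=100e-6, t_b=0.0))   # ~134.75 km from terminal A
    ```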

  10. A robust recognition and accurate locating method for circular coded diagonal target

    NASA Astrophysics Data System (ADS)

    Bao, Yunna; Shang, Yang; Sun, Xiaoliang; Zhou, Jiexin

    2017-10-01

    As a category of special control points that can be automatically identified, artificial coded targets have been widely developed in the fields of computer vision, photogrammetry, augmented reality, etc. In this paper, a new circular coded target designed by RockeTech Technology Corp. Ltd. is analyzed and studied, which is called the circular coded diagonal target (CCDT). A novel detection and recognition method with good robustness is proposed in the paper and implemented in Visual Studio. In this algorithm, firstly, the ellipse features of the center circle are used for rough positioning. Then, according to the characteristics of the center diagonal target, a circular frequency filter is designed to choose the correct center circle and eliminate non-target noise. The precise positioning of the coded target is done by the correlation-coefficient-fitting extreme value method. Finally, the coded target recognition is achieved by decoding the binary sequence in the outer ring of the extracted target. To test the proposed algorithm, simulation experiments and real experiments were carried out. The results show that the CCDT recognition and accurate locating method proposed in this paper can robustly recognize and accurately locate the targets in complex and noisy backgrounds.

  11. Memory development in the second year: for events or locations?

    PubMed

    Russell, James; Thompson, Doreen

    2003-04-01

    We employed an object-placement/object-removal design, inspired by recent work on 'episodic-like' memory in scrub jays (Clayton, N. S., & Dickinson, A. (1998). Episodic-like memory during cache recovery by scrub jays. Nature, 395, 272-274), to examine the possibility that children in the second year of life have event-based memories. In one task, a successful search could have been due to the recall of an object-removal event. In the second task, a successful search could only have been caused by recall of where objects were located. Success was general in the oldest group of children (21-25 months), while performance was broadly similar on the two tasks. The parsimonious interpretation of this outcome is that the first task was performed by location memory, not by event memory. We place these data in the context of object permanence development.

  12. Propagation of the velocity model uncertainties to the seismic event location

    NASA Astrophysics Data System (ADS)

    Gesret, A.; Desassis, N.; Noble, M.; Romary, T.; Maisons, C.

    2015-01-01

    Earthquake hypocentre locations are crucial in many domains of application (academic and industrial) as seismic event location maps are commonly used to delineate faults or fractures. The interpretation of these maps depends on location accuracy and on the reliability of the associated uncertainties. The largest contribution to location and uncertainty errors is due to the fact that velocity model errors are usually not correctly taken into account. We propose a new Bayesian formulation that properly integrates knowledge of the velocity model into the formulation of the probabilistic earthquake location. In this work, the velocity model uncertainties are first estimated with a Bayesian tomography of active shot data. We implement a Monte Carlo sampling algorithm to generate velocity models distributed according to the posterior distribution. In a second step, we propagate the velocity model uncertainties to the seismic event location in a probabilistic framework. This enables more reliable hypocentre locations to be obtained, as well as their associated uncertainties, accounting for both picking and velocity model uncertainties. We illustrate the tomography results and the gain in accuracy of earthquake location for two synthetic examples and one real data case study in the context of induced microseismicity.

  13. Structural monitoring for rare events in remote locations

    NASA Astrophysics Data System (ADS)

    Hale, J. M.

    2005-01-01

    A structural monitoring system has been developed for use on high-value engineering structures, which is particularly suitable for use in remote locations where rare events such as accidental impacts, seismic activity or terrorist attack might otherwise go undetected. The system comprises a low-power intelligent on-site data logger and a remote analysis computer that communicate with one another using the internet and mobile telephone technology. The analysis computer also generates e-mail alarms and maintains a web page that displays detected events in near real-time to authorised users. The application of the prototype system to pipeline monitoring is described, in which the analysis of detected events is used to differentiate between impacts and pressure surges. The system has been demonstrated successfully and is ready for deployment.

  14. Accurate Vehicle Location System Using RFID, an Internet of Things Approach.

    PubMed

    Prinsloo, Jaco; Malekian, Reza

    2016-06-04

    Modern infrastructure, such as dense urban areas and underground tunnels, can effectively block all GPS signals, which implies that effective position triangulation will not be achieved. The main problem that is addressed in this project is the design and implementation of an accurate vehicle location system using radio-frequency identification (RFID) technology in combination with GPS and the Global system for Mobile communication (GSM) technology, in order to provide a solution to the limitation discussed above. In essence, autonomous vehicle tracking will be facilitated with the use of RFID technology where GPS signals are non-existent. The design of the system and the results are reflected in this paper. An extensive literature study was done on the field known as the Internet of Things, as well as various topics that covered the integration of independent technology in order to address a specific challenge. The proposed system is then designed and implemented. An RFID transponder was successfully designed and a read range of approximately 31 cm was obtained in the low frequency communication range (125 kHz to 134 kHz). The proposed system was designed, implemented, and field tested and it was found that a vehicle could be accurately located and tracked. It is also found that the antenna size of both the RFID reader unit and RFID transponder plays a critical role in the maximum communication range that can be achieved.

  15. Accurate Vehicle Location System Using RFID, an Internet of Things Approach

    PubMed Central

    Prinsloo, Jaco; Malekian, Reza

    2016-01-01

    Modern infrastructure, such as dense urban areas and underground tunnels, can effectively block all GPS signals, which implies that effective position triangulation will not be achieved. The main problem that is addressed in this project is the design and implementation of an accurate vehicle location system using radio-frequency identification (RFID) technology in combination with GPS and the Global system for Mobile communication (GSM) technology, in order to provide a solution to the limitation discussed above. In essence, autonomous vehicle tracking will be facilitated with the use of RFID technology where GPS signals are non-existent. The design of the system and the results are reflected in this paper. An extensive literature study was done on the field known as the Internet of Things, as well as various topics that covered the integration of independent technology in order to address a specific challenge. The proposed system is then designed and implemented. An RFID transponder was successfully designed and a read range of approximately 31 cm was obtained in the low frequency communication range (125 kHz to 134 kHz). The proposed system was designed, implemented, and field tested and it was found that a vehicle could be accurately located and tracked. It is also found that the antenna size of both the RFID reader unit and RFID transponder plays a critical role in the maximum communication range that can be achieved. PMID:27271638

  16. Towards an accurate real-time locator of infrasonic sources

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Blom, P.; Polozov, A.; Marcillo, O.; Arrowsmith, S.; Hofstetter, A.

    2017-11-01

    Infrasonic signals propagate from an atmospheric source via media with stochastic and rapidly space-varying conditions. Hence, their travel times, their amplitudes at sensor recordings, and even their manifestation in the so-called "shadow zones" are random. Therefore, the traditional least-squares technique for locating infrasonic sources is often not effective, and the problem of finding the best solution must be formulated in probabilistic terms. Recently, a series of papers has been published about the Bayesian Infrasonic Source Localization (BISL) method, based on the computation of the posterior probability density function (PPDF) of the source location as a convolution of the a priori probability distribution function (APDF) of the propagation model parameters with the likelihood function (LF) of the observations. The present study is devoted to the further development of BISL for higher accuracy and stability of the source location results and a lower computational load. We critically analyse previous algorithms and propose several new ones. First of all, we describe the general PPDF formulation and demonstrate that this relatively slow algorithm might be among the most accurate, provided that adequate APDF and LF are used. Then, we suggest using summation instead of integration in the general PPDF calculation for increased robustness, but this leads to a 3D space-time optimization problem. Two different forms of APDF approximation are considered and applied to the PPDF calculation in our study. One of them, previously suggested but not yet properly used, is the so-called "celerity-range histograms" (CRHs). The other is the outcome of previous findings of linear mean travel time for the first four infrasonic phases in overlapping consecutive distance ranges. This stochastic model is extended here to a regional distance of 1000 km, and the APDF introduced is the probabilistic form of the junction between this travel-time model and range-dependent probability

  17. A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight

    NASA Technical Reports Server (NTRS)

    Parker, Joel J. K.; Hughes, Steven P.

    2011-01-01

    A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.
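
    The stepping-and-bracketing idea can be illustrated with a toy event function. The sketch below is a minimal illustration under assumed inputs, not the implementation described in the report: it steps along a time grid, detects sign changes of a smooth event function, and refines each root by bisection.

        import numpy as np

        def event_function(t):
            # Toy, differentiable event function: positive when the "event"
            # condition holds, negative otherwise (placeholder for an eclipse
            # or station line-of-sight function evaluated along a trajectory).
            return np.sin(0.01 * t) - 0.4

        def bisect_root(f, a, b, tol=1e-8):
            # Standard bisection on a bracketed sign change f(a)*f(b) < 0.
            fa = f(a)
            while b - a > tol:
                m = 0.5 * (a + b)
                fm = f(m)
                if fa * fm <= 0.0:
                    b = m
                else:
                    a, fa = m, fm
            return 0.5 * (a + b)

        def locate_events(f, t_start, t_end, step):
            # Stepping phase: march along the time axis looking for sign
            # changes, then bracket and refine each one.
            roots = []
            t_prev, f_prev = t_start, f(t_start)
            t = t_start + step
            while t <= t_end:
                f_curr = f(t)
                if f_prev * f_curr < 0.0:
                    roots.append(bisect_root(f, t_prev, t))
                t_prev, f_prev = t, f_curr
                t += step
            return roots

        print(locate_events(event_function, 0.0, 1000.0, 10.0))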

  19. LLNL Location and Detection Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S C; Harris, D B; Anderson, M L

    2003-07-16

    We present two LLNL research projects in the topical areas of location and detection. The first project assesses epicenter accuracy using a multiple-event location algorithm, and the second project employs waveform subspace correlation to detect and identify events at Fennoscandian mines. Accurately located seismic events are the basis of location calibration. A well-characterized set of calibration events enables new Earth model development, empirical calibration, and validation of models. In a recent study, Bondar et al. (2003) develop network coverage criteria for assessing the accuracy of event locations that are determined using single-event, linearized inversion methods. These criteria are conservative and are meant for application to large bulletins where emphasis is on catalog completeness and any given event location may be improved through detailed analysis or application of advanced algorithms. Relative event location techniques are touted as advancements that may improve absolute location accuracy by (1) ensuring an internally consistent dataset, (2) constraining a subset of events to known locations, and (3) taking advantage of station and event correlation structure. Here we present the preliminary phase of this work, in which we use Nevada Test Site (NTS) nuclear explosions, with known locations, to test the effect of travel-time model accuracy on relative location accuracy. Like previous studies, we find that velocity-model accuracy and relative-location accuracy are highly correlated. We also find that metrics based on the travel-time residuals of relocated events are not reliable for assessing either velocity-model or relative-location accuracy. In the topical area of detection, we develop specialized correlation (subspace) detectors for the principal mines surrounding the ARCES station located in the European Arctic. Our objective is to provide efficient screens for explosions occurring in the mines of the Kola Peninsula (Kovdor

  20. Improved Regional Seismic Event Locations Using 3-D Velocity Models

    DTIC Science & Technology

    1999-12-15

    ...regional velocity model to estimate event hypocenters. Travel times for the regional phases are calculated using a sophisticated eikonal finite... ...can greatly improve estimates of event locations. Our algorithm calculates travel times using a finite difference approximation of the eikonal... ...such as IASP91 or J-B. 3-D velocity models require more sophisticated travel time modeling routines; thus, we use a 3-D eikonal equation solver...
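
    The fragments above refer to finite-difference solutions of the eikonal equation for regional travel times. As a rough illustration of that idea only (not the solver used in the report), the sketch below computes first-arrival travel times on a 2-D grid with a basic Godunov fast-sweeping update; the grid, spacing, and velocity are arbitrary.

        import numpy as np

        def eikonal_fast_sweep(slowness, src, h, n_sweeps=4):
            # Solve |grad T| = slowness on a 2-D grid with spacing h using the
            # standard Godunov upwind update and alternating sweep orderings.
            ny, nx = slowness.shape
            T = np.full((ny, nx), np.inf)
            T[src] = 0.0
            orders = [(range(ny), range(nx)),
                      (range(ny), range(nx - 1, -1, -1)),
                      (range(ny - 1, -1, -1), range(nx)),
                      (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
            for _ in range(n_sweeps):
                for iy_order, ix_order in orders:
                    for i in iy_order:
                        for j in ix_order:
                            if (i, j) == src:
                                continue
                            a = min(T[i - 1, j] if i > 0 else np.inf,
                                    T[i + 1, j] if i < ny - 1 else np.inf)
                            b = min(T[i, j - 1] if j > 0 else np.inf,
                                    T[i, j + 1] if j < nx - 1 else np.inf)
                            f = slowness[i, j] * h
                            if abs(a - b) >= f:      # update from one neighbour
                                t_new = min(a, b) + f
                            else:                    # update from both neighbours
                                t_new = 0.5 * (a + b + np.sqrt(2 * f * f - (a - b) ** 2))
                            T[i, j] = min(T[i, j], t_new)
            return T

        # Uniform-velocity sanity check: times should approach distance * slowness
        slow = np.full((101, 101), 0.25)        # s/km for a 4 km/s medium
        tt = eikonal_fast_sweep(slow, src=(50, 50), h=1.0)
        print(tt[50, 90], 40 * 0.25)            # roughly comparable values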

  1. Leisure and Pleasure: Science events in unusual locations

    NASA Astrophysics Data System (ADS)

    Bultitude, Karen; Margarida Sardo, Ana

    2012-12-01

    Building on concepts relating to informal science education, this work compares science-related activities which successfully engaged public audiences at three different 'generic' locations: a garden festival, a public park, and a music festival. The purpose was to identify what factors contribute to the perceived success of science communication activities occurring within leisure spaces. This article reports the results of 71 short (2-3 min) structured interviews with public participants at the events, and 18 structured observation sessions, demonstrating that the events were considered both novel and interesting by the participants. Audience members were found to perceive both educational and affective purposes from the events. Three key elements were identified as contributing to the success of the activities across the three 'generic venues': the informality of the surroundings, the involvement of 'real' scientists, and the opportunity to re-engage participants with scientific concepts outside formal education.

  2. Simultaneous Determination of Structure and Event Location Using Body and Surface Wave Measurements at a Single Station: Preparation for Mars Data from the InSight Mission

    NASA Astrophysics Data System (ADS)

    Panning, M. P.; Banerdt, W. B.; Beucler, E.; Blanchette-Guertin, J. F.; Boese, M.; Clinton, J. F.; Drilleau, M.; James, S. R.; Kawamura, T.; Khan, A.; Lognonne, P. H.; Mocquet, A.; van Driel, M.

    2015-12-01

    An important challenge for the upcoming InSight mission to Mars, which will deliver a broadband seismic station to Mars along with other geophysical instruments in 2016, is to accurately determine event locations with the use of a single station. Locations are critical for the primary objective of the mission, determining the internal structure of Mars, as well as the secondary objective of measuring the activity and distribution of seismic events. As part of the mission planning process, a variety of techniques have been explored for location of marsquakes and inversion of structure, and preliminary procedures and software are already under development as part of the InSight Mars Quake and Mars Structure Services. One proposed method, involving the use of recordings of multiple-orbit surface waves, has already been tested with synthetic data and Earth recordings. This method has the strength of not requiring an a priori velocity model of Mars for quake location, but will only be practical for larger events. For smaller events where only first-orbit surface waves and body waves are observable, other methods are required. In this study, we implement a transdimensional Bayesian inversion approach to simultaneously invert for basic velocity structure and location parameters (epicentral distance and origin time) using only measurements of body wave arrival times and dispersion of first-orbit surface waves. The method is tested with synthetic data including expected Mars noise, and with Earth data for single events and groups of events, and is evaluated for errors in both location and structural determination, as well as trade-offs between resolvable parameters and the effect of 3D crustal variations.

  3. "Repeating Events" as Estimator of Location Precision: The China National Seismograph Network

    NASA Astrophysics Data System (ADS)

    Jiang, Changsheng; Wu, Zhongliang; Li, Yutong; Ma, Tengfei

    2014-03-01

    "Repeating earthquakes" identified by waveform cross-correlation, with inter-event separation of no more than 1 km, can be used for assessment of location precision. Assuming that the network-measured apparent inter-epicenter distance X of the "repeating doublets" indicates the location precision, we estimated the regionalized location quality of the China National Seismograph Network by comparing the "repeating events" in and around China by S chaff and R ichards (Science 303: 1176-1178, 2004; J Geophys Res 116: B03309, 2011) and the monthly catalogue of the China Earthquake Networks Center. The comparison shows that the average X value of the China National Seismograph Network is approximately 10 km. The mis-location is larger for the Tibetan Plateau, west and north of Xinjiang, and east of Inner Mongolia, as indicated by larger X values. Mis-location is correlated with the completeness magnitude of the earthquake catalogue. Using the data from the Beijing Capital Circle Region, the dependence of the mis-location on the distribution of seismic stations can be further confirmed.

  4. Processing ser and estar to locate objects and events: An ERP study with L2 speakers of Spanish.

    PubMed

    Dussias, Paola E; Contemori, Carla; Román, Patricia

    2014-01-01

    In Spanish locative constructions, a different form of the copula is selected in relation to the semantic properties of the grammatical subject: sentences that locate objects require estar while those that locate events require ser (both translated in English as 'to be'). In an ERP study, we examined whether second language (L2) speakers of Spanish are sensitive to the selectional restrictions that the different types of subjects impose on the choice of the two copulas. Twenty-four native speakers of Spanish and two groups of L2 Spanish speakers (24 beginners and 18 advanced speakers) were recruited to investigate the processing of 'object/event + estar/ser' permutations. Participants provided grammaticality judgments on correct (object + estar; event + ser) and incorrect (object + ser; event + estar) sentences while their brain activity was recorded. In line with previous studies (Leone-Fernández, Molinaro, Carreiras, & Barber, 2012; Sera, Gathje, & Pintado, 1999), the results of the grammaticality judgment for the native speakers showed that participants correctly accepted object + estar and event + ser constructions. In addition, while 'object + ser' constructions were considered grossly ungrammatical, 'event + estar' combinations were perceived as unacceptable to a lesser degree. For these same participants, ERP recording time-locked to the onset of the critical word 'en' showed a larger P600 for the ser predicates when the subject was an object than when it was an event (*La silla es en la cocina vs. La fiesta es en la cocina). This P600 effect is consistent with syntactic repair of the defining predicate when it does not fit with the adequate semantic properties of the subject. For estar predicates (La silla está en la cocina vs. *La fiesta está en la cocina), the findings showed a central-frontal negativity between 500-700 ms. Grammaticality judgment data for the L2 speakers of Spanish showed that beginners were significantly less accurate than

  5. Temporal and Location Based RFID Event Data Management and Processing

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location-oriented data model for modeling and querying RFID data, and a declarative event- and rule-based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
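
    As a rough illustration of the kind of temporal, location-oriented records and declarative rules the abstract refers to, the sketch below defines a toy observation record and a simple dwell rule. The field names and the rule itself are invented for this example, not taken from the paper.

        from dataclasses import dataclass
        from datetime import datetime, timedelta

        @dataclass
        class RfidObservation:
            tag_id: str          # unique object ID
            reader_id: str       # stands in for a location
            timestamp: datetime

        def detect_dwell(observations, reader_id, min_dwell):
            # Toy "complex event" rule: an object dwells at one location if
            # consecutive reads at the same reader span at least min_dwell.
            by_tag = {}
            events = []
            for obs in sorted(observations, key=lambda o: o.timestamp):
                if obs.reader_id != reader_id:
                    by_tag.pop(obs.tag_id, None)
                    continue
                first = by_tag.setdefault(obs.tag_id, obs.timestamp)
                if obs.timestamp - first >= min_dwell:
                    events.append((obs.tag_id, first, obs.timestamp))
                    by_tag.pop(obs.tag_id)
            return events

        reads = [RfidObservation("item-1", "dock-A", datetime(2024, 1, 1, 8, 0, s))
                 for s in (0, 20, 45)]
        print(detect_dwell(reads, "dock-A", timedelta(seconds=30)))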

  6. The Mw=8.8 Maule earthquake aftershock sequence, event catalog and locations

    NASA Astrophysics Data System (ADS)

    Meltzer, A.; Benz, H.; Brown, L.; Russo, R. M.; Beck, S. L.; Roecker, S. W.

    2011-12-01

    The aftershock sequence of the Mw=8.8 Maule earthquake off the coast of Chile in February 2010 is one of the most well-recorded aftershock sequences from a great megathrust earthquake. Immediately following the Maule earthquake, teams of geophysicists from Chile, France, Germany, Great Britain and the United States coordinated resources to capture aftershocks and other seismic signals associated with this significant earthquake. In total, 91 broadband, 48 short-period, and 25 accelerometer stations were deployed above the rupture zone of the main shock from 33-38.5°S and from the coast to the Andean range front. In order to integrate these data into a unified catalog, the USGS National Earthquake Information Center developed procedures to use their real-time seismic monitoring system (Bulletin Hydra) to detect, associate, locate, and compute earthquake source parameters from these stations. As a first step in the process, the USGS has built a seismic catalog of all M3.5 or larger earthquakes for the time period of the main aftershock deployment from March 2010-October 2010. The catalog includes earthquake locations, magnitudes (Ml, Mb, Mb_BB, Ms, Ms_BB, Ms_VX, Mc), associated phase readings and regional moment tensor solutions for most of the M4 or larger events. Also included in the catalog are teleseismic phases and amplitude measures and body-wave MT and CMT solutions for the larger events, typically M5.5 and larger. Tuning of automated detection and association parameters should allow a complete catalog of events to approximately M2.5 or larger for that dataset of more than 164 stations. We characterize the aftershock sequence in terms of magnitude, frequency, and location over time. Using the catalog locations and travel times as a starting point we use double difference techniques to investigate relative locations and earthquake clustering. In addition, phase data from candidate ground truth events and modeling of surface waves can be used to calibrate the

  7. A Place for Every Event and Every Event in Its Place: Memory for Locations and Activities by 4-Year-Old Children

    ERIC Educational Resources Information Center

    Bauer, Patricia J.; Stewart, Rebekah; White, Elizabeth A.; Larkina, Marina

    2016-01-01

    Episodic memories are of specific events and experiences associated with particular times and places. Whereas memory for the temporal aspects of past events has been a focus of research attention, memory for the location in which events were experienced has been less fully investigated. The limited developmental research suggests that…

  8. Persistent Daily Aerosol Nucleation Events at Mountain-Top Location

    NASA Astrophysics Data System (ADS)

    Hallar, A. G.; Wiedinmyer, C.; Lowenthal, D. H.

    2009-12-01

    Atmospheric aerosols are of great consequence since they can impact climate through direct and indirect forcing, degrade air quality and visibility, and have detrimental effects on human health. Thus, an important phenomenon is atmospheric aerosol formation, the production of nanometer-size particles by nucleation and their growth to detectable sizes. Storm Peak Laboratory (3210 m AMSL), owned and operated by the Desert Research Institute (DRI), is located on the west summit of Mt. Werner in the Park Range near Steamboat Springs in northwestern Colorado. This site has been used in aerosol studies for more than 20 years. Daily nucleation events have been observed at Storm Peak Laboratory between 2002 and 2009 with a TSI Scanning Mobility Particle Sizer (SMPS) (model 3936) coupled with a TSI model 3022 condensation particle counter (CPC). This instrument was set to measure particles with diameters between 8 and 335 nm. These events were observed during all measurement periods in the spring, summer and winter months. Nucleation was consistently seen in the mid-afternoon each day. This study includes 422 days of data; on 320 of these days nucleation events were observed. Thus, the nucleation events occurred during 76% of the measurement days, including during cloud events, and appear to be associated with elevated levels of ultraviolet radiation. This work will compare and contrast days with and without nucleation events by investigating the radiation and meteorological conditions present. The results presented will provide further insight into the in situ production of aerosols via nucleation.

  9. A study of various methods for calculating locations of lightning events

    NASA Technical Reports Server (NTRS)

    Cannon, John R.

    1995-01-01

    This article reports on the results of numerical experiments on finding the location of lightning events using different numerical methods. The methods include linear least squares, nonlinear least squares, statistical estimation, cluster analysis, angular filters, and combinations of such techniques. The experiments involved investigations of methods for excluding fake solutions, which are solutions that appear to be reasonable but are in fact several kilometers distant from the actual location. Some of the conclusions derived from the study are that bad data produce fakes, that no fool-proof method of excluding fakes was found, and that a short-baseline interferometer under development at Kennedy Space Center to measure the direction cosines of an event shows promise as a filter for excluding fakes. The experiments generated a number of open questions, some of which are discussed at the end of the report.
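
    For context on the nonlinear least-squares approach mentioned above, the sketch below (a toy, not the study's code) recovers a source position and emission time from arrival times at several sensors using SciPy's least-squares solver. All sensor coordinates and times are fabricated.

        import numpy as np
        from scipy.optimize import least_squares

        C = 0.299792458   # propagation speed, km per microsecond (speed of light)

        # Hypothetical sensor positions (km) and a fabricated "true" event
        sensors = np.array([[0.0, 0.0], [12.0, 1.0], [3.0, 10.0], [-8.0, 6.0]])
        true_xyt = np.array([4.0, 5.0, 2.0])          # x, y (km), emission time (us)
        t_obs = true_xyt[2] + np.hypot(*(sensors - true_xyt[:2]).T) / C
        t_obs += np.random.default_rng(0).normal(0.0, 0.05, t_obs.size)  # timing noise

        def residuals(p):
            # Misfit between predicted and observed arrival times
            x, y, t0 = p
            t_pred = t0 + np.hypot(sensors[:, 0] - x, sensors[:, 1] - y) / C
            return t_pred - t_obs

        fit = least_squares(residuals, x0=[0.0, 0.0, 0.0])
        print("Estimated (x, y, t0):", fit.x)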

  10. Accurate Damage Location in Complex Composite Structures and Industrial Environments using Acoustic Emission

    NASA Astrophysics Data System (ADS)

    Eaton, M.; Pearson, M.; Lee, W.; Pullin, R.

    2015-07-01

    The ability to accurately locate damage in any given structure is a highly desirable attribute for an effective structural health monitoring system and could help to reduce operating costs and improve safety. This becomes a far greater challenge in complex geometries and materials, such as modern composite airframes. The poor translation of promising laboratory-based SHM demonstrators to industrial environments forms a barrier to commercial uptake of the technology. The acoustic emission (AE) technique is a passive NDT method that detects elastic stress waves released by the growth of damage. It offers very sensitive damage detection, using a sparse array of sensors to detect and globally locate damage within a structure. However, its application to complex structures commonly yields poor accuracy due to anisotropic wave propagation and the interruption of wave propagation by structural features such as holes and thickness changes. This work adopts an empirical mapping technique for AE location, known as Delta T Mapping, which uses experimental training data to account for such structural complexities. The technique is applied to a complex geometry composite aerospace structure undergoing certification testing. The component consists of a carbon fibre composite tube with varying wall thickness and multiple holes, which was loaded in bending. The damage location was validated using X-ray CT scanning, and the Delta T Mapping technique was shown to improve location accuracy when compared with commercial algorithms. The onset and progression of damage were monitored throughout the test and used to inform future design iterations.

  11. Accurate characterisation of hole size and location by projected fringe profilometry

    NASA Astrophysics Data System (ADS)

    Wu, Yuxiang; Dantanarayana, Harshana G.; Yue, Huimin; Huntley, Jonathan M.

    2018-06-01

    The ability to accurately estimate the location and geometry of holes is often required in the field of quality control and automated assembly. Projected fringe profilometry is a potentially attractive technique on account of being non-contacting, of lower cost, and orders of magnitude faster than the traditional coordinate measuring machine. However, we demonstrate in this paper that fringe projection is susceptible to significant (hundreds of µm) measurement artefacts in the neighbourhood of hole edges, which give rise to errors of a similar magnitude in the estimated hole geometry. A mechanism for the phenomenon is identified based on the finite size of the imaging system’s point spread function and the resulting bias produced near to sample discontinuities in geometry and reflectivity. A mathematical model is proposed, from which a post-processing compensation algorithm is developed to suppress such errors around the holes. The algorithm includes a robust and accurate sub-pixel edge detection method based on a Fourier descriptor of the hole contour. The proposed algorithm was found to reduce significantly the measurement artefacts near the hole edges. As a result, the errors in estimated hole radius were reduced by up to one order of magnitude, to a few tens of µm for hole radii in the range 2–15 mm, compared to those from the uncompensated measurements.

  12. Improvement of IDC/CTBTO Event Locations in Latin America and the Caribbean Using a Regional Seismic Travel Time Model

    NASA Astrophysics Data System (ADS)

    Given, J. W.; Guendel, F.

    2013-05-01

    The International Data Centre is a vital element of the Comprehensive Test Ban Treaty (CTBT) verification mechanism. The fundamental mission of the International Data Centre (IDC) is to collect, process, and analyze monitoring data and to present results as event bulletins to Member States. For the IDC, and in particular for waveform technologies, a key measure of the quality of its products is the accuracy by which every detected event is located. Accurate event location is crucial for purposes of an On-Site Inspection (OSI), which would confirm the conduct of a nuclear test. Thus it is important for IDC monitoring and data analysis to adopt new processing algorithms that improve the accuracy of event location. Among them, the development of new algorithms to compute regional seismic travel times through 3-dimensional models has greatly increased the IDC's location precision and reduced computational time, allowing forward and inverse modeling of large data sets. One of these algorithms has been the Regional Seismic Travel Time (RSTT) model of Myers et al. (2011). The RSTT model is nominally a global model; however, it currently covers only North America and Eurasia in sufficient detail. It is the intention of the CTBTO's Provisional Technical Secretariat and the IDC to extend the RSTT model to other regions of the earth, e.g. Latin America-Caribbean, Africa and Asia. This is particularly important for the IDC location procedure, as there are regions of the earth for which crustal models are not well constrained. For this purpose the IDC has launched an RSTT initiative. In May 2012, a technical meeting was held in Vienna under the auspices of the CTBTO. The purpose of this meeting was to invite National Data Centre experts as well as network operators from Africa, Europe, the Middle East, Asia, Australia, Latin and North America to discuss the context under which a project to extend the RSTT model would be implemented. A total of 41 participants from 32 Member States

  13. Lower Learning Difficulty and Fluoroscopy Reduction of Transforaminal Percutaneous Endoscopic Lumbar Discectomy with an Accurate Preoperative Location Method.

    PubMed

    Fan, Guoxin; Gu, Xin; Liu, Yifan; Wu, Xinbo; Zhang, Hailong; Gu, Guangfei; Guan, Xiaofei; He, Shisheng

    2016-01-01

    Transforaminal percutaneous endoscopic lumbar discectomy (tPELD) poses great challenges for junior surgeons. Beginners often require repeated attempts using fluoroscopy, causing more punctures, which may significantly undermine their confidence and increase the radiation exposure to medical staff and patients. Moreover, the impact of an accurate location on the learning curve of tPELD has not been defined. The study aimed to investigate the impact of an accurate preoperative location method on learning difficulty and fluoroscopy time of tPELD. Retrospective evaluation. Patients receiving tPELD by one surgeon with a novel accurate preoperative location method were regarded as Group A, and those receiving tPELD by another surgeon with a conventional fluoroscopy method were regarded as Group B. From January 2012 to August 2014, we retrospectively reviewed the first 80 tPELD cases conducted by 2 junior surgeons. The operation time, fluoroscopy times, preoperative location time, and puncture-channel time were thoroughly analyzed. The operation time of the first 20 patients was 99.75 ± 10.38 minutes in Group A and 115.7 ± 16.46 minutes in Group B, while the operation time of all 80 patients was 88.36 ± 11.56 minutes in Group A and 98.26 ± 14.90 minutes in Group B. Significant differences were detected in operation time between the 2 groups, both for the first 20 patients and the total 80 patients (P < 0.05). The fluoroscopy times were 26.78 ± 4.17 in Group A and 33.98 ± 2.69 in Group B (P < 0.001). The preoperative location time was 3.43 ± 0.61 minutes in Group A and 5.59 ± 1.46 minutes in Group B (P < 0.001). The puncture-channel time was 27.20 ± 4.49 minutes in Group A and 34.64 ± 8.35 minutes in Group B (P < 0.001). There was a moderate correlation between preoperative location time and puncture-channel time (r = 0.408, P < 0.001), and a moderate correlation between preoperative location time and fluoroscopy times (r = 0.441, P < 0.001). Mild correlations were

  14. Locating seismicity on the Arctic plate boundary using multiple-event techniques and empirical signal processing

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Harris, D. B.; Dahl-Jensen, T.; Kværna, T.; Larsen, T. B.; Paulsen, B.; Voss, P. H.

    2017-12-01

    The oceanic boundary separating the Eurasian and North American plates between 70° and 84° north hosts large earthquakes which are well recorded teleseismically, and many more seismic events at far lower magnitudes that are well recorded only at regional distances. Existing seismic bulletins have considerable spread and bias resulting from limited station coverage and deficiencies in the velocity models applied. This is particularly acute for the lower magnitude events which may only be constrained by a small number of Pn and Sn arrivals. Over the past two decades there has been a significant improvement in the seismic network in the Arctic: a difficult region to instrument due to the harsh climate, a sparsity of accessible sites (particularly at significant distances from the sea), and the expense and difficult logistics of deploying and maintaining stations. New deployments and upgrades to stations on Greenland, Svalbard, Jan Mayen, Hopen, and Bjørnøya have resulted in a sparse but stable regional seismic network which results in events down to magnitudes below 3 generating high-quality Pn and Sn signals on multiple stations. A catalogue of several hundred events in the region since 1998 has been generated using many new phase readings on stations on both sides of the spreading ridge in addition to teleseismic P phases. A Bayesian multiple event relocation has resulted in a significant reduction in the spread of hypocentre estimates for both large and small events. Whereas single event location algorithms minimize vectors of time residuals on an event-by-event basis, the Bayesloc program finds a joint probability distribution of origins, hypocentres, and corrections to traveltime predictions for large numbers of events. The solutions obtained favour those event hypotheses resulting in time residuals which are most consistent over a given source region. The relocations have been performed with different 1-D velocity models applicable to the Arctic region and

  15. Location of the Green Canyon (Offshore Southern Louisiana) Seismic Event of February 10, 2006

    USGS Publications Warehouse

    Dewey, James W.; Dellinger, Joseph A.

    2008-01-01

    We calculated an epicenter for the Offshore Southern Louisiana seismic event of February 10, 2006 (the 'Green Canyon event') that was adopted as the preferred epicenter for the event by the USGS/NEIC. The event is held at a focal depth of 5 km; the focal depth could not be reliably calculated but was most likely between 1 km and 15 km beneath sea level. The epicenter was calculated with a radially symmetric global Earth model similar to that routinely used at the USGS/NEIC for all earthquakes worldwide. The location was calculated using P-waves recorded by seismographic stations from which the USGS/NEIC routinely obtains seismological data, plus data from two seismic exploration arrays, the Atlantis ocean-bottom node array, operated by BP in partnership with BHP Billiton Limited, and the CGG Green Canyon phase VIII multi-client towed-streamer survey. The preferred epicenter is approximately 26 km north of an epicenter earlier published by the USGS/NEIC, which was obtained without benefit of the seismic exploration arrays. We estimate that the preferred epicenter is accurate to within 15 km. We selected the preferred epicenter from a suite of trial calculations that attempted to fit arrival times of seismic energy associated with the Green Canyon event and that explored the effect of errors in the velocity model used to calculate the preferred epicenter. The various trials were helpful in confirming the approximate correctness of the preferred epicenter and in assessing the accuracy of the preferred epicenter, but none of the trial calculations, including that of the preferred epicenter, was able to reconcile arrival-time observations and assumed velocity model as well as is typical for the vast majority of earthquakes in and near the continental United States. We believe that remaining misfits between the preferred solution and the observations reflect errors in interpreted arrival times of emergent seismic phases that are due partly to a temporally extended source

  16. Microseismic event location using global optimization algorithms: An integrated and automated workflow

    NASA Astrophysics Data System (ADS)

    Lagos, Soledad R.; Velis, Danilo R.

    2018-02-01

    We perform the location of microseismic events generated in hydraulic fracturing monitoring scenarios using two global optimization techniques: Very Fast Simulated Annealing (VFSA) and Particle Swarm Optimization (PSO), and compare them against the classical grid search (GS). To this end, we present an integrated and optimized workflow that concatenates into an automated bash script the different steps that lead from raw 3C data to the location of microseismic events. First, we carry out the automatic detection, denoising and identification of the P- and S-waves. Secondly, we estimate their corresponding backazimuths using polarization information, and propose a simple energy-based criterion to automatically decide which is the most reliable estimate. Finally, after taking proper care of the size of the search space using the backazimuth information, we perform the location using the aforementioned algorithms for usual 2D and 3D hydraulic fracturing scenarios. We assess the impact of restricting the search space and show the advantages of using either VFSA or PSO over GS to attain significant speed-ups.
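
    For context on the grid search (GS) baseline the authors compare against, a minimal sketch is given below: it scans a 3-D grid of trial hypocentres, predicts P arrival times in a uniform velocity model, and keeps the node with the smallest misfit. The geometry, velocity, and picks are invented placeholders; VFSA and PSO would replace the exhaustive loop with guided sampling.

        import numpy as np

        vp = 3.5                                   # assumed P velocity, km/s
        receivers = np.array([[0.0, 0.0, 0.0],     # hypothetical monitoring array (km)
                              [0.5, 0.0, 0.0],
                              [0.0, 0.5, 0.0],
                              [0.5, 0.5, 0.0]])
        true_src = np.array([0.2, 0.3, 1.8])
        t0_true = 0.05
        t_picks = t0_true + np.linalg.norm(receivers - true_src, axis=1) / vp

        # Exhaustive grid search over trial hypocentres; the origin time is
        # removed analytically by demeaning (the misfit depends only on
        # relative times).
        xs = np.arange(0.0, 0.61, 0.02)
        ys = np.arange(0.0, 0.61, 0.02)
        zs = np.arange(1.0, 2.51, 0.02)
        best, best_misfit = None, np.inf
        for x in xs:
            for y in ys:
                for z in zs:
                    tt = np.linalg.norm(receivers - np.array([x, y, z]), axis=1) / vp
                    resid = t_picks - tt
                    resid -= resid.mean()           # best-fitting origin time removed
                    misfit = np.sum(resid ** 2)
                    if misfit < best_misfit:
                        best, best_misfit = (x, y, z), misfit

        print("Best grid node:", best, "misfit:", best_misfit)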

  17. A new protocol to accurately determine microtubule lattice seam location

    DOE PAGES

    Zhang, Rui; Nogales, Eva

    2015-09-28

    Microtubules (MTs) are cylindrical polymers of αβ-tubulin that display pseudo-helical symmetry due to the presence of a lattice seam of heterologous lateral contacts. The structural similarity between α- and β-tubulin makes it difficult to computationally distinguish them in the noisy cryo-EM images, unless a marker protein for the tubulin dimer, such as kinesin motor domain, is present. We have developed a new data processing protocol that can accurately determine αβ-tubulin register and seam location for MT segments. Our strategy can handle difficult situations, where the marker protein is relatively small or the decoration of marker protein is sparse. Using this new seam-search protocol, combined with movie processing for data from a direct electron detection camera, we were able to determine the cryo-EM structures of MT at 3.5 Å resolution in different functional states. The successful distinction of α- and β-tubulin allowed us to visualize the nucleotide state at the E-site and the configuration of lateral contacts at the seam.

  18. A method for detecting and locating geophysical events using groups of arrays

    NASA Astrophysics Data System (ADS)

    de Groot-Hedlin, Catherine D.; Hedlin, Michael A. H.

    2015-11-01

    We have developed a novel method to detect and locate geophysical events that makes use of any sufficiently dense sensor network. This method is demonstrated using acoustic sensor data collected in 2013 at the USArray Transportable Array (TA). The algorithm applies Delaunay triangulation to divide the sensor network into a mesh of three-element arrays, called triads. Because infrasound waveforms are incoherent between the sensors within each triad, the data are transformed into envelopes, which are cross-correlated to find signals that satisfy a consistency criterion. The propagation azimuth, phase velocity and signal arrival time are computed for each signal. Triads with signals that are consistent with a single source are bundled as an event group. The ensemble of arrival times and azimuths of detected signals within each group are used to locate a common source in space and time. A total of 513 infrasonic stations that were active for part or all of 2013 were divided into over 2000 triads. Low (0.5-2 Hz) and high (2-8 Hz) catalogues of infrasonic events were created for the eastern USA. The low-frequency catalogue includes over 900 events and reveals several highly active source areas on land that correspond with coal mining regions. The high-frequency catalogue includes over 2000 events, with most occurring offshore. Although their cause is not certain, most events are clearly anthropogenic as almost all occur during regular working hours each week. The regions to which the TA is most sensitive vary seasonally, with the direction of reception dependent on the direction of zonal winds. The catalogue has also revealed large acoustic events that may provide useful insight into the nature of long-range infrasound propagation in the atmosphere.
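
    A minimal sketch of the triangulation step described above, using SciPy with made-up station coordinates, could read:

        import numpy as np
        from scipy.spatial import Delaunay

        # Hypothetical station coordinates (longitude, latitude in degrees)
        stations = np.array([[-105.0, 40.0], [-104.2, 40.6], [-103.9, 39.8],
                             [-105.5, 39.5], [-104.8, 41.2], [-103.2, 40.9]])

        tri = Delaunay(stations)

        # Each simplex is a three-element sub-array ("triad") of the network;
        # envelope cross-correlation and azimuth/phase-velocity estimation
        # would then be carried out within each triad independently.
        for triad in tri.simplices:
            print("triad stations:", triad, "coordinates:", stations[triad])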

  19. 76 FR 31843 - Safety Zone; Temporary Change to Enforcement Location of Recurring Fireworks Display Event...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ...-AA00 Safety Zone; Temporary Change to Enforcement Location of Recurring Fireworks Display Event... Guard is temporarily changing the enforcement location of a safety zone for one specific recurring... originated from a barge but will this year originate from a location on land. The safety zone is necessary to...

  20. Microseismic Event Location Improvement Using Adaptive Filtering for Noise Attenuation

    NASA Astrophysics Data System (ADS)

    de Santana, F. L., Sr.; do Nascimento, A. F.; Leandro, W. P. D. N., Sr.; de Carvalho, B. M., Sr.

    2017-12-01

    In this work we show how adaptive filtering noise suppression improves the effectiveness of the Source Scanning Algorithm (SSA; Kao & Shan, 2004) in microseismic event location in the context of fracking operations. The SSA discretizes the time and region of interest in a 4D vector and, for each grid point and origin time, a brightness value (seismogram stacking) is calculated. For a given set of velocity model parameters, when the origin time and hypocenter of the seismic event are correct, a maximum value for coherence (or brightness) is achieved. The result is displayed on brightness maps for each origin time. Location methods such as SSA are most effective when the noise present in the seismograms is incoherent; however, the method may present false positives when the noise present in the data is coherent, as occurs in fracking operations. To remove from the seismograms the coherent noise from the pumps and engines used in the operation, we use an adaptive filter. As the noise reference, we use the seismogram recorded at the station closest to the machinery employed. Our methodology was tested on semi-synthetic data. The microseismic events were represented by Ricker pulses (with a central frequency of 30 Hz) in synthetic seismograms, and to simulate real seismograms in a surface microseismic monitoring situation, we added real noise recorded in a fracking operation to these synthetic seismograms. The results show that after the filtering of the seismograms, we were able to improve our detection threshold and to achieve a better resolution on the brightness maps of the located events.
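
    A minimal sketch of adaptive noise cancellation in the spirit described above, using a normalized LMS filter with the machinery-noise reference as input, is shown below. All signals are synthetic; this is an illustration of the general technique, not the authors' filter.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 5000
        t = np.arange(n) / 500.0                       # 500 Hz sampling

        # Synthetic "machinery" reference and its (delayed) leakage into the trace
        reference = np.sin(2 * np.pi * 25 * t) + 0.3 * rng.standard_normal(n)
        leakage = 0.8 * np.roll(reference, 3)          # coherent noise in the data
        signal = np.exp(-((t - 5.0) ** 2) / 0.001) * np.sin(2 * np.pi * 30 * t)
        trace = signal + leakage

        def nlms_cancel(primary, reference, order=16, mu=0.5, eps=1e-6):
            # Normalized LMS: adapt FIR weights so the filtered reference
            # predicts the coherent noise, then subtract that prediction.
            w = np.zeros(order)
            out = np.copy(primary)
            for i in range(order, len(primary)):
                x = reference[i - order:i][::-1]
                y = np.dot(w, x)                        # predicted coherent noise
                e = primary[i] - y                      # cleaned sample
                w += mu * e * x / (eps + np.dot(x, x))  # weight update
                out[i] = e
            return out

        cleaned = nlms_cancel(trace, reference)
        print("noise power before/after:",
              np.var(trace - signal), np.var(cleaned - signal))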

  1. The use of propagation path corrections to improve regional seismic event location in western China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steck, L.K.; Cogbill, A.H.; Velasco, A.A.

    1999-03-01

    In an effort to improve the ability to locate seismic events in western China using only regional data, the authors have developed empirical propagation path corrections (PPCs) and applied such corrections using both traditional location routines and a nonlinear grid search method. Thus far, the authors have concentrated on corrections to observed P arrival times for shallow events using travel-time observations available from the USGS EDRs, the ISC catalogs, their own travel-time picks from regional data, and data from other catalogs. They relocate events with the algorithm of Bratt and Bache (1988) from a region encompassing China. For individual stations having sufficient data, they produce a map of the regional travel-time residuals from all well-located teleseismic events. From these maps, interpolated PPC surfaces have been constructed using both surface fitting under tension and modified Bayesian kriging. The latter method offers the advantage of providing well-behaved interpolants, but requires that the authors have adequate error estimates associated with the travel-time residuals. To improve error estimates for kriging and event location, they separate measurement error from modeling error. The modeling error is defined as the travel-time variance of a particular model as a function of distance, while the measurement error is defined as the picking error associated with each phase. They estimate measurement errors for arrivals from the EDRs based on roundoff or truncation, and use signal-to-noise for the travel-time picks from the waveform data set.

  2. Development of double-pair double difference location algorithm and its application to the regular earthquakes and non-volcanic tremors

    NASA Astrophysics Data System (ADS)

    Guo, H.; Zhang, H.

    2016-12-01

    Relocating earthquakes with high precision is a central task for monitoring seismicity and studying the structure of the Earth's interior. The most popular location method is the event-pair double-difference (DD) relative location method, which uses catalog and/or more accurate waveform cross-correlation (WCC) differential times from event pairs with small inter-event separations at common stations to reduce the effect of velocity uncertainties outside the source region. Similarly, Zhang et al. [2010] developed a station-pair DD location method, which uses differential times from common events to pairs of stations to reduce the effect of velocity uncertainties near the source region, to relocate the non-volcanic tremors (NVT) beneath the San Andreas Fault (SAF). To utilize the advantages of both DD location methods, we have proposed and developed a new double-pair DD location method that uses differential times from pairs of events to pairs of stations. The new method can remove the event origin time and station correction terms from the inversion system and cancel out the effects of velocity uncertainties near and outside the source region simultaneously. We tested and applied the new method to regular earthquakes in northern California to validate its performance. In comparison, among the three DD location methods, the new double-pair DD method can determine more accurate relative locations and the station-pair DD method can better improve the absolute locations. Thus, we further proposed a new location strategy combining station-pair and double-pair differential times to determine accurate absolute and relative locations at the same time. For NVTs, it is difficult to pick the first arrivals and derive the WCC event-pair differential times; thus the general practice is to measure station-pair envelope WCC differential times. However, station-pair tremor locations are scattered due to the low-precision relative locations. The ability that double-pair data
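
    The three flavours of differential times discussed above can be written compactly. The sketch below is schematic (fabricated arrival times, not the authors' code): it forms event-pair, station-pair, and double-pair residuals from observed and model-predicted arrival times stored in simple arrays.

        import numpy as np

        # t_obs[i, k] and t_pred[i, k]: observed and model-predicted arrival times
        # of event i at station k (fabricated numbers purely for illustration).
        rng = np.random.default_rng(2)
        t_pred = rng.uniform(10.0, 20.0, size=(3, 4))
        t_obs = t_pred + rng.normal(0.0, 0.1, size=(3, 4))

        def event_pair_dd(i, j, k):
            # Classic DD: two events differenced at a common station k
            return (t_obs[i, k] - t_obs[j, k]) - (t_pred[i, k] - t_pred[j, k])

        def station_pair_dd(i, k, l):
            # Station-pair DD: one event differenced across two stations
            return (t_obs[i, k] - t_obs[i, l]) - (t_pred[i, k] - t_pred[i, l])

        def double_pair_dd(i, j, k, l):
            # Double-pair DD: a pair of events at a pair of stations; origin-time
            # and station terms cancel out of this combination
            return event_pair_dd(i, j, k) - event_pair_dd(i, j, l)

        print(event_pair_dd(0, 1, 2), station_pair_dd(0, 2, 3), double_pair_dd(0, 1, 2, 3))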

  3. Accurate seismic phase identification and arrival time picking of glacial icequakes

    NASA Astrophysics Data System (ADS)

    Jones, G. A.; Doyle, S. H.; Dow, C.; Kulessa, B.; Hubbard, A.

    2010-12-01

    A catastrophic lake drainage event was monitored continuously using an array of six 4.5 Hz three-component geophones in the Russell Glacier catchment, Western Greenland. Many thousands of events and arrival time phases (e.g., P- or S-wave) were recorded, often with events occurring simultaneously but at different locations. In addition, different styles of seismic events were identified, from 'classical' tectonic earthquakes to tremors usually observed in volcanic regions. The presence of such a diverse and large dataset provides insight into the complex system of lake drainage. One of the most fundamental steps in seismology is the accurate identification of a seismic event and its associated arrival times. However, the collection of such a large and complex dataset makes the manual identification of a seismic event and picking of the arrival time phases time consuming, with variable results. To overcome the issues of consistency and manpower, a number of different methods have been developed, including short-term/long-term averages, spectrograms, wavelets, polarisation analyses, higher-order statistics and auto-regressive techniques. Here we propose an automated procedure which establishes the phase type and accurately determines the arrival times. The procedure combines a number of different automated methods to achieve this, and is applied to the recently acquired lake drainage data. Accurate identification of events and their arrival time phases are the first steps in gaining a greater understanding of the extent of the deformation and the mechanism of such drainage events. A good knowledge of the propagation pathway of lake drainage meltwater through a glacier will have significant consequences for interpretation of glacial and ice sheet dynamics.
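
    One common ingredient of such automated procedures is a short-term/long-term average (STA/LTA) trigger. The sketch below runs one on a synthetic trace; it is a generic illustration of the technique, not the authors' combined detector, and the sampling rate and thresholds are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        fs = 250                                       # assumed sampling rate (Hz)
        trace = rng.normal(0.0, 1.0, 60 * fs)
        trace[30 * fs:30 * fs + fs] += 6.0 * rng.normal(0.0, 1.0, fs)  # synthetic icequake

        def sta_lta(x, n_sta, n_lta):
            # Ratio of short-term to long-term moving averages of the squared
            # trace (centred windows here; trailing windows are also common).
            power = x ** 2
            sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="same")
            lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="same")
            return sta / (lta + 1e-12)

        ratio = sta_lta(trace, n_sta=int(0.5 * fs), n_lta=int(10 * fs))
        triggers = np.flatnonzero(ratio > 4.0)
        if triggers.size:
            print("first trigger at ~%.2f s" % (triggers[0] / fs))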

  4. Challenges in Locating Microseismic Events Using Distributed Acoustic Sensors

    NASA Astrophysics Data System (ADS)

    Williams, A.; Kendall, J. M.; Clarke, A.; Verdon, J.

    2017-12-01

    Microseismic monitoring is an important method of assessing the behaviour of subsurface fluid processes, and data are commonly acquired using geophone arrays in boreholes or on the surface. A new alternative technology has been developed recently - fibre-optic Distributed Acoustic Sensing (DAS) - using strain along a fibre-optic cable as a measure of seismic signals. DAS can offer high-density arrays and full-well coverage from the surface to the bottom, with less overall disruption to operations, so there are many exciting possible applications in monitoring petroleum and other subsurface industries. However, there are challenges in locating microseismic events recorded using current DAS systems, which only record seismic data in one component and consequently omit the azimuthal information provided by a three-component geophone. To test the impact of these limitations, we used finite difference modelling to generate one-component synthetic DAS datasets and investigated the impact of picking solely P-wave or both P- and S-wave arrivals and the impact of different array geometries. These are then compared to equivalent 3-component synthetic geophone datasets. In simple velocity models, P-wave arrivals along linear arrays cannot be used to constrain locations using DAS without further a priori information. We then tested the impact of straight cables vs. L-shaped arrays and found improved locations when the cable is deviated, especially when both P- and S-wave picks are included. There is a trade-off between the added coverage of DAS cables and sparser 3C geophone arrays, where particle motion helps constrain locations, which cannot be assessed without forward modelling.

  5. Epicenter Location of Regional Seismic Events Using Love Wave and Rayleigh Wave Ambient Seismic Noise Green's Functions

    NASA Astrophysics Data System (ADS)

    Levshin, A. L.; Barmin, M. P.; Moschetti, M. P.; Mendoza, C.; Ritzwoller, M. H.

    2011-12-01

    We describe a novel method to locate regional seismic events based on exploiting Empirical Green's Functions (EGF) that are produced from ambient seismic noise. Elastic EGFs between pairs of seismic stations are determined by cross-correlating long time-series of ambient noise recorded at the two stations. The EGFs principally contain Rayleigh waves on the vertical-vertical cross-correlations and Love waves on the transverse-transverse cross-correlations. Earlier work (Barmin et al., "Epicentral location based on Rayleigh wave empirical Green's functions from ambient seismic noise", Geophys. J. Int., 2011) showed that group time delays observed on Rayleigh wave EGFs can be exploited to locate moderate-sized earthquakes to within about 1 km using USArray Transportable Array (TA) stations. The principal advantage of the method is that the ambient noise EGFs are affected by lateral variations in structure similarly to the earthquake signals, so the location is largely unbiased by 3-D structure. However, locations based on Rayleigh waves alone may be biased by more than 1 km if the earthquake depth is unknown but lies between 2 km and 7 km. This presentation is motivated by the fact that group time delays for Love waves are much less affected by earthquake depth than those for Rayleigh waves; thus exploitation of Love wave EGFs may reduce location bias caused by uncertainty in event depth. The advantage of Love waves for locating seismic events, however, is mitigated by the fact that Love wave EGFs have a lower SNR than Rayleigh wave EGFs. Here, we test the use of Love and Rayleigh wave EGFs between 5- and 15-sec period to locate seismic events based on the USArray TA in the western US. We focus on locating aftershocks of the 2008 M 6.0 Wells earthquake, mining blasts in Wyoming and Montana, and small earthquakes near Norman, OK and Dallas, TX, some of which may be triggered by hydrofracking or injection wells.

  6. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE PAGES

    Butler, Troy; Wildey, Timothy

    2018-01-01

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods at a lower computational cost.

  8. Location of Microearthquakes in Various Noisy Environments Using Envelope Stacking

    NASA Astrophysics Data System (ADS)

    Oye, V.; Gharti, H.

    2009-12-01

    Monitoring of microearthquakes is routinely conducted in various environments such as hydrocarbon and geothermal reservoirs, mines, dams, seismically active faults, volcanoes, nuclear power plants and CO2 storage sites. In many of these cases the data handled are sensitive and the interpretation of the data may be vital. In some cases, such as during mining or hydraulic fracturing activities, the number of microearthquakes is very large, with tens to thousands of events per hour. In others, almost no events occur during a week and, furthermore, it might not be anticipated that many events occur at all. However, the general setup of seismic networks, including surface and downhole stations, is usually optimized to record as many microearthquakes as possible, thereby trying to lower the detection threshold of the network. This process is obviously limited to some extent. Most microearthquake location techniques take advantage of a combination of P- and S-wave onset times that often can be picked reliably in an automatic mode. Moreover, when using seismic wave onset times, sometimes in combination with seismic wave polarization, these methods are more accurate compared to migration-based location routines. However, many events cannot be located because their magnitude is too small, i.e. the P- and/or S-wave onset times cannot be picked accurately on a sufficient number of receivers. Nevertheless, these small events are important for the interpretation of the processes that are monitored, and even an inferior estimate of event locations and strengths is valuable information. Moreover, the smaller the event the more often such events statistically occur and the more important such additional information becomes. In this study we try to enhance the performance of any microseismic network, providing additional estimates of event locations below the actual detection threshold. We present a migration-based event location method, where we project the recorded seismograms onto the ray

  9. Improved phase arrival estimate and location for local earthquakes in South Korea

    NASA Astrophysics Data System (ADS)

    Morton, E. A.; Rowe, C. A.; Begnaud, M. L.

    2012-12-01

    The Korean Institute of Geoscience and Mineral Resources (KIGAM) and the Korean Meteorological Agency (KMA) regularly report local (distance < ~1200 km) seismicity recorded with their networks; we obtain preliminary event location estimates as well as waveform data, but no phase arrivals are reported, so the data are not immediately useful for earthquake location. Our goal is to identify seismic events that are sufficiently well-located to provide accurate seismic travel-time information for events within the KIGAM and KMA networks, and also recorded by some regional stations. Toward that end, we are using a combination of manual phase identification and arrival-time picking, with waveform cross-correlation, to cluster events that have occurred in close proximity to one another, which allows for improved phase identification by comparing the highly correlating waveforms. We cross-correlate the known events with one another on 5 seismic stations and cluster events that correlate above a correlation coefficient threshold of 0.7, which reveals few clusters containing few events each. The small number of repeating events suggests that the online catalogs have had mining and quarry blasts removed before publication, as these can contribute significantly to repeating seismic sources in relatively aseismic regions such as South Korea. The dispersed source locations in our catalog, however, are ideal for seismic velocity modeling by providing superior sampling through the dense seismic station arrangement, which produces favorable event-to-station ray path coverage. Following careful manual phase picking on 104 events chosen to provide adequate ray coverage, we re-locate the events to obtain improved source coordinates. The re-located events are used with Thurber's Simul2000 pseudo-bending local tomography code to estimate the crustal structure on the Korean Peninsula, which is an important contribution to ongoing calibration for events of interest in the region.
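
    A minimal sketch of the correlation-and-clustering step described above, using synthetic waveforms at a single station and a simple union-find in place of a full multi-station scheme, could read:

        import numpy as np

        rng = np.random.default_rng(4)

        def synth_event(shift, noise=0.05):
            # Same wavelet with a small time shift plus noise -> "repeating" events
            t = np.arange(400)
            w = np.exp(-((t - 200 - shift) ** 2) / 200.0) * np.sin(0.3 * (t - shift))
            return w + noise * rng.standard_normal(t.size)

        waveforms = [synth_event(0), synth_event(3), synth_event(1),
                     rng.standard_normal(400)]          # last one is unrelated

        def max_norm_cc(a, b):
            # Maximum of the normalized cross-correlation over all lags
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            cc = np.correlate(a, b, mode="full") / a.size
            return cc.max()

        # Single-link clustering above a 0.7 threshold via union-find
        parent = list(range(len(waveforms)))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i in range(len(waveforms)):
            for j in range(i + 1, len(waveforms)):
                if max_norm_cc(waveforms[i], waveforms[j]) > 0.7:
                    parent[find(j)] = find(i)

        clusters = {}
        for i in range(len(waveforms)):
            clusters.setdefault(find(i), []).append(i)
        print(clusters)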

  10. Lightning Prediction using Electric Field Measurements Associated with Convective Events at a Tropical Location

    NASA Astrophysics Data System (ADS)

    Jana, S.; Chakraborty, R.; Maitra, A.

    2017-12-01

    Nowcasting of lightning activities during intense convective events using a single electric field monitor (EFM) has been carried out at a tropical location, Kolkata (22.65°N, 88.45°E). Before and at the onset of heavy lightning, certain changes of electric field (EF) can be related to high liquid water content (LWC) and low cloud base height (CBH). The present study discusses the utility of EF observation to show a few aspects of convective events. Large convective clouds, indicated by high LWC and low CBH, can be detected from EF variations, which could be a precursor of upcoming convective events. Suitable values of EF gradient can be used as an indicator of impending lightning events. An EF variation of 0.195 kV/m/min can predict lightning within a 17.5 km radius with a probability of detection (POD) of 91% and a false alarm rate (FAR) of 8%, with a lead time of 45 min. The total number of predicted lightning strikes is nearly 9 times less than that measured by the lightning detector. This prediction technique can, therefore, give an estimate of cloud-to-ground (CG) and intra-cloud (IC) lightning occurrences within the surrounding area. This prediction technique involving POD, FAR and lead time information shows a better prediction capability compared to the techniques reported earlier. Thus, an EFM can be effectively used for prediction of lightning events at a tropical location.

  11. Photographic Analysis Technique for Assessing External Tank Foam Loss Events

    NASA Technical Reports Server (NTRS)

    Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.
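
    The record does not detail the image-processing steps, so the following is only a loose illustration of detecting a permanent surface change: each frame is differenced against a pre-event reference, and a pixel is flagged when the change persists. All threshold values and the synthetic clip are assumptions.

        import numpy as np

        def detect_permanent_changes(frames, reference, diff_thresh=30, persist_frames=10):
            """frames: (n_frames, h, w) grayscale video; reference: (h, w) pre-event frame.
            A pixel is flagged when it differs from the reference by more than diff_thresh
            for at least persist_frames consecutive frames (i.e., the change is permanent)."""
            changed = np.abs(frames.astype(int) - reference.astype(int)) > diff_thresh
            run = np.zeros(reference.shape, dtype=int)
            flagged = np.zeros(reference.shape, dtype=bool)
            for mask in changed:
                run = np.where(mask, run + 1, 0)          # length of current run of change
                flagged |= run >= persist_frames          # latch pixels with long runs
            return flagged

        # Tiny synthetic example: a 5x5 patch "loses foam" halfway through the clip.
        rng = np.random.default_rng(1)
        ref = np.full((64, 64), 120, dtype=np.uint8)
        clip = np.repeat(ref[None], 40, axis=0).astype(np.uint8)
        clip = clip + rng.integers(-5, 6, clip.shape).astype(np.int16)  # sensor noise
        clip[20:, 10:15, 10:15] = 40                                    # permanent dark patch
        print(detect_permanent_changes(clip, ref).sum(), "pixels flagged")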

  12. A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

    DOE PAGES

    Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; ...

    2015-10-01

    Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.
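
    A simplified sketch of the grid-search idea described above (summing a station-conditional fitness over grid nodes and accepting the best node as a hypothetical event) is given below. It scores nodes by how well observed P arrival-time differences agree with predicted travel-time differences under a constant-velocity assumption; the velocity, grid, and Gaussian width are illustrative choices, not the published algorithm's values.

        import numpy as np

        V_P = 6.0  # assumed constant P velocity, km/s (illustrative only)

        def travel_time(node, station):
            return np.linalg.norm(node - station) / V_P

        def grid_fitness(nodes, stations, arrivals, sigma=1.0):
            """For each grid node, sum a Gaussian fitness over stations comparing the
            observed arrival-time differences with the predicted travel-time differences
            (the unknown origin time is removed by differencing against the first station)."""
            fitness = np.zeros(len(nodes))
            for k, node in enumerate(nodes):
                pred = np.array([travel_time(node, s) for s in stations])
                resid = (arrivals - arrivals[0]) - (pred - pred[0])
                fitness[k] = np.exp(-0.5 * (resid / sigma) ** 2).sum()
            return fitness

        # Toy example: 4 stations, a true source, and a coarse 2-D grid of candidate nodes.
        stations = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
        true_src = np.array([20.0, 30.0])
        arrivals = np.array([travel_time(true_src, s) for s in stations]) + 10.0  # origin at t=10 s
        gx, gy = np.meshgrid(np.arange(0, 51, 5.0), np.arange(0, 51, 5.0))
        nodes = np.column_stack([gx.ravel(), gy.ravel()])
        best = nodes[np.argmax(grid_fitness(nodes, stations, arrivals))]
        print("best node:", best)  # close to the true source at (20, 30)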

  13. Location-based technologies for supporting elderly pedestrian in "getting lost" events.

    PubMed

    Pulido Herrera, Edith

    2017-05-01

    Localization-based technologies promise to keep older adults with dementia safe and to support them and their caregivers during getting lost events. This paper mainly summarizes technological contributions to support the target group in these events. Moreover, important aspects of the getting lost phenomenon, such as its concept and ethical issues, are also briefly addressed. Papers were selected from scientific databases and gray literature. Since the topic is still in its infancy, other terms were used to find contributions associated with getting lost, e.g., wandering. Trends in applying localization systems were identified as personal locators, perimeter systems, and assistance systems. The first of these barely considered the older adult's opinion, while assistance systems may involve context awareness to improve the support for both the elderly and the caregiver. Since few studies report multidisciplinary work with a special focus on getting lost, there is no strong evidence of the real efficiency of localization systems or of guidelines for designing systems for the target group. Further research about getting lost is required to obtain insights for developing customizable systems. Moreover, considering the conditions of the older adult might increase the impact of developments that combine localization technologies and artificial intelligence techniques. Implications for Rehabilitation: Whilst there is no cure for dementia such as Alzheimer's, it is feasible to take advantage of technological developments to somewhat diminish its negative impact. For instance, location-based systems may provide information for early diagnosis of Alzheimer's disease by assessing navigational impairments in older adults. Assessing the latest supportive technologies and methodologies may provide insights to adopt strategies to properly manage getting lost events. More user-centered designs will provide appropriate assistance to older adults. Namely, customizable systems could assist older adults

  14. Multiple-Array Detection, Association and Location of Infrasound and Seismo-Acoustic Events - Utilization of Ground Truth Information

    DTIC Science & Technology

    2010-09-01

    Stephen J...and infrasound data from seismo-acoustic arrays and apply the methodology to regional networks for validation with ground truth information. In the...initial year of the project, automated techniques for detecting, associating and locating infrasound signals were developed. Recently, the location

  15. Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory

    NASA Astrophysics Data System (ADS)

    Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi

    2018-03-01

    With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are getting bigger and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the available information which can be used for event location. MCM does not require phase picking or phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM leads to improved source energy focusing. We have tested and compared MCM to other migration-based methods in noise-free and noisy synthetic data. The tests and analysis show that MCM is noise resistant and can achieve more accurate results compared with other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and is able to obtain reasonable location results with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized giving it large potential to be developed as a real-time location method for very large datasets.
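
    A bare-bones sketch of the coherency-migration idea follows (it is not the authors' MCM implementation): for each candidate source node, traces are aligned on predicted arrivals and the mean pairwise correlation of the aligned windows serves as the migrated value. The constant velocity, window length, and grid are assumptions. The loop over nodes is embarrassingly parallel, consistent with the parallelization potential noted in the abstract.

        import numpy as np

        DT, V = 0.01, 3.0     # sample interval (s) and assumed constant velocity (km/s)

        def coherency_at(node, stations, traces, t0_samples, win=100):
            """Average pairwise correlation of windows aligned on predicted arrivals."""
            segs = []
            for sta, tr in zip(stations, traces):
                onset = t0_samples + int(np.linalg.norm(node - sta) / V / DT)
                seg = tr[onset:onset + win]
                if len(seg) < win:
                    return 0.0
                segs.append((seg - seg.mean()) / (seg.std() + 1e-12))
            segs = np.array(segs)
            cc = segs @ segs.T / win                    # matrix of pairwise correlations
            n = len(segs)
            return (cc.sum() - n) / (n * (n - 1))       # mean of the off-diagonal entries

        # Synthetic test: one wavelet radiated from a known source to 5 stations.
        rng = np.random.default_rng(2)
        stations = rng.uniform(0, 40, size=(5, 2))
        true_src, true_t0 = np.array([18.0, 22.0]), 200          # t0 in samples
        wavelet = np.sin(2 * np.pi * 5 * np.arange(0, 0.4, DT)) * np.hanning(40)
        traces = []
        for sta in stations:
            tr = 0.1 * rng.standard_normal(3000)
            onset = true_t0 + int(np.linalg.norm(true_src - sta) / V / DT)
            tr[onset:onset + len(wavelet)] += wavelet
            traces.append(tr)

        # Migrate coherency over a coarse location grid at the (known) origin time.
        nodes = np.array([[x, y] for x in range(0, 41, 2) for y in range(0, 41, 2)], float)
        scores = [coherency_at(n, stations, traces, true_t0) for n in nodes]
        print("best node:", nodes[int(np.argmax(scores))])        # near (18, 22)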

  16. Sensitive and accurate identification of protein–DNA binding events in ChIP-chip assays using higher order derivative analysis

    PubMed Central

    Barrett, Christian L.; Cho, Byung-Kwan

    2011-01-01

    Immuno-precipitation of protein–DNA complexes followed by microarray hybridization is a powerful and cost-effective technology for discovering protein–DNA binding events at the genome scale. It is still an unresolved challenge to comprehensively, accurately and sensitively extract binding event information from the produced data. We have developed a novel strategy composed of an information-preserving signal-smoothing procedure, higher order derivative analysis and application of the principle of maximum entropy to address this challenge. Importantly, our method does not require any input parameters to be specified by the user. Using genome-scale binding data of two Escherichia coli global transcription regulators for which a relatively large number of experimentally supported sites are known, we show that ∼90% of known sites were resolved to within four probes, or ∼88 bp. Over half of the sites were resolved to within two probes, or ∼38 bp. Furthermore, we demonstrate that our strategy delivers significant quantitative and qualitative performance gains over available methods. Such accurate and sensitive binding site resolution has important consequences for accurately reconstructing transcriptional regulatory networks, for motif discovery, for furthering our understanding of local and non-local factors in protein–DNA interactions and for extending the usefulness horizon of the ChIP-chip platform. PMID:21051353
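
    The following is only a schematic of the general idea of locating enrichment peaks in a smoothed probe-level signal by derivative analysis; it is not the authors' information-preserving smoothing or maximum-entropy procedure, and the kernel width, amplitude cutoff, and synthetic data are assumptions.

        import numpy as np

        def smooth(signal, width=7):
            """Simple moving-average smoothing (stand-in for the paper's smoother)."""
            kernel = np.ones(width) / width
            return np.convolve(signal, kernel, mode="same")

        def derivative_peaks(signal, width=7, min_frac=0.3):
            """Candidate binding-site probes: zero crossings of the first derivative
            (positive to negative) with negative curvature and sufficient amplitude."""
            s = smooth(signal, width)
            d1 = np.gradient(s)
            d2 = np.gradient(d1)
            crossings = np.where((d1[:-1] > 0) & (d1[1:] <= 0))[0]
            keep = (d2[crossings] < 0) & (s[crossings] > min_frac * s.max())
            return crossings[keep]

        # Synthetic probe-level ChIP signal with two enriched regions plus noise.
        rng = np.random.default_rng(3)
        x = np.arange(300)
        signal = (2.0 * np.exp(-0.5 * ((x - 80) / 6) ** 2)
                  + 1.5 * np.exp(-0.5 * ((x - 210) / 8) ** 2)
                  + 0.2 * rng.standard_normal(300))
        print(derivative_peaks(signal))   # indices near probes 80 and 210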

  17. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    -velocity lithospheric slab. In application, JHD has the practical advantage that it does not require the specification of a theoretical velocity model for the slab. Considering earthquakes within a 260 km long by 60 km wide section of the Aleutian main thrust zone, our results suggest that the theoretical velocity structure of the slab is presently not sufficiently well known that accurate locations can be obtained independently of locally recorded data. Using a locally recorded earthquake as a calibration event, JHD gave excellent results over the entire section of the main thrust zone here studied, without showing a strong effect that might be attributed to spatially varying source-station anomalies. We also calibrated the ray-tracing method using locally recorded data and obtained results generally similar to those obtained by JHD. © 1982.

  18. Locating low-frequency earthquakes using amplitude signals from seismograph stations: Examples from events at Montserrat, West Indies and from synthetic data

    NASA Astrophysics Data System (ADS)

    Jolly, A.; Jousset, P.; Neuberg, J.

    2003-04-01

    We determine locations for low-frequency earthquakes occurring prior to a collapse on June 25th, 1997 using signal amplitudes from a 7-station local seismograph network at the Soufriere Hills volcano on Montserrat, West Indies. Locations are determined by averaging the signal amplitude over the event waveform and inverting these data using an assumed amplitude decay model comprising geometrical spreading and attenuation. The resulting locations are centered beneath the active dome from 500 to 2000 m below sea level assuming body-wave geometrical spreading and a quality factor of Q=22. Locations for the same events shift systematically shallower by about 500 m assuming surface-wave geometrical spreading. The locations are consistent with results obtained using arrival-time methods. The validity of the method is tested against synthetic low-frequency events constructed from a 2-D finite difference model including visco-elastic properties. Two example events are tested: one from a point source triggered in a low-velocity conduit extending from 100 to 1100 m below the surface, and the second triggered in a conduit located 1500-2500 m below the surface. The resulting seismograms have emergent onsets and extended codas and include the effect of conduit resonance. Employing geometrical spreading and attenuation from the finite-difference modelling, we obtain locations within the respective model conduits, validating our approach. The location depths are sensitive to the assumed geometrical spreading and Q model. We can distinguish between two sources separated by about 1000 meters only if we know the decay parameters.
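
    A toy version of the amplitude-based location idea is sketched below: a grid search over candidate sources under an assumed decay model combining geometrical spreading and attenuation. The body-wave spreading exponent and Q of 22 follow the abstract; the grid, frequency, velocity, and noise level are illustrative assumptions.

        import numpy as np

        F, V, Q, N_EXP = 2.0, 1.5, 22.0, 1.0   # frequency (Hz), velocity (km/s), quality factor, spreading exponent

        def log_decay(r):
            """log of r**-n * exp(-pi*f*r/(Q*v)): body-wave spreading plus attenuation."""
            return -N_EXP * np.log(r) - np.pi * F * r / (Q * V)

        def locate_by_amplitude(stations, log_amps, nodes):
            """Grid search: at each node the (log) source amplitude is the only free
            parameter and is removed by demeaning; return the node with smallest misfit."""
            best, best_misfit = None, np.inf
            for node in nodes:
                r = np.linalg.norm(stations - node, axis=1) + 1e-6
                resid = log_amps - log_decay(r)
                resid -= resid.mean()                  # removes the unknown source term
                misfit = np.sum(resid ** 2)
                if misfit < best_misfit:
                    best, best_misfit = node, misfit
            return best

        # Synthetic example: 7 surface stations around a shallow source, ~10% amplitude noise.
        rng = np.random.default_rng(4)
        stations = rng.uniform(-3, 3, size=(7, 3)) * np.array([1, 1, 0])   # z = 0 (surface)
        true_src = np.array([0.5, -0.8, 1.5])                              # 1.5 km deep
        r_true = np.linalg.norm(stations - true_src, axis=1)
        log_amps = log_decay(r_true) + np.log(1 + 0.1 * rng.standard_normal(7))
        gx, gy, gz = np.meshgrid(np.arange(-2, 2.1, 0.5), np.arange(-2, 2.1, 0.5), np.arange(0.5, 3.1, 0.5))
        nodes = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
        print("best node:", locate_by_amplitude(stations, log_amps, nodes))  # typically near (0.5, -0.8, 1.5)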

  19. Location of intense electromagnetic ion cyclotron (EMIC) wave events relative to the plasmapause: Van Allen Probes observations

    NASA Astrophysics Data System (ADS)

    Tetrick, S. S.; Engebretson, M. J.; Posch, J. L.; Olson, C. N.; Smith, C. W.; Denton, R. E.; Thaller, S. A.; Wygant, J. R.; Reeves, G. D.; MacDonald, E. A.; Fennell, J. F.

    2017-04-01

    We have studied the spatial location relative to the plasmapause (PP) of the most intense electromagnetic ion cyclotron (EMIC) waves observed on Van Allen Probes A and B during their first full precession in local time. Most of these waves occurred over an L range from -1 to +2 RE relative to the PP. Very few events occurred only within 0.1 RE of the PP, and events with a width in L of < 0.2 RE occurred both inside and outside the PP. Wave occurrence was always associated with high densities of ring current ions; plasma density gradients or enhancements were associated with some events but were not dominant factors in determining the sites of wave generation. Storm main and recovery phase events in the dusk sector were often inside the PP, and dayside events during quiet times and compressions of the magnetosphere were more evenly distributed both inside and outside the PP. Superposed epoch analyses of the dependence of wave onset on solar wind dynamic pressure (Psw), the SME (SuperMAG auroral electrojet) index, and the SYM-H index showed that substorm injections and solar wind compressions were temporally closely associated with EMIC wave onset, but to an extent that varied with frequency band, magnetic local time, storm phase, and location relative to the PP. The fact that increases in SME and Psw were less strongly correlated with events at the PP than with other events might suggest that the occurrence of those events was affected by the density gradient.

  20. Testing the Quick Seismic Event Locator and Magnitude Calculator (SSL_Calc) by Marsite Project Data Base

    NASA Astrophysics Data System (ADS)

    Tunc, Suleyman; Tunc, Berna; Caka, Deniz; Baris, Serif

    2016-04-01

    Quickly locating seismic events and calculating their sizes is one of the most important and challenging issues, especially in real-time seismology. In this study, we developed a Matlab application, called SSL_Calc, to locate seismic events and calculate their magnitudes (local magnitude and empirical moment magnitude) using a single station. This newly developed software has been tested on all stations of the Marsite project "New Directions in Seismic Hazard Assessment through Focused Earth Observation in the Marmara Supersite-MARsite". The SSL_Calc algorithm is suitable both for velocity and acceleration sensors. Data have to be in GCF (Güralp Compressed Format). Online or offline data can be selected in the SCREAM software (from Güralp Systems Limited) and transferred to SSL_Calc. To locate an event, P and S wave picks have to be marked manually in the SSL_Calc window. During magnitude calculation, the instrument response is removed and the record is converted to true displacement in millimeters. The displacement data are then converted to Wood-Anderson seismometer output using the parameters Z=[0;0]; P=[-6.28+4.71j; -6.28-4.71j]; A0=[2080]. For local magnitude calculation, the maximum displacement amplitude (A) and distance (dist) are used in formula (1) for distances up to 200 km and formula (2) for more than 200 km. ML=log10(A)-(-1.118-0.0647*dist+0.00071*dist^2-3.39E-6*dist^3+5.71E-9*dist^4) (1) ML=log10(A)+(2.1173+0.0082*dist-0.0000059628*dist^2) (2) Following the local magnitude calculation, the program calculates two empirical moment magnitudes using formulas (3), from Akkar et al. (2010), and (4), from Ulusay et al. (2004). Mw=0.953*ML+0.422 (3) Mw=0.7768*ML+1.5921 (4) SSL_Calc is easy to implement and user friendly, and offers a practical solution for individual users to locate events and calculate ML and Mw.
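
    As a small sketch of the magnitude step, the distance-dependent formulas quoted in the record can be coded directly. The instrument correction and Wood-Anderson conversion are not reproduced here, and the example amplitude and distance values are hypothetical.

        import math

        def local_magnitude(amp_mm, dist_km):
            """Local magnitude from the distance-dependent formulas quoted above:
            formula (1) for distances up to 200 km, formula (2) beyond 200 km.
            amp_mm is the maximum Wood-Anderson displacement amplitude in mm."""
            if dist_km <= 200.0:
                correction = -(-1.118 - 0.0647 * dist_km + 0.00071 * dist_km**2
                               - 3.39e-6 * dist_km**3 + 5.71e-9 * dist_km**4)
            else:
                correction = 2.1173 + 0.0082 * dist_km - 0.0000059628 * dist_km**2
            return math.log10(amp_mm) + correction

        def empirical_moment_magnitudes(ml):
            """Empirical Mw estimates from ML, formulas (3) and (4) above."""
            return 0.953 * ml + 0.422, 0.7768 * ml + 1.5921

        ml = local_magnitude(amp_mm=0.5, dist_km=120.0)      # hypothetical reading
        print(round(ml, 2), [round(m, 2) for m in empirical_moment_magnitudes(ml)])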

  1. Storm and Substorm Causes and Effects at Midlatitude Location for the St. Patrick's 2013 and 2015 Events

    NASA Astrophysics Data System (ADS)

    Guerrero, A.; Palacios, J.; Rodríguez-Bouza, M.; Rodríguez-Bilbao, I.; Aran, A.; Cid, C.; Herraiz, M.; Saiz, E.; Rodríguez-Caderot, G.; Cerrato, Y.

    2017-10-01

    Midlatitude locations are unique regions exposed to both geomagnetic storm and substorm effects, which may be superposed during specific events, imposing an extra handicap for the analysis and identification of the sources and triggers. We study space weather effects at the midlatitude location of the Iberian Peninsula for the St. Patrick's Day events in 2013 and 2015. We have been able to identify and separate storm and substorm effects on ground magnetometer data from the San Pablo-Toledo observatory during storm time, revealing important contributions of the substorm current wedge in both events. The analysis of these local substorm signatures shows them to be related to the production of effective geomagnetically induced currents and ionospheric disturbances, as measured from Global Navigation Satellite Systems data at the MAD2 IGS permanent station, and not directly related to the storm main phase. The whole Sun-to-Earth chain has been analyzed in order to identify the solar and interplanetary triggers. In both events a high-speed stream (HSS) and a coronal mass ejection (CME) are involved, though for the 2015 event the HSS merged with the CME, increasing the storm geoeffectiveness. The enhancement of substorm geoeffectiveness is explained by the effects of the inclined magnetic axes of the Sun and of the Earth during the equinox period.

  2. The role of spatial selective attention in working memory for locations: evidence from event-related potentials.

    PubMed

    Awh, E; Anllo-Vento, L; Hillyard, S A

    2000-09-01

    We investigated the hypothesis that the covert focusing of spatial attention mediates the on-line maintenance of location information in spatial working memory. During the delay period of a spatial working-memory task, behaviorally irrelevant probe stimuli were flashed at both memorized and nonmemorized locations. Multichannel recordings of event-related potentials (ERPs) were used to assess visual processing of the probes at the different locations. Consistent with the hypothesis of attention-based rehearsal, early ERP components were enlarged in response to probes that appeared at memorized locations. These visual modulations were similar in latency and topography to those observed after explicit manipulations of spatial selective attention in a parallel experimental condition that employed an identical stimulus display.

  3. UTILIZING RESULTS FROM INSAR TO DEVELOP SEISMIC LOCATION BENCHMARKS AND IMPLICATIONS FOR SEISMIC SOURCE STUDIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. BEGNAUD; ET AL

    2000-09-01

    Obtaining accurate seismic event locations is one of the most important goals for monitoring detonations of underground nuclear tests. This is a particular challenge at small magnitudes where the number of recording stations may be less than 20. Although many different procedures are being developed to improve seismic location, most procedures suffer from inadequate testing against accurate information about a seismic event. Events with well-defined attributes, such as latitude, longitude, depth and origin time, are commonly referred to as ground truth (GT). Ground truth comes in many forms and with many different levels of accuracy. Interferometric Synthetic Aperture Radar (InSAR) can provide independent and accurate information (ground truth) regarding ground surface deformation and/or rupture. Relating surface deformation to seismic events is trivial when events are large and create a significant surface rupture, such as for the Mw = 7.5 event that occurred in the remote northern region of the Tibetan plateau in 1997. The event, a vertical strike-slip event, appeared anomalous in nature due to the lack of large aftershocks and had an associated surface rupture of over 180 km that was identified and modeled using InSAR. The east-west orientation of the fault rupture provides excellent ground truth for latitude, but is of limited use for longitude. However, a secondary rupture occurred 50 km south of the main shock rupture trace that can provide ground truth with accuracy within 5 km. The smaller, 5-km-long secondary rupture presents a challenge for relating the deformation to a seismic event. The rupture is believed to have a thrust mechanism; the dip of the fault allows for some separation between the secondary rupture trace and its associated event epicenter, although not as much as is currently observed from catalog locations. Few events within the time period of the InSAR analysis are candidates for the secondary rupture. Of these, we

  4. Places, Spaces and Memory Traces: Showing Students with Learning Disabilities Ways to Remember Locations and Events on Maps.

    ERIC Educational Resources Information Center

    Brigham, Frederick J.

    This study examined the memory-enhancing effects of elaborative and mnemonic encoding of information presented with maps, compared to more traditional, non-mnemonic maps, on recall of locations of events and information associated with those events by 72 middle school students with learning disabilities. Subjects were presented with map-like…

  5. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2006-09-21

    validation purposes, we use GT0-2 event clusters. These include the Nevada, Lop Nor, Semipalatinsk, and Novaya Zemlya test sites, as well as the Azgir...uncertainties. Furthermore, the tails of real seismic data distributions are heavier than Gaussian. The main objectives of this project are to develop, test

  6. Location of intense electromagnetic ion cyclotron (EMIC) wave events relative to the plasmapause: Van Allen Probes observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tetrick, S. S.; Engebretson, M. J.; Posch, J. L.

    In this paper, we have studied the spatial location relative to the plasmapause (PP) of the most intense electromagnetic ion cyclotron (EMIC) waves observed on Van Allen Probes A and B during their first full precession in local time. Most of these waves occurred over an L range from -1 to +2 RE relative to the PP. Very few events occurred only within 0.1 RE of the PP, and events with a width in L of < 0.2 RE occurred both inside and outside the PP. Wave occurrence was always associated with high densities of ring current ions; plasma density gradients or enhancements were associated with some events but were not dominant factors in determining the sites of wave generation. Storm main and recovery phase events in the dusk sector were often inside the PP, and dayside events during quiet times and compressions of the magnetosphere were more evenly distributed both inside and outside the PP. Superposed epoch analyses of the dependence of wave onset on solar wind dynamic pressure (Psw), the SME (SuperMAG auroral electrojet) index, and the SYM-H index showed that substorm injections and solar wind compressions were temporally closely associated with EMIC wave onset, but to an extent that varied with frequency band, magnetic local time, storm phase, and location relative to the PP. Finally, the fact that increases in SME and Psw were less strongly correlated with events at the PP than with other events might suggest that the occurrence of those events was affected by the density gradient.

  7. Location of intense electromagnetic ion cyclotron (EMIC) wave events relative to the plasmapause: Van Allen Probes observations

    DOE PAGES

    Tetrick, S. S.; Engebretson, M. J.; Posch, J. L.; ...

    2017-03-17

    In this paper, we have studied the spatial location relative to the plasmapause (PP) of the most intense electromagnetic ion cyclotron (EMIC) waves observed on Van Allen Probes A and B during their first full precession in local time. Most of these waves occurred over an L range from -1 to +2 RE relative to the PP. Very few events occurred only within 0.1 RE of the PP, and events with a width in L of < 0.2 RE occurred both inside and outside the PP. Wave occurrence was always associated with high densities of ring current ions; plasma density gradients or enhancements were associated with some events but were not dominant factors in determining the sites of wave generation. Storm main and recovery phase events in the dusk sector were often inside the PP, and dayside events during quiet times and compressions of the magnetosphere were more evenly distributed both inside and outside the PP. Superposed epoch analyses of the dependence of wave onset on solar wind dynamic pressure (Psw), the SME (SuperMAG auroral electrojet) index, and the SYM-H index showed that substorm injections and solar wind compressions were temporally closely associated with EMIC wave onset, but to an extent that varied with frequency band, magnetic local time, storm phase, and location relative to the PP. Finally, the fact that increases in SME and Psw were less strongly correlated with events at the PP than with other events might suggest that the occurrence of those events was affected by the density gradient.

  8. Short-Period Surface Wave Based Seismic Event Relocation

    NASA Astrophysics Data System (ADS)

    White-Gaynor, A.; Cleveland, M.; Nyblade, A.; Kintner, J. A.; Homman, K.; Ammon, C. J.

    2017-12-01

    Accurate and precise seismic event locations are essential for a broad range of geophysical investigations. Superior location accuracy generally requires calibration with ground truth information, but superb relative location precision is often achievable independently. In explosion seismology, low-yield explosion monitoring relies on near-source observations, which results in a limited number of observations and challenges our ability to estimate locations at all. Incorporating more distant observations means relying on data with lower signal-to-noise ratios. For small, shallow events, the short-period (roughly 1/2 to 8 s period) fundamental-mode and higher-mode Rayleigh waves (including Rg) are often the most stable and visible portion of the waveform at local distances. Cleveland and Ammon [2013] have shown that teleseismic surface waves are valuable observations for constructing precise, relative event relocations. We extend the teleseismic surface wave relocation method and apply it to near-source distances using Rg observations from the Bighorn Arch Seismic Experiment (BASE) and EarthScope USArray Transportable Array (TA) seismic stations. Specifically, we present relocation results using short-period fundamental- and higher-mode Rayleigh waves (Rg) in a double-difference relative event relocation for 45 delay-fired mine blasts and 21 borehole chemical explosions. Our preliminary efforts explore the sensitivity of the short-period surface waves to local geologic structure, source depth, explosion magnitude (yield), and explosion characteristics (single-shot vs. distributed source, etc.). Our results show that Rg and the first few higher-mode Rayleigh wave observations can be used to constrain the relative locations of shallow low-yield events.
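
    The sketch below conveys the flavor of relative relocation with differential times; it is not the authors' surface-wave double-difference implementation. Holding one event fixed, it grid-searches the offset of a second event that best matches the observed differential arrival times under a constant-velocity assumption, and it neglects any origin-time difference between the two events. All parameter values are illustrative.

        import numpy as np

        V = 3.0  # assumed constant group velocity, km/s (illustrative)

        def pred_diff_times(offset, ref_loc, stations):
            """Predicted (event2 - event1) arrival-time differences at each station
            (the origin-time difference between the events is neglected)."""
            d1 = np.linalg.norm(stations - ref_loc, axis=1)
            d2 = np.linalg.norm(stations - (ref_loc + offset), axis=1)
            return (d2 - d1) / V

        def relocate_relative(obs_diff, ref_loc, stations, search_km=5.0, step=0.1):
            """Grid search for the relative offset minimizing the differential-time misfit."""
            best, best_misfit = None, np.inf
            grid = np.arange(-search_km, search_km + step, step)
            for dx in grid:
                for dy in grid:
                    offset = np.array([dx, dy])
                    misfit = np.sum((obs_diff - pred_diff_times(offset, ref_loc, stations)) ** 2)
                    if misfit < best_misfit:
                        best, best_misfit = offset, misfit
            return best

        # Synthetic pair: event 2 sits 1.3 km east and 0.7 km north of event 1.
        rng = np.random.default_rng(5)
        stations = rng.uniform(-40, 40, size=(8, 2))
        ref_loc = np.array([0.0, 0.0])
        true_offset = np.array([1.3, 0.7])
        obs_diff = pred_diff_times(true_offset, ref_loc, stations) + 0.02 * rng.standard_normal(8)
        print("estimated offset (km):", relocate_relative(obs_diff, ref_loc, stations))  # near (1.3, 0.7)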

  9. Location-based prospective memory.

    PubMed

    O'Rear, Andrea E; Radvansky, Gabriel A

    2018-02-01

    This study explores location-based prospective memory. People often have to remember to do things when in a particular location, such as buying tissues the next time they are in the supermarket. For event cognition theory, location is important for structuring events. However, because event cognition has not been used to examine prospective memory, the question remains of how multiple events will influence prospective memory performance. In our experiments, people delivered messages from store to store in a virtual shopping mall as an ongoing task. The prospective tasks were to do certain activities in certain stores. For Experiment 1, each trial involved one prospective memory task to be done in a single location at one of three delays. The virtual environment and location cues were effective for prospective memory, and performance was unaffected by delay. For Experiment 2, each trial involved two prospective memory tasks, given in either one or two instruction locations, and to be done in either one or two store locations. There was improved performance when people received instructions from two locations and did both tasks in one location relative to other combinations. This demonstrates that location-based event structure influences how well people perform on prospective memory tasks.

  10. Accurate relocation of seismicity along the North Aegean Trough and its relation to active tectonics

    NASA Astrophysics Data System (ADS)

    Konstantinou, K. I.

    2017-10-01

    The tectonics of northern Aegean are affected by the westward push of Anatolia and the gravitational spreading of the Aegean lithosphere that promote transtensional deformation in the area. This regime is also responsible for the creation of a series of pull-apart basins, collectively known as the North Aegean Trough. This work accurately relocates a total of 2300 earthquakes that were recorded along the North Aegean Trough during 2011-2016 by stations of the Hellenic Unified Seismic Network (HUSN) and strong-motion sensors. Absolute locations for these events were obtained using a nonlinear probabilistic algorithm and utilizing a minimum 1D velocity model with station corrections. The hypocentral depth distribution of these events shows a peak at 8 km diminishing gradually down to 20 km. A systematic overestimation of hypocentral depths is observed in the routine locations provided by the National Observatory of Athens where the majority of events appear to be deeper than 15 km. In order to obtain more accurate relative locations these events were relocated using the double-difference method. A total of 1693 events were finally relocated with horizontal and vertical uncertainties that do not exceed 0.11 km and 0.22 km respectively. Well-defined clusters of seismicity can be observed along the Saros and Sporades basins as well as the Kassandra and Sithonia peninsulas. These clusters either occur along the well-known NE-SW strike-slip faults bounding the basins, or along normal faults whose strike is perpendicular to the regional minimum stress axis. Locking depth along the North Aegean Trough is found to be remarkably stable between 13 and 17 km. This is likely a consequence of simultaneous reduction along the SW direction of heat flow (from 89 to 51 mW/m2) and strain rate (from 600 to 50 nstrain/yr) whose opposite effects are canceled out, precluding any sharp changes in locking depth.

  11. Disambiguating past events: Accurate source memory for time and context depends on different retrieval processes.

    PubMed

    Persson, Bjorn M; Ainge, James A; O'Connor, Akira R

    2016-07-01

    Current animal models of episodic memory are usually based on demonstrating integrated memory for what happened, where it happened, and when an event took place. These models aim to capture the testable features of the definition of human episodic memory which stresses the temporal component of the memory as a unique piece of source information that allows us to disambiguate one memory from another. Recently though, it has been suggested that a more accurate model of human episodic memory would include contextual rather than temporal source information, as humans' memory for time is relatively poor. Here, two experiments were carried out investigating human memory for temporal and contextual source information, along with the underlying dual process retrieval processes, using an immersive virtual environment paired with a 'Remember-Know' memory task. Experiment 1 (n=28) showed that contextual information could only be retrieved accurately using recollection, while temporal information could be retrieved using either recollection or familiarity. Experiment 2 (n=24), which used a more difficult task, resulting in reduced item recognition rates and therefore less potential for contamination by ceiling effects, replicated the pattern of results from Experiment 1. Dual process theory predicts that it should only be possible to retrieve source context from an event using recollection, and our results are consistent with this prediction. That temporal information can be retrieved using familiarity alone suggests that it may be incorrect to view temporal context as analogous to other typically used source contexts. This latter finding supports the alternative proposal that time since presentation may simply be reflected in the strength of memory trace at retrieval - a measure ideally suited to trace strength interrogation using familiarity, as is typically conceptualised within the dual process framework. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, so far, epistemic location uncertainty has not been in the focus of a large amount of research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios is systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
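
    A toy sketch of the underlying idea (not the proposed framework): risk items with unknown coordinates are resampled within their admissible region in each Monte Carlo iteration, and the spread of the resulting portfolio losses reflects the location uncertainty. All values, the loss footprint, and the region are invented.

        import numpy as np

        rng = np.random.default_rng(6)

        def loss_ratio(dist_km):
            """Invented ground-up loss footprint: damage decays with distance from the event."""
            return np.clip(1.0 - dist_km / 50.0, 0.0, 1.0)

        # Portfolio: values in arbitrary units; half of the items have unknown coordinates
        # and are only known to lie somewhere within a 100 km x 100 km admissible region.
        values = rng.uniform(1.0, 10.0, size=200)
        known_xy = rng.uniform(0, 100, size=(100, 2))
        event_xy = np.array([40.0, 60.0])

        def simulate_portfolio_loss(n_sims=2000):
            losses = np.empty(n_sims)
            for i in range(n_sims):
                unknown_xy = rng.uniform(0, 100, size=(100, 2))   # resample unknown locations
                xy = np.vstack([known_xy, unknown_xy])
                dist = np.linalg.norm(xy - event_xy, axis=1)
                losses[i] = np.sum(values * loss_ratio(dist))
            return losses

        losses = simulate_portfolio_loss()
        print(f"mean loss {losses.mean():.1f}, std from location uncertainty {losses.std():.1f}")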

  13. Testing the Reviewed Event Bulletin of the International Data Centre Using Waveform Cross Correlation: Repeat Events at Aitik Copper Mine, Sweden

    NASA Astrophysics Data System (ADS)

    Kitov, I. O.; Rozhkov, N.; Bobrov, D.; Rozhkov, M.; Yedlin, M. J.

    2016-12-01

    The quality of the Reviewed Event Bulletin (REB) issued by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is crucial for the Member States as well as for the seismological community. One of the most efficient ways to test the REB quality is to use repeat events with very accurate absolute locations. Hundreds of quarry blasts detonated at the Aitik copper mine (the central point of active mining: 67.08°N, 20.95°E) were recorded by several seismic arrays of the International Monitoring System (IMS), found by IDC automatic processing, and then confirmed by analysts as REB events. The size of the quarry is approximately 1 km, and one can consider that the uncertainty in absolute coordinates of the studied events is less than 0.5 km as measured from the central point. In the REB, the corresponding epicenters are almost uniformly scattered over the territory from 67.0°N to 67.3°N and from 20.7°E to 21.5°E. These REB locations are based on the measured arrival times as well as azimuth and slowness estimates at several IMS stations, with the main input from ARCES, NOA, FINES, and HFS. The large scatter of REB locations is caused by uncertainties in the measurements and the velocity model. Seismological methods based on waveform cross correlation allow very accurate relative location of repeat events. Here we test the level of similarity between signals from these events. It was found that IMS primary array station ARCES demonstrates the highest similarity, as expressed by the cross-correlation coefficient (CC) and the signal-to-noise ratio (SNR) calculated on the CC traces. Small-aperture array FINES is the second best, while large-aperture array NOA demonstrates mediocre performance, likely due to its size and the loss of coherency of the high-frequency and relatively low-velocity signals from the mine. During the last five years station ARCES has been upgraded from a vertical array to a 3-C one. This transformation has improved the performance of

  14. In situ, accurate, surface-enhanced Raman scattering detection of cancer cell nucleus with synchronous location by an alkyne-labeled biomolecular probe.

    PubMed

    Zhang, Jing; Liang, Lijia; Guan, Xin; Deng, Rong; Qu, Huixin; Huang, Dianshuai; Xu, Shuping; Liang, Chongyang; Xu, Weiqing

    2018-01-01

    A surface-enhanced Raman scattering (SERS) method for in situ detection and analysis of the intranuclear biomolecular information of a cell has been developed based on a small, biocompatible, nuclear-targeting alkyne-tagged deoxyribonucleic acid (DNA) probe (5-ethynyl-2'-deoxyuridine, EDU) that can specifically accumulate in the cell nucleus during DNA replication to precisely locate the nuclear region without disturbing cell biological activities and functions. Since the specific alkyne group shows a Raman peak in the Raman-silent region of cells, it serves as an internal label to visualize the nuclear location synchronously and in real time when measuring the SERS spectra of a cell. Because no fluorescently labeled dyes were used for locating cell nuclei, this method is simple, nondestructive, non-photobleaching, and valuable for the in situ exploration of vital physiological processes with DNA participation in cell organelles. Graphical abstract: A universal strategy was developed to accurately locate the nuclear region and obtain precise molecular information of cell nuclei by SERS.

  15. An infrastructure for accurate characterization of single-event transients in digital circuits.

    PubMed

    Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael

    2013-11-01

    We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure.
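
    The double-exponential current injection model mentioned above is a standard SET waveform; a minimal sketch is given below with placeholder parameters (collected charge and time constants), not the calibrated values from the paper.

        import numpy as np

        def set_current(t, q_total=150e-15, tau_fall=200e-12, tau_rise=50e-12):
            """Standard double-exponential single-event-transient current pulse.
            q_total: collected charge (C); tau_rise/tau_fall: rise and fall time constants (s).
            The parameter values here are placeholders, not the calibrated ones from the paper."""
            i = (q_total / (tau_fall - tau_rise)) * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))
            return np.where(t >= 0, i, 0.0)

        t = np.linspace(0, 2e-9, 401)                       # 0 to 2 ns
        i = set_current(t)
        # Sanity check: integrating the current should approximately recover the injected charge.
        charge = np.sum(i) * (t[1] - t[0])
        print(f"peak {i.max()*1e6:.1f} uA, collected charge {charge*1e15:.1f} fC")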

  16. An infrastructure for accurate characterization of single-event transients in digital circuits☆

    PubMed Central

    Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael

    2013-01-01

    We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure. PMID:24748694

  17. Accurate Analysis of the Change in Volume, Location, and Shape of Metastatic Cervical Lymph Nodes During Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takao, Seishin, E-mail: takao@mech-me.eng.hokudai.ac.jp; Tadano, Shigeru; Taguchi, Hiroshi

    2011-11-01

    Purpose: To establish a method for the accurate acquisition and analysis of the variations in tumor volume, location, and three-dimensional (3D) shape of tumors during radiotherapy in the era of image-guided radiotherapy. Methods and Materials: Finite element models of lymph nodes were developed based on computed tomography (CT) images taken before the start of treatment and every week during the treatment period. A surface geometry map with a volumetric scale was adopted and used for the analysis. Six metastatic cervical lymph nodes, 3.5 to 55.1 cm³ before treatment, in 6 patients with head and neck carcinomas were analyzed in this study. Three fiducial markers implanted in mouthpieces were used for the fusion of CT images. Changes in the location of the lymph nodes were measured on the basis of these fiducial markers. Results: The surface geometry maps showed convex regions in red and concave regions in blue to ensure that the characteristics of the 3D tumor geometries are simply understood visually. After the irradiation of 66 to 70 Gy in 2 Gy daily doses, the patterns of the colors had not changed significantly, and the maps before and during treatment were strongly correlated (average correlation coefficient was 0.808), suggesting that the tumors shrank uniformly, maintaining the original characteristics of the shapes in all 6 patients. The movement of the gravitational center of the lymph nodes during the treatment period was everywhere less than ±5 mm except in 1 patient, in whom the change reached nearly 10 mm. Conclusions: The surface geometry map was useful for an accurate evaluation of the changes in volume and 3D shapes of metastatic lymph nodes. The fusion of the initial and follow-up CT images based on fiducial markers enabled an analysis of changes in the location of the targets. Metastatic cervical lymph nodes in patients were suggested to decrease in size without significant changes in the 3D shape during radiotherapy. The

  18. Global and Regional 3D Tomography for Improved Seismic Event Location and Uncertainty in Explosion Monitoring

    NASA Astrophysics Data System (ADS)

    Downey, N.; Begnaud, M. L.; Hipp, J. R.; Ballard, S.; Young, C. S.; Encarnacao, A. V.

    2017-12-01

    The SALSA3D global 3D velocity model of the Earth was developed to improve the accuracy and precision of seismic travel time predictions for a wide suite of regional and teleseismic phases. Recently, the global SALSA3D model was updated to include additional body wave phases including mantle phases, core phases, reflections off the core-mantle boundary and underside reflections off the surface of the Earth. We show that this update improves travel time predictions and leads directly to significant improvements in the accuracy and precision of seismic event locations as compared to locations computed using standard 1D velocity models like ak135, or 2½D models like RSTT. A key feature of our inversions is that path-specific model uncertainty of travel time predictions are calculated using the full 3D model covariance matrix computed during tomography, which results in more realistic uncertainty ellipses that directly reflect tomographic data coverage. Application of this method can also be done at a regional scale: we present a velocity model with uncertainty obtained using data obtained from the University of Utah Seismograph Stations. These results show a reduction in travel-time residuals for re-located events compared with those obtained using previously published models.

  19. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2008-06-30

    throughout this study. The data set consists of GT0-2 nuclear explosions from the SAIC Nuclear Explosion Database (www.rdss.info, Bahavar et al...errors: Bias and variance In this study SNR dependence of both delay and variance of reading errors of first arriving P waves are analyzed and...ground truth and range of event size. For other datasets we turn to estimates based on double-differences between arrival times of station pairs

  20. Accurate modeling and inversion of electrical resistivity data in the presence of metallic infrastructure with known location and dimension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Timothy C.; Wellman, Dawn M.

    2015-06-26

    Electrical resistivity tomography (ERT) has been widely used in environmental applications to study processes associated with subsurface contaminants and contaminant remediation. Anthropogenic alterations in subsurface electrical conductivity associated with contamination often originate from highly industrialized areas with significant amounts of buried metallic infrastructure. The deleterious influence of such infrastructure on imaging results generally limits the utility of ERT where it might otherwise prove useful for subsurface investigation and monitoring. In this manuscript we present a method of accurately modeling the effects of buried conductive infrastructure within the forward modeling algorithm, thereby removing them from the inversion results. The method is implemented in parallel using immersed interface boundary conditions, whereby the global solution is reconstructed from a series of well-conditioned partial solutions. Forward modeling accuracy is demonstrated by comparison with analytic solutions. Synthetic imaging examples are used to investigate imaging capabilities within a subsurface containing electrically conductive buried tanks, transfer piping, and well casing, using both well casings and vertical electrode arrays as current sources and potential measurement electrodes. Results show that, although accurate infrastructure modeling removes the dominating influence of buried metallic features, the presence of metallic infrastructure degrades imaging resolution compared to standard ERT imaging. However, accurate imaging results may be obtained if electrodes are appropriately located.

  1. The 2011 Eruption of Nabro Volcano (Eritrea): Earthquake Locations from a Temporary Broadband Network

    NASA Astrophysics Data System (ADS)

    Hamlyn, J.; Keir, D.; Hammond, J.; Wright, T.; Neuberg, J.; Kibreab, A.; Ogubazghi, G.; Goitom, B.

    2012-04-01

    Nabro volcano dominates the central part of the Nabro Volcanic Range (NVR), which trends SSW-NNE, covering a stretch of 110 km from the SE margin of the Afar depression to the Red Sea. Regionally, the NVR sits within the Afar triangle, the triple junction of the Somalian, Arabian and African plates. On 12th June 2011 Nabro volcano suddenly erupted after being inactive for 10,000 years. In response, a network of 8 seismometers was deployed around the active vent. The seismic signals detected by this array and those arriving at a regional seismic station (located to the north-west) were processed to provide accurate earthquake locations for the period August-October. Transects of the volcano were used to create cross sections to aid the interpretation. The majority of the seismic events are located at the active vent and on the flanks of Nabro, with fewer events dispersed around the surrounding area. However, there appears to be a smaller hub of events to the south-west of Nabro beneath the neighbouring Mallahle volcanic caldera (located on the Ethiopian side of the international border). This may imply some form of co-dependent relationship within the plumbing of the magma system beneath both calderas.

  2. Optimal filter parameters for low SNR seismograms as a function of station and event location

    NASA Astrophysics Data System (ADS)

    Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.

    1999-06-01

    Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few usable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method to provide optimal filters for low SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δf constant (constant Q). The SNR is calculated from the pre-event noise window and the signal window. The band-pass signals with high SNR indicate the cutoff limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low SNR events. The method provides an optimum filter which can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
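
    A rough sketch of the band-selection idea follows: decompose the record into constant-Q bands, compute the SNR of each band from the pre-event noise and signal windows, and take the span of high-SNR bands as the optimized pass band. The Butterworth filter bank, band ratio, SNR threshold, and synthetic record are stand-in assumptions, not the authors' processing choices.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        FS = 40.0  # sampling rate (Hz)

        def constant_q_bands(f_min=0.5, f_max=16.0, ratio=1.5):
            """Band edges with constant f/df (each band 'ratio' times wider than the last)."""
            edges = [f_min]
            while edges[-1] * ratio < f_max:
                edges.append(edges[-1] * ratio)
            return list(zip(edges[:-1], edges[1:]))

        def select_filter_band(trace, noise_slice, signal_slice, snr_min=2.0):
            """Return (low, high) cutoffs spanning all constant-Q bands whose SNR exceeds snr_min."""
            good = []
            for lo, hi in constant_q_bands():
                sos = butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
                banded = sosfiltfilt(sos, trace)
                snr = np.std(banded[signal_slice]) / (np.std(banded[noise_slice]) + 1e-12)
                if snr >= snr_min:
                    good.append((lo, hi))
            return (good[0][0], good[-1][1]) if good else None

        # Synthetic low-SNR record: a 3 Hz wavelet buried in broadband noise.
        rng = np.random.default_rng(7)
        t = np.arange(0, 120, 1 / FS)
        trace = rng.standard_normal(t.size)
        onset = int(60 * FS)
        trace[onset:onset + int(10 * FS)] += np.sin(2 * np.pi * 3.0 * t[:int(10 * FS)])
        band = select_filter_band(trace, noise_slice=slice(0, onset),
                                  signal_slice=slice(onset, onset + int(10 * FS)))
        print("optimal pass band (Hz):", band)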

  3. Travel-time source-specific station correction improves location accuracy

    NASA Astrophysics Data System (ADS)

    Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo

    2013-04-01

    Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy is related to the degree of knowledge about the 3-D structure of seismic wave velocity in the Earth. It is well known that modeling errors in calculated travel times may shift the computed epicenters far from the real locations, by a distance even larger than the size of the statistical error ellipses, regardless of the accuracy in picking seismic phase arrivals. The consequences of large mislocations of seismic events are particularly critical in the context of CTBT verification, since they affect the triggering of a possible On-Site Inspection (OSI). In fact, the Treaty establishes that an OSI area cannot be larger than 1000 km2 and that its largest linear dimension cannot exceed 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran, and Italy. We show that mislocations of the order of 10-20 km affecting the epicenters, as well as larger mislocations in hypocentral depth, calculated from a global seismic network using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
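
    A minimal sketch of the correction idea (illustrative only, not the authors' procedure): for each station, the median travel-time residual of well-located calibration events is stored as a source-specific correction and added to the model travel time when locating new events in the same source region. The residual values below are hypothetical.

        import numpy as np

        def station_corrections(residuals_by_station):
            """Median travel-time residual (observed - predicted) per station, from
            calibration events with well-constrained locations in the source region."""
            return {sta: float(np.median(res)) for sta, res in residuals_by_station.items()}

        def corrected_prediction(predicted_tt, station, corrections):
            """Apply the source-specific station correction to a model travel time."""
            return predicted_tt + corrections.get(station, 0.0)

        # Hypothetical residuals (s) from a set of well-located calibration events.
        residuals = {
            "ABC": [0.8, 0.9, 0.7, 1.1],      # model systematically too fast to ABC
            "DEF": [-0.4, -0.3, -0.5],        # model systematically too slow to DEF
            "GHI": [0.0, 0.1, -0.1, 0.05],
        }
        corr = station_corrections(residuals)
        print(corr)
        print("corrected travel time to ABC:", corrected_prediction(112.3, "ABC", corr))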

  4. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb-Argument Information on Predictive Processing in Aphasia.

    PubMed

    Hayes, Rebecca A; Dickey, Michael Walsh; Warren, Tessa

    2016-12-01

    This study examined the influence of verb-argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54-82 years) as well as 44 young adults (aged 18-31 years) and 18 older adults (aged 50-71 years) participated. Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure.

  5. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb–Argument Information on Predictive Processing in Aphasia

    PubMed Central

    Dickey, Michael Walsh; Warren, Tessa

    2016-01-01

    Purpose This study examined the influence of verb–argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. Method This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54–82 years) as well as 44 young adults (aged 18–31 years) and 18 older adults (aged 50–71 years) participated. Results Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Conclusions Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure. PMID:27997951

  6. Multiple-Array Detection, Association and Location of Infrasound and Seismo-Acoustic Event-Utilization of Ground Truth Information (Postprint)

    DTIC Science & Technology

    2012-05-07

    AFRL-RV-PS-TP-2012-0017. Contract number FA8718-08-C-0008. ... infrasound signals from both correlated and uncorrelated noise. Approaches to this problem are implementation of the F-detector, which employs the F

  7. Helicopter magnetic survey conducted to locate wells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veloski, G.A.; Hammack, R.W.; Stamp, V.

    2008-07-01

    A helicopter magnetic survey was conducted in August 2007 over 15.6 sq mi at the Naval Petroleum Reserve No. 3’s (NPR-3) Teapot Dome Field near Casper, Wyoming. The survey’s purpose was to accurately locate wells drilled there during more than 90 years of continuous oilfield operation. The survey was conducted at low altitude and with closely spaced flight lines to improve the detection of wells with weak magnetic response and to increase the resolution of closely spaced wells. The survey was in preparation for a planned CO2 flood for EOR, which requires a complete well inventory with accurate locations for all existing wells. The magnetic survey was intended to locate wells missing from the well database and to provide accurate locations for all wells. The ability of the helicopter magnetic survey to accurately locate wells was assessed by comparing airborne well picks with well locations from an intense ground search of a small test area.

  8. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics is more complex than the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross section Working Group.
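
    The reweighting step described above can be illustrated with a toy sketch: events carrying (Mi)NLO weights are assigned an extra factor equal to the ratio of the target NNLO distribution to their own distribution in a Born variable. The snippet below is only a one-dimensional illustration of that idea, not the POWHEG/MiNLO machinery itself; the variable, binning, and toy distributions are invented for the example.

      import numpy as np

      # Toy sketch of histogram-ratio reweighting: events carrying (Mi)NLO
      # weights are reweighted bin-by-bin in a Born variable so that their
      # distribution reproduces a target "NNLO" histogram.  All names,
      # distributions and weights here are invented for illustration.

      rng = np.random.default_rng(0)

      # pretend MiNLO-level events: one Born variable y and an event weight
      y_events = rng.normal(0.0, 1.2, size=100_000)
      w_minlo = np.full_like(y_events, 1.0e-3)

      # pretend NNLO prediction for the same observable, as a normalised histogram
      edges = np.linspace(-4.0, 4.0, 41)
      nnlo_hist, _ = np.histogram(rng.normal(0.0, 1.0, size=500_000), bins=edges)
      nnlo_hist = nnlo_hist / nnlo_hist.sum()

      # weighted histogram of the events in the same Born variable
      minlo_hist, _ = np.histogram(y_events, bins=edges, weights=w_minlo)
      minlo_hist = minlo_hist / minlo_hist.sum()

      # per-bin factor NNLO / MiNLO (left at 1 where the denominator is empty)
      ratio = np.divide(nnlo_hist, minlo_hist,
                        out=np.ones_like(nnlo_hist), where=minlo_hist > 0)

      # apply the factor to each event according to its bin
      idx = np.clip(np.digitize(y_events, edges) - 1, 0, len(ratio) - 1)
      w_nnlops = w_minlo * ratio[idx]
      print("sum of weights before / after:", w_minlo.sum(), w_nnlops.sum())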

  9. Accurate source location from waves scattered by surface topography: Applications to the Nevada and North Korean test sites

    NASA Astrophysics Data System (ADS)

    Shen, Y.; Wang, N.; Bao, X.; Flinders, A. F.

    2016-12-01

    Scattered waves generated near the source contain energy converted from the near-field waves to the far-field propagating waves, which can be used to achieve location accuracy beyond the diffraction limit. In this work, we apply a novel full-wave location method that combines a grid-search algorithm with a 3D Green's tensor database to locate the Non-Proliferation Experiment (NPE) at the Nevada test site and the North Korean nuclear tests. We use the first arrivals (Pn/Pg) and their immediate codas, which are likely dominated by waves scattered at the surface topography near the source, to determine the source location. We investigate seismograms in the 1.0-2.0 Hz frequency band to reduce noise in the data and highlight topography-scattered waves. High-resolution topographic models constructed from 10 and 90 m grids are used for Nevada and North Korea, respectively. The reference velocity model is based on CRUST 1.0. We use the collocated-grid finite-difference method on curvilinear grids to calculate the strain Green's tensor and obtain synthetic waveforms using source-receiver reciprocity. The 'best' solution is found based on the least-squares misfit between the observed and synthetic waveforms. To suppress random noise, an optimal weighting method for three-component seismograms is applied in the misfit calculation. Our results show that the scattered waves are crucial in improving resolution and allow us to obtain accurate solutions with a small number of stations. Since the scattered waves depend on topography, which is known at the wavelengths of regional seismic waves, our approach yields absolute, instead of relative, source locations. We compare our solutions with those of the USGS and other studies. Moreover, we use differential waveforms to locate pairs of the North Korean tests from 2006, 2009, 2013 and 2016 to further reduce the effects of unmodeled heterogeneities and errors in the reference velocity model.
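
    At its core, the grid-search step summarized in this abstract amounts to evaluating a least-squares misfit between observed records and precomputed synthetic waveforms at every candidate source node and keeping the minimum. The sketch below illustrates only that step under simplifying assumptions (synthetics already available for each node, a single pass band, equal station weights); the array shapes and names are hypothetical, not those of the study.

      import numpy as np

      def locate_by_grid_search(observed, synthetics):
          """Pick the grid node whose synthetics best match the observed records.

          observed   : array (n_stations, n_samples) of recorded waveforms
          synthetics : array (n_nodes, n_stations, n_samples) of precomputed
                       synthetic waveforms (e.g. from a Green's tensor database)
          Returns (index_of_best_node, least_squares_misfit_per_node).
          """
          residuals = synthetics - observed[np.newaxis, :, :]
          misfit = np.sum(residuals ** 2, axis=(1, 2))
          return int(np.argmin(misfit)), misfit

      # tiny synthetic example: 3 candidate nodes, 2 stations, 100 samples
      rng = np.random.default_rng(1)
      syn = rng.normal(size=(3, 2, 100))
      obs = syn[2] + 0.05 * rng.normal(size=(2, 100))   # noisy data "from" node 2

      best, misfit = locate_by_grid_search(obs, syn)
      print("best-fitting node:", best, "misfits:", np.round(misfit, 1))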

  10. Selective attention to sound location or pitch studied with event-related brain potentials and magnetic fields.

    PubMed

    Degerman, Alexander; Rinne, Teemu; Särkkä, Anna-Kaisa; Salmi, Juha; Alho, Kimmo

    2008-06-01

    Event-related brain potentials (ERPs) and magnetic fields (ERFs) were used to compare brain activity associated with selective attention to sound location or pitch in humans. Sixteen healthy adults participated in the ERP experiment, and 11 adults in the ERF experiment. In different conditions, the participants focused their attention on a designated sound location or pitch, or pictures presented on a screen, in order to detect target sounds or pictures among the attended stimuli. In the Attend Location condition, the location of sounds varied randomly (left or right), while their pitch (high or low) was kept constant. In the Attend Pitch condition, sounds of varying pitch (high or low) were presented at a constant location (left or right). Consistent with previous ERP results, selective attention to either sound feature produced a negative difference (Nd) between ERPs to attended and unattended sounds. In addition, ERPs showed a more posterior scalp distribution for the location-related Nd than for the pitch-related Nd, suggesting partially different generators for these Nds. The ERF source analyses found no source distribution differences between the pitch-related Ndm (the magnetic counterpart of the Nd) and location-related Ndm in the superior temporal cortex (STC), where the main sources of the Ndm effects are thought to be located. Thus, the ERP scalp distribution differences between the location-related and pitch-related Nd effects may have been caused by activity of areas outside the STC, perhaps in the inferior parietal regions.

  11. Analysis of several methods and inertial sensors locations to assess gait parameters in able-bodied subjects.

    PubMed

    Ben Mansour, Khaireddine; Rezzoug, Nasser; Gorce, Philippe

    2015-10-01

    The purpose of this paper was to determine which types of inertial sensors and which advocated locations should be used for reliable and accurate gait event detection and temporal parameter assessment in normal adults. In addition, we aimed to remove the ambiguity found in the literature regarding the definition of the initial contact (IC) from the lumbar accelerometer. Acceleration and angular velocity data were gathered from the lumbar region and the distal edge of each shank. These data were evaluated in comparison to an instrumented treadmill and an optoelectronic system during five treadmill speed sessions. The lumbar accelerometer showed that the peak of the anteroposterior component was the most accurate for IC detection. Similarly, the valley that followed the peak of the vertical component was the most precise for terminal contact (TC) detection. Results based on ANOVA and Tukey tests showed that the set of inertial methods was suitable for temporal gait assessment and gait event detection in able-bodied subjects. For gait event detection, an exception was found with the shank accelerometer. The tool was suitable for temporal parameter assessment, despite the high root mean square error on the detection of IC (RMSE_IC) and TC (RMSE_TC). The shank gyroscope was found to be as accurate as the kinematic method since the statistical tests revealed no significant difference between the two techniques for the RMSE of all gait events and temporal parameters. The lumbar and shank accelerometers were the most accurate alternative to the shank gyroscope for gait event detection and temporal parameter assessment, respectively. Copyright © 2015. Published by Elsevier B.V.
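
    The detection rules reported here (initial contact at the peak of the anteroposterior lumbar acceleration, terminal contact at the valley following the peak of the vertical component) translate directly into simple peak picking. The following is a hedged sketch of such a detector; the sampling rate, minimum step separation, and synthetic signals are assumptions for illustration and are not values from the study.

      import numpy as np
      from scipy.signal import find_peaks

      FS = 100.0  # Hz, assumed sampling rate (not a value from the study)

      def detect_gait_events(acc_ap, acc_v, fs=FS):
          """Detect initial contact (IC) and terminal contact (TC) sample indices.

          acc_ap : anteroposterior lumbar acceleration (IC taken at its peaks)
          acc_v  : vertical lumbar acceleration (TC taken at the valley that
                   follows each peak of this component)
          """
          # IC: peaks of the anteroposterior component, at most one per ~0.5 s
          ic, _ = find_peaks(acc_ap, distance=int(0.5 * fs))

          # TC: for every peak of the vertical component, take the next valley
          v_peaks, _ = find_peaks(acc_v, distance=int(0.5 * fs))
          v_valleys, _ = find_peaks(-acc_v, distance=int(0.2 * fs))
          tc = [int(v_valleys[v_valleys > p][0]) for p in v_peaks
                if np.any(v_valleys > p)]
          return ic, np.asarray(tc)

      # synthetic walking-like signals, only to exercise the function
      t = np.arange(0.0, 10.0, 1.0 / FS)
      acc_ap = np.sin(2.0 * np.pi * 1.0 * t)          # roughly one step per second
      acc_v = np.cos(2.0 * np.pi * 1.0 * t + 0.3)
      ic, tc = detect_gait_events(acc_ap, acc_v)
      print("IC count:", len(ic), "TC count:", len(tc))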

  12. Recurring OH Flares towards o Ceti - I. Location and structure of the 1990s' and 2010s' events

    NASA Astrophysics Data System (ADS)

    Etoka, S.; Gérard, E.; Richards, A. M. S.; Engels, D.; Brand, J.; Le Bertre, T.

    2017-06-01

    We present the analysis of the onset of the new 2010s OH flaring event detected in the OH ground-state main line at 1665 MHz towards o Ceti and compare its characteristics with those of the 1990s' flaring event. This is based on a series of complementary single-dish and interferometric observations both in OH and H2O obtained with the Nançay Radio telescope, the Medicina and Effelsberg Telescopes, the European VLBI Network and (e)Multi-Element Radio Linked Interferometer Network. We compare the overall characteristics of o Ceti's flaring events with those that have been observed towards other thin-shell Miras, and explore the implication of these events with respect to the standard OH circumstellar-envelope model. The role of binarity in the specific characteristics of o Ceti's flaring events is also investigated. The flaring regions are found to be less than ~400 ± 40 mas (i.e. ≤40 ± 4 au) either side of o Ceti, with seemingly no preferential location with respect to the direction to the companion Mira B. Contrary to the usual expectation that the OH maser zone is located outside the H2O maser zone, the coincidence of the H2O and OH maser velocities suggests that both emissions arise at similar distances from the star. The OH flaring characteristics of Mira are similar to those observed in various Mira variables before, supporting the earlier results that the regions where the transient OH maser emission occurs are different from the standard OH maser zone.

  13. Memory for time and place contributes to enhanced confidence in memories for emotional events

    PubMed Central

    Rimmele, Ulrike; Davachi, Lila; Phelps, Elizabeth A.

    2012-01-01

    Emotion strengthens the subjective sense of remembering. However, these confidently remembered emotional memories have not been found to be more accurate for some types of contextual details. We investigated whether the subjective sense of recollecting negative stimuli is coupled with enhanced memory accuracy for three specific types of central contextual details using the remember/know paradigm and confidence ratings. Our results indicate that the subjective sense of remembering is indeed coupled with better recollection of spatial location and temporal context. In contrast, we found a double-dissociation between the subjective sense of remembering and memory accuracy for colored dots placed in the conceptual center of negative and neutral scenes. These findings show that the enhanced subjective recollective experience for negative stimuli reliably indicates objective recollection for spatial location and temporal context, but not for other types of details, whereas for neutral stimuli, the subjective sense of remembering is coupled with all the types of details assessed. Translating this finding to flashbulb memories, we found that, over time, more participants correctly remembered the location where they learned about the terrorist attacks on 9/11 than any other canonical feature. Likewise, participants’ confidence was higher in their memory for location vs. other canonical features. These findings indicate that the strong recollective experience of a negative event corresponds to an accurate memory for some kinds of contextual details, but not other kinds. This discrepancy provides further evidence that the subjective sense of remembering negative events is driven by a different mechanism than the subjective sense of remembering neutral events. PMID:22642353

  14. Social-aware Event Handling within the FallRisk Project.

    PubMed

    De Backere, Femke; Van den Bergh, Jan; Coppers, Sven; Elprama, Shirley; Nelis, Jelle; Verstichel, Stijn; Jacobs, An; Coninx, Karin; Ongenae, Femke; De Turck, Filip

    2017-01-09

    With the rise of the Internet of Things, wearables and smartphones are moving to the foreground. Ambient Assisted Living solutions are, for example, created to facilitate ageing in place. Fall detection systems are one example of such systems. Currently, there exists a wide variety of fall detection systems using different methodologies and technologies. However, these systems often do not take into account the fall handling process, which starts after a fall is identified, or this process only consists of sending a notification. The FallRisk system delivers an accurate analysis of incidents occurring in the home of the older adults using several sensors and smart devices. Moreover, the input from these devices can be used to create a social-aware event handling process, which leads to assisting the older adult as soon as possible and in the best possible way. The FallRisk system consists of several components, located in different places. When an incident is identified by the FallRisk system, the event handling process will be followed to assess the fall incident and select the most appropriate caregiver, based on the input of the smartphones of the caregivers. In this process, availability and location are automatically taken into account. The event handling process was evaluated during a decision tree workshop to verify whether current day practices reflect the requirements of all the stakeholders. Other knowledge uncovered during this workshop can be taken into account to further improve the process. The FallRisk system offers a way to detect fall incidents in an accurate way and uses context information to assign the incident to the most appropriate caregiver. This way, the consequences of the fall are minimized and help is on location as fast as possible. It could be concluded that the current guidelines on fall handling reflect the needs of the stakeholders. However, current technology evolutions, such as the uptake of wearables and smartphones, enables

  15. Improved Location of Microseismic Events in Borehole Monitoring by Inclusion of Particle Motion Analysis: a Case Study at a CBM Field in Indonesia

    NASA Astrophysics Data System (ADS)

    Verdhora Ry, Rexha; Septyana, T.; Widiyantoro, S.; Nugraha, A. D.; Ardjuna, A.

    2017-04-01

    Microseismic monitoring and the constraint of hypocenters in and around hydrocarbon reservoirs provide insight into induced deformation related to hydraulic fracturing. In this study, we used data from a single vertical array of sensors in a borehole, providing measures of arrival times and polarizations. Microseismic events are located using 1-D velocity models and arrival times of P- and S-waves. However, when all the sensors are deployed in a near-vertical borehole, there is a high ambiguity in the source location. Herein, we applied a procedure using the azimuth of P-wave particle motion to constrain and improve the source location. We used a dataset acquired during one day of fracture stimulation at a CBM field in Indonesia. We applied a five-step location procedure to investigate microseismic events induced by these hydraulic fracturing activities. First, arrival times for 1584 candidate events were manually picked. Then we refined the arrival times using an energy-ratio method to obtain highly consistent picks. Using these arrival times, we estimated back-azimuths using P-wave polarization analysis. We also added polarity analysis to remove the 180° ambiguity. Finally, we determined hypocenter locations using a grid-search method, guided by the back-azimuth trace area, to minimize the misfit function of arrival times. We have successfully removed the ambiguity and produced a good solution for hypocenter locations, as indicated statistically by small RMS values. Most of the event clusters highlight coherent structures around the treatment well site and reveal faults. The same procedure can be applied to various other cases such as microseismic monitoring in geothermal and shale gas/oil exploration, as well as CCS (Carbon Capture and Storage) development.
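
    The back-azimuth step in this procedure relies on P-wave particle motion: over a short window around the P arrival, the dominant direction of horizontal motion lies along the source-receiver azimuth, and comparison with the vertical component can resolve the 180° ambiguity. The sketch below illustrates that idea with a simple covariance eigenvector and an idealized polarity convention; it is not the authors' workflow, and the synthetic example is invented.

      import numpy as np

      def p_wave_back_azimuth(north, east, vertical):
          """Estimate the back-azimuth (degrees from north) from a P-wave window.

          The principal eigenvector of the horizontal covariance matrix gives
          the azimuth line; correlation of the radial motion with the vertical
          component resolves the 180-degree ambiguity (idealized convention:
          compressive first motion moves the ground up and away from the source).
          """
          cov = np.cov(np.vstack([north, east]))
          eigvals, eigvecs = np.linalg.eigh(cov)
          n_dir, e_dir = eigvecs[:, np.argmax(eigvals)]   # dominant horizontal direction

          radial = n_dir * north + e_dir * east
          if np.dot(radial, vertical) > 0:                # points away from source: flip
              n_dir, e_dir = -n_dir, -e_dir
          return float(np.degrees(np.arctan2(e_dir, n_dir)) % 360.0)

      # synthetic compressive P arrival from a back-azimuth of about 60 degrees
      rng = np.random.default_rng(3)
      pulse = np.exp(-np.linspace(-2.0, 2.0, 200) ** 2)
      baz = np.radians(60.0)
      north = -np.cos(baz) * pulse + 0.02 * rng.normal(size=200)
      east = -np.sin(baz) * pulse + 0.02 * rng.normal(size=200)
      vertical = pulse + 0.02 * rng.normal(size=200)
      print("estimated back-azimuth (deg):", round(p_wave_back_azimuth(north, east, vertical), 1))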

  16. A data-based model to locate mass movements triggered by seismic events in Sichuan, China.

    PubMed

    de Souza, Fabio Teodoro

    2014-01-01

    Earthquakes affect the entire world and have catastrophic consequences. On May 12, 2008, an earthquake of magnitude 7.9 on the Richter scale occurred in the Wenchuan area of Sichuan province in China. This event, together with subsequent aftershocks, caused many avalanches, landslides, debris flows, collapses, and quake lakes and induced numerous unstable slopes. This work proposes a methodology that uses a data mining approach and geographic information systems to predict these mass movements based on their association with the main and aftershock epicenters, geologic faults, riverbeds, and topography. A dataset comprising 3,883 mass movements is analyzed, and some models to predict the location of these mass movements are developed. These predictive models could be used by the Chinese authorities as an important tool for identifying risk areas and rescuing survivors during similar events in the future.

  17. One dimensional P wave velocity structure of the crust beneath west Java and accurate hypocentre locations from local earthquake inversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Supardiyono; Santosa, Bagus Jaya; Physics Department, Faculty of Mathematics and Natural Sciences, Sepuluh Nopember Institute of Technology, Surabaya

    A one-dimensional (1-D) velocity model and station corrections for the West Java zone were computed by inverting P-wave arrival times recorded on a local seismic network of 14 stations. A total of 61 local events with a minimum of 6 P-phases, rms 0.56 s and a maximum gap of 299° were selected. Comparison with previous earthquake locations shows an improvement for the relocated earthquakes. Tests were carried out to verify the robustness of the inversion results in order to corroborate the conclusions drawn from our research. The obtained minimum 1-D velocity model can be used to improve routine earthquake locations and represents a further step toward more detailed seismotectonic studies in this area of West Java.

  18. A robust statistical estimation (RoSE) algorithm jointly recovers the 3D location and intensity of single molecules accurately and precisely

    NASA Astrophysics Data System (ADS)

    Mazidi, Hesam; Nehorai, Arye; Lew, Matthew D.

    2018-02-01

    In single-molecule (SM) super-resolution microscopy, the complexity of a biological structure, high molecular density, and a low signal-to-background ratio (SBR) may lead to imaging artifacts without a robust localization algorithm. Moreover, engineered point spread functions (PSFs) for 3D imaging pose difficulties due to their intricate features. We develop a Robust Statistical Estimation algorithm, called RoSE, that enables joint estimation of the 3D location and photon counts of SMs accurately and precisely using various PSFs under conditions of high molecular density and low SBR.

  19. A Bayesian framework for infrasound location

    NASA Astrophysics Data System (ADS)

    Modrak, Ryan T.; Arrowsmith, Stephen J.; Anderson, Dale N.

    2010-04-01

    We develop a framework for location of infrasound events using backazimuth and infrasonic arrival times from multiple arrays. Bayesian infrasonic source location (BISL) developed here estimates event location and associated credibility regions. BISL accounts for unknown source-to-array path or phase by formulating infrasonic group velocity as random. Differences between observed and predicted source-to-array traveltimes are partitioned into two additive Gaussian sources, measurement error and model error, the second of which accounts for the unknown influence of wind and temperature on path. By applying the technique to both synthetic tests and ground-truth events, we highlight the complementary nature of back azimuths and arrival times for estimating well-constrained event locations. BISL is an extension to methods developed earlier by Arrowsmith et al. that provided simple bounds on location using a grid-search technique.
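
    The ingredients summarized here (arrival times plus back-azimuths, with additive Gaussian measurement and model errors) lend themselves to a simple grid evaluation of the posterior. The sketch below is not the BISL code: it assumes a single nominal group velocity, fixed error standard deviations, flat geometry in kilometres, and a closed-form origin time, all purely for illustration.

      import numpy as np

      # Toy Bayesian grid location from arrival times and back-azimuths at arrays.
      # Illustrative assumptions only: flat geometry (km), one nominal group
      # velocity, fixed Gaussian sigmas lumping measurement and model error.

      V_GROUP = 0.30   # km/s, assumed infrasonic group velocity
      SIG_T = 15.0     # s, combined traveltime error
      SIG_AZ = 5.0     # degrees, back-azimuth error

      arrays = np.array([[0.0, 0.0], [200.0, 30.0], [80.0, 180.0]])  # (east, north) km
      true_src, true_t0 = np.array([120.0, 90.0]), 50.0

      # simulate noisy observations from the "true" source
      rng = np.random.default_rng(4)
      dist_true = np.linalg.norm(arrays - true_src, axis=1)
      obs_t = true_t0 + dist_true / V_GROUP + rng.normal(0.0, 5.0, len(arrays))
      obs_az = np.degrees(np.arctan2(true_src[0] - arrays[:, 0],
                                     true_src[1] - arrays[:, 1])) % 360.0

      # evaluate an (unnormalised) posterior on a spatial grid
      xs = ys = np.linspace(0.0, 250.0, 126)
      post = np.zeros((len(xs), len(ys)))
      for i, x in enumerate(xs):
          for j, y in enumerate(ys):
              dist = np.hypot(arrays[:, 0] - x, arrays[:, 1] - y)
              tt = dist / V_GROUP
              t0 = np.mean(obs_t - tt)                    # best-fitting origin time
              dt = obs_t - (t0 + tt)
              pred_az = np.degrees(np.arctan2(x - arrays[:, 0],
                                              y - arrays[:, 1])) % 360.0
              daz = (obs_az - pred_az + 180.0) % 360.0 - 180.0
              post[i, j] = np.exp(-0.5 * np.sum((dt / SIG_T) ** 2 + (daz / SIG_AZ) ** 2))

      best = np.unravel_index(np.argmax(post), post.shape)
      print("MAP location (east, north) km:", xs[best[0]], ys[best[1]], "true:", true_src)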

  20. Sources of Infrasound events listed in IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick

    2017-04-01

    Until 2003, two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. In the beginning of 2010 the infrasound technology was reintroduced to IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which will significantly improve event location. Example sources of REB events detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile 2015) and large surface explosions (e.g. Tianjin, China 2015). Quarry blasts (e.g. Zheleznogorsk) and large earthquakes (e.g. Italy 2016) belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations to quarry blast events may help to obtain better source locations. The role of IDC analysts is to verify and improve the location of events detected by the automatic system and to add events which were missed in the automatic process. Open-source materials may help to identify the nature of some events. Well-recorded examples may be added to the Reference Infrasound Event Database to help in the analysis process. This presentation will provide examples of events generated by different sources which were included in the IDC bulletins.

  1. Seismic Monitoring of Ice Generated Events at the Bering Glacier

    NASA Astrophysics Data System (ADS)

    Fitzgerald, K.; Richardson, J.; Pennington, W.

    2008-12-01

    The Bering Glacier, located in southeast Alaska, is the largest glacier in North America with a surface area of approximately 5,175 square kilometers. It extends from its source in the Bagley Icefield to its terminus in tidal Vitus Lake, which drains into the Gulf of Alaska. It is known that the glacier progresses downhill through the mechanisms of plastic crystal deformation and basal sliding. However, the basal processes which take place tens to hundreds of meters below the surface are not well understood, except through the study of sub-glacial landforms and passive seismology. Additionally, the sub-glacial processes enabling the surges, which occur approximately every two decades, are poorly understood. Two summer field campaigns in 2007 and 2008 were designed to investigate this process near the terminus of the glacier. During the summer of 2007, a field experiment at the Bering Glacier was conducted using a sparse array of L-22 short period sensors to monitor ice-related events. The array was in place for slightly over a week in August and consisted of five stations centered about the final turn of the glacier west of the Grindle Hills. Many events were observed, but due to the large distance between stations and the highly attenuating surface ice, few events were large enough to be recorded on sufficient stations to be accurately located and described. During August 2008, six stations were deployed for a similar length of time, but with a closer spacing. With this improved array, events were located and described more accurately, leading to additional conclusions about the surface, interior, and sub-glacial ice processes producing seismic signals. While the glacier was not surging during the experiment, this study may provide information on the non-surging, sub-glacial base level activity. It is generally expected that another surge will take place within a few years, and baseline studies such as this may assist in understanding the nature of surges.

  2. Regional Seismic Travel-Time Prediction, Uncertainty, and Location Improvement in Western Eurasia

    NASA Astrophysics Data System (ADS)

    Flanagan, M. P.; Myers, S. C.

    2004-12-01

    sample WENA1.0 and therefore provide an unbiased assessment of location performance. A statistically significant sample is achieved by generating 500 location realizations based on 5 events with location accuracy between 1 km and 5 km. Each realization is a randomly selected event with location determined by randomly selecting 5 stations from the available network. In 340 cases (68% of the instances), locations are improved, and average mislocation is reduced from 31 km to 26 km. Preliminary tests of uncertainty estimates suggest that our uncertainty model produces location uncertainty ellipses that are representative of location accuracy. These results highlight the importance of accurate GT datasets in assessing regional travel-time models and demonstrate that an a priori 3D model can markedly improve our ability to locate small magnitude events in a regional monitoring context. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-CONF-206386.

  3. Location negative priming effects in children with developmental dyslexia: An event-related potential study.

    PubMed

    Ma, Yujun; Wang, Enguo; Yuan, Tian; Zhao, Guo Xiang

    2016-08-01

    As the reading process is inseparable from working memory, inhibition, and other higher cognitive processes, the deep cognitive processing defects that are associated with dyslexia may be due to defective distraction inhibition systems. In this study, we used event-related potential technology to explore the source of negative priming effects in children with developmental dyslexia and in a group of healthy children for comparison. We found that the changes in the average response times in the negative priming and control conditions were consistent across the two groups, while the negative priming effects differed significantly between the groups. The magnitude of the negative priming effect was significantly different between the two groups, with the magnitude being significantly higher in the control group than it was in the developmental dyslexia group. These results indicate that there are deficits in distraction inhibition in children with developmental dyslexia. In terms of the time course of processing, inhibition deficits in the dyslexia group appeared during early-stage cognition selection and lasted through the response selection phase. Regarding the cerebral cortex locations, early-stage cognition selection was mainly located in the parietal region, while late-stage response selection was mainly located in the frontal and central regions. The results of our study may help further our understanding of the intrinsic causes of developmental dyslexia. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Pyroclastic density current hazard maps at Campi Flegrei caldera (Italy): the effects of event scale, vent location and time forecasts.

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina

    2016-04-01

    Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of this volcano an event with huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping of background or long-term PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty on the eruptive record, affecting the time of past events, the location of vents as well as the estimates of PDC areal extent. First probability maps of PDC invasion were produced combining a vent-opening probability map, statistical estimates concerning the eruptive scales and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced by using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation tested with 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of eruption scale, vent location and time of forecast of the next event. Specific focus was given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis allowed us to identify areas with elevated probabilities of flow invasion as a function of the diverse assumptions made. With the quantification of some sources of uncertainty in relation to the system, we were also able to provide mean and percentile maps of PDC hazard levels.

  5. Nuclear event time histories and computed site transfer functions for locations in the Los Angeles region

    USGS Publications Warehouse

    Rogers, A.M.; Covington, P.A.; Park, R.B.; Borcherdt, R.D.; Perkins, D.M.

    1980-01-01

    This report presents a collection of Nevada Test Site (NTS) nuclear explosion recordings obtained at sites in the greater Los Angeles, Calif., region. The report includes ground velocity time histories as well as derived site transfer functions. These data have been collected as part of a study to evaluate the validity of using low-level ground motions to predict the frequency-dependent response of a site during an earthquake. For this study 19 nuclear events were recorded at 98 separate locations. Some of these sites have recorded more than one of the nuclear explosions, and, consequently, there are a total of 159 three-component station records. The locations of all the recording sites are shown in figures 1–5, and the station coordinates and abbreviations are given in table 1. The station addresses are listed in table 2, and the nuclear explosions that were recorded are listed in table 3. The recording sites were chosen on the basis of three criteria: (1) that the underlying geological conditions were representative of conditions over significant areas of the region, (2) that the site was the location of a strong-motion recording of the 1971 San Fernando earthquake, or (3) that more complete geographical coverage was required in that location.

  6. Three-dimensional Probabilistic Earthquake Location Applied to 2002-2003 Mt. Etna Eruption

    NASA Astrophysics Data System (ADS)

    Mostaccio, A.; Tuve', T.; Zuccarello, L.; Patane', D.; Saccorotti, G.; D'Agostino, M.

    2005-12-01

    Seismicity recorded at Mt. Etna volcano during the 2002-2003 eruption has been relocated using a probabilistic, non-linear, earthquake location approach. We used the software package NonLinLoc (Lomax et al., 2000), adopting the 3D velocity model obtained by Cocina et al., 2005. We processed our data with different algorithms: (1) a grid search; (2) a Metropolis-Gibbs algorithm; and (3) an Oct-tree search. The Oct-tree algorithm gives efficient, fast and accurate mapping of the PDF (Probability Density Function) of the earthquake location problem. More than 300 seismic events were analyzed in order to compare non-linear location results with the ones obtained using traditional, linearized earthquake location algorithms such as Hypoellipse, and a 3D linearized inversion (Thurber, 1983). Moreover, we compare 38 focal mechanisms, chosen following strict selection criteria, with the ones obtained from the 3D and 1D results. Although the presented approach is more of a traditional relocation application, probabilistic earthquake location could be used in routine surveys.

  7. Separation of Benign and Malicious Network Events for Accurate Malware Family Classification

    DTIC Science & Technology

    2015-09-28

    use Kullback-Leibler (KL) divergence [15] to measure the information ... related work in an important aspect concerning the order of events. We use n-grams to capture the order of events, which exposes richer information about ... Using n-grams on higher-level network events helps understand the underlying operation of the malware, and provides a good feature set

  8. An automatic procedure for high-resolution earthquake locations: a case study from the TABOO near fault observatory (Northern Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Valoroso, Luisa; Chiaraluce, Lauro; Di Stefano, Raffaele; Latorre, Diana; Piccinini, Davide

    2014-05-01

    The characterization of the geometry, kinematics and rheology of fault zones from seismological data depends on our capability to accurately locate the largest possible number of low-magnitude seismic events. To this aim, we have been working for the past three years to develop an advanced modular earthquake location procedure able to automatically retrieve high-resolution earthquake catalogues directly from continuous waveform data. We use seismograms recorded at about 60 seismic stations located both at the surface and at depth. The network covers an area of about 80x60 km with a mean inter-station distance of 6 km. These stations are part of a Near Fault Observatory (TABOO; http://taboo.rm.ingv.it/), consisting of multi-sensor stations (seismic, geodetic, geochemical and electromagnetic). This permanent scientific infrastructure, managed by the INGV, is devoted to studying the earthquake preparatory phase and the fast/slow (i.e., seismic/aseismic) deformation processes active along the Alto Tiberina fault (ATF) located in the northern Apennines (Italy). The ATF is potentially one of the rare worldwide examples of an active low-angle (< 15°) normal fault accommodating crustal extension and characterized by a regular occurrence of micro-earthquakes. The modular procedure combines: i) a sensitive detection algorithm optimized to declare low-magnitude events; ii) an accurate picking procedure that provides consistently weighted P- and S-wave arrival times, P-wave first-motion polarities and the maximum waveform amplitude for local magnitude calculation; iii) both linearized iterative and non-linear global-search earthquake location algorithms to compute accurate absolute locations of single events in a 3D geological model (see Latorre et al., same session); iv) cross-correlation and double-difference location methods to compute high-resolution relative event locations. This procedure is now running off-line with a delay of 1 week relative to real time. We are now implementing this

  9. Location of EMIC Wave Events Relative to the Plasmapause: Van Allen Probes Observations

    NASA Astrophysics Data System (ADS)

    Tetrick, S.; Engebretson, M. J.; Posch, J. L.; Kletzing, C.; Smith, C. W.; Wygant, J. R.; Gkioulidou, M.; Reeves, G. D.; Fennell, J. F.

    2015-12-01

    Many early theoretical studies of electromagnetic ion cyclotron (EMIC) waves generated in Earth's magnetosphere predicted that the equatorial plasmapause (PP) would be a preferred location for their generation. However, several large statistical studies in the past two decades, most notably Fraser and Nguyen [2001], have provided little support for this location. In this study we present a survey of the most intense EMIC waves observed by the EMFISIS fluxgate magnetometer on the Van Allen Probes-A spacecraft (with apogee at 5.9 RE) from its launch through the end of 2014, and have compared their location with simultaneous electron density data obtained by the EFW electric field instrument and ring current ion flux data obtained by the HOPE and RBSPICE instruments. We show distributions of these waves as a function of distance inside or outside the PP as a function of local time sector, frequency band (H+, He+, or both), and timing relative to magnetic storms and substorms. Most EMIC waves in this data set occurred within 1 RE of the PP in all local time sectors, but very few were limited to ± 0.1 RE, and most of these occurred in the 06-12 MLT sector during non-storm conditions. The majority of storm main phase waves in the dusk sector occurred inside the PP. He+ band waves dominated at most local times inside the PP, and H+ band waves were never observed there. Although the presence of elevated fluxes of ring current protons was common to all events, the configuration of lower energy ion populations varied as a function of geomagnetic activity and storm phase.

  10. Location of acoustic emission sources generated by air flow

    PubMed

    Kosel; Grabec; Muzic

    2000-03-01

    The location of continuous acoustic emission sources is a difficult problem of non-destructive testing. This article describes one-dimensional location of continuous acoustic emission sources by using an intelligent locator. The intelligent locator solves a location problem based on learning from examples. To verify whether continuous acoustic emission caused by leakage air flow can be located accurately by the intelligent locator, an experiment on a thin aluminum band was performed. Results show that it is possible to determine an accurate location by using a combination of a cross-correlation function with an appropriate bandpass filter. By using this combination, discrete and continuous acoustic emission sources can be located by using discrete acoustic emission sources for locator learning.
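
    The combination reported here, a bandpass filter followed by cross-correlation of the signals at two sensors, yields a time delay, and on a one-dimensional band that delay maps directly to a source position between the sensors. Below is a hedged sketch of that calculation; the wave speed, sensor spacing, filter band, and synthetic burst are invented numbers, not the experiment's.

      import numpy as np
      from scipy.signal import butter, filtfilt

      FS = 1.0e6              # Hz, assumed sampling rate
      WAVE_SPEED = 5000.0     # m/s, assumed propagation speed in the band (assumption)
      SENSOR_SPACING = 0.80   # m, assumed distance between the two sensors

      def locate_1d(sig_a, sig_b, fs=FS, c=WAVE_SPEED, length=SENSOR_SPACING):
          """One-dimensional source position (measured from sensor A).

          The lag of the cross-correlation maximum gives the arrival-time
          difference t_A - t_B; with t_A = x/c and t_B = (length - x)/c this
          yields x = (length + c * delay) / 2.
          """
          # bandpass both channels before correlating (band is an assumption)
          b, a = butter(4, [50e3, 200e3], btype="band", fs=fs)
          fa, fb = filtfilt(b, a, sig_a), filtfilt(b, a, sig_b)

          xc = np.correlate(fa, fb, mode="full")
          lag = int(np.argmax(xc)) - (len(fb) - 1)   # samples by which A lags B
          delay = lag / fs                           # seconds, t_A - t_B
          return 0.5 * (length + c * delay)

      # synthetic burst from a source 0.30 m away from sensor A
      n = 4096
      t = np.arange(400) / FS
      burst = np.sin(2.0 * np.pi * 120e3 * t) * np.hanning(400)
      sig_a = np.zeros(n)
      sig_a[60:460] = burst     # arrives at sensor A after 60 samples
      sig_b = np.zeros(n)
      sig_b[100:500] = burst    # arrives at sensor B after 100 samples
      print("estimated position from sensor A (m):", round(locate_1d(sig_a, sig_b), 3))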

  11. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter, with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and deltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the deltaT location method but requires no initial acoustic calibration of the structure.

  12. The 2012 Emilia seismic sequence (Northern Italy): Imaging the thrust fault system by accurate aftershock location

    NASA Astrophysics Data System (ADS)

    Govoni, Aladino; Marchetti, Alessandro; De Gori, Pasquale; Di Bona, Massimo; Lucente, Francesco Pio; Improta, Luigi; Chiarabba, Claudio; Nardi, Anna; Margheriti, Lucia; Agostinetti, Nicola Piana; Di Giovambattista, Rita; Latorre, Diana; Anselmi, Mario; Ciaccio, Maria Grazia; Moretti, Milena; Castellano, Corrado; Piccinini, Davide

    2014-05-01

    Starting in late May 2012, the Emilia region (Northern Italy) was severely shaken by an intense seismic sequence that originated with a ML 5.9 earthquake on May 20th, at a hypocentral depth of 6.3 km, with a thrust-type focal mechanism. In the following days, the seismic rate remained high, with 50 ML ≥ 2.0 earthquakes a day, on average. Seismicity spread along a 30 km east-west elongated area in the Po river alluvial plain, near the cities of Ferrara and Modena. Nine days after the first shock, another destructive thrust-type earthquake (ML 5.8) hit the area to the west, causing further damage and fatalities. Aftershocks following this second destructive event extended along the same east-westerly trend for a further 20 km to the west, thus illuminating an area of about 50 km in length overall. After the first shock struck, on May 20th, a dense network of temporary seismic stations, in addition to the permanent ones, was deployed in the meizoseismal area, leading to an appreciable improvement of the earthquake monitoring capability there. A combined dataset, including three-component seismic waveforms recorded by both permanent and temporary stations, has been analyzed in order to obtain an appropriate 1-D velocity model for earthquake location in the study area. Here we describe the main seismological characteristics of this seismic sequence and, relying on refined earthquake locations, we make inferences on the geometry of the thrust system responsible for the two strongest shocks.

  13. aPPRove: An HMM-Based Method for Accurate Prediction of RNA-Pentatricopeptide Repeat Protein Binding Events

    PubMed Central

    Harrison, Thomas; Ruiz, Jaime; Sloan, Daniel B.; Ben-Hur, Asa; Boucher, Christina

    2016-01-01

    Pentatricopeptide repeat containing proteins (PPRs) bind to RNA transcripts originating from mitochondria and plastids. There are two classes of PPR proteins. The P class contains tandem P-type motif sequences, and the PLS class contains alternating P, L and S type sequences. In this paper, we describe a novel tool that predicts PPR-RNA interaction; specifically, our method, which we call aPPRove, determines where and how a PLS-class PPR protein will bind to RNA when given a PPR and one or more RNA transcripts by using a combinatorial binding code for site specificity proposed by Barkan et al. Our results demonstrate that aPPRove successfully locates how and where a PPR protein belonging to the PLS class can bind to RNA. For each binding event it outputs the binding site, the amino-acid-nucleotide interaction, and its statistical significance. Furthermore, we show that our method can be used to predict binding events for PLS-class proteins using a known edit site and the statistical significance of aligning the PPR protein to that site. In particular, we use our method to make a conjecture regarding an interaction between CLB19 and the second intronic region of ycf3. The aPPRove web server can be found at www.cs.colostate.edu/~approve. PMID:27560805

  14. Full waveform approach for the automatic detection and location of acoustic emissions from hydraulic fracturing at Äspö (Sweden)

    NASA Astrophysics Data System (ADS)

    Ángel López Comino, José; Cesca, Simone; Heimann, Sebastian; Grigoli, Francesco; Milkereit, Claus; Dahm, Torsten; Zang, Arno

    2017-04-01

    A crucial issue in analysing induced seismicity from hydraulic fracturing is the detection and location of massive microseismic or acoustic emission (AE) activity with robust and sufficiently accurate automatic algorithms. Waveform stacking and coherence analysis have been tested for local seismic monitoring and mining-induced seismicity, improving on the classical detection and location methods (e.g. short-term-average/long-term-average and automatic picking of the P- and S-wave first arrivals). These techniques are here applied using a full waveform approach to a hydraulic fracturing experiment (Nova project 54-14-1) that took place 410 m below the surface in the Äspö Hard Rock Laboratory (Sweden). Continuous waveform recordings from a near-field network composed of eleven AE sensors are processed. The piezoelectric sensors have their highest sensitivity in the frequency range 1 to 100 kHz, but sampling rates were extended to 1 MHz. We present the results obtained during the conventional, continuous water-injection experiment HF2 (Hydraulic Fracture 2). The event detector is based on the stacking of characteristic functions. It follows a delay-and-stack approach, where the likelihood of the hypocenter location in a pre-selected seismogenic volume is mapped by assessing the coherence of the P onset times at different stations. A low detector threshold is chosen in order not to lose weaker events. This approach also increases the number of false detections. Therefore, the dataset has been revised manually, and detected events classified in terms of true AE events related to the fracturing process, electronic noise related to 50 Hz overtones, long-period signals and other signals. The location of the AE events is further refined using a more accurate waveform stacking method which uses both P and S phases. A 3D grid is generated around the hydraulic fracturing volume and we retrieve a multidimensional matrix, whose absolute maximum corresponds to the spatial coordinates of the

  15. 76 FR 69613 - Special Local Regulations and Safety Zones; Recurring Events in Captain of the Port New York Zone

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-09

    ... regulatory text: Safety zones listed in Table 1 to Sec. 165.160 have been reordered and renumbered to more accurately reflect their geographical locations. In the published NPRM regulatory text, the Rumson, NJ... included in this recurring events regulation. In the regulatory text under Sec. 165.160(a), the existing...

  16. Are pain location and physical examinations useful in locating a tear site of the rotator cuff?

    PubMed

    Itoi, Eiji; Minagawa, Hiroshi; Yamamoto, Nobuyuki; Seki, Nobutoshi; Abe, Hidekazu

    2006-02-01

    Pain is the most common symptom of patients with rotator cuff tendinopathy, but little is known about the relationship between the site of pain and the site of cuff pathologic lesions. Also, accuracies of physical examinations used to locate a tear by assessing the muscle strength seem to be affected by the threshold for muscle weakness, but no studies have been reported regarding the efficacies of physical examinations in reference to their threshold. Pain location is useful in locating a tear site. Efficacies of physical examinations to evaluate the function of the cuff muscles depend on the threshold for muscle weakness. Case series; Level of evidence, 4. The authors retrospectively reviewed the clinical charts of 160 shoulders of 149 patients (mean age, 53 years) with either rotator cuff tears (140 shoulders) or cuff tendinitis (20 shoulders). The location of pain was recorded on a standardized form with 6 different areas. The diagnostic accuracies of the following tests were assessed with various thresholds for muscle weakness: supraspinatus test, the external rotation strength test, and the lift-off test. Lateral and anterior portions of the shoulder were the most common sites of pain regardless of existence of tear or tear location. The supraspinatus test was most accurate when it was assessed to have positive results with the muscle strength less than manual muscle testing grade 5, whereas the lift-off test was most accurate with a threshold less than grade 3. The external rotation strength test was most accurate with a threshold of less than grade 4+. The authors conclude that pain location is not useful in locating the site of a tear, whereas the physical examinations aiming to locate the tear site are clinically useful when assessed to have positive results with appropriate threshold for muscle weakness.

  17. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device called a posturograph is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
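
    The paper's own definition language is not reproduced here, but the underlying idea, declaring an event as a named predicate over a window of a sensor time series and then scanning for the regions that satisfy it, can be illustrated with a small sketch. The window length, threshold, and the 'sway burst' event below are hypothetical and chosen only to exercise the code.

      import numpy as np

      def find_events(series, predicate, window):
          """Return (start, end) sample-index pairs of maximal runs where
          `predicate` holds for the length-`window` window ending at each sample."""
          flags = np.zeros(len(series), dtype=bool)
          for i in range(window, len(series) + 1):
              flags[i - 1] = predicate(series[i - window:i])
          events, start = [], None
          for i, flag in enumerate(flags):
              if flag and start is None:
                  start = i
              elif not flag and start is not None:
                  events.append((start, i))
                  start = None
          if start is not None:
              events.append((start, len(flags)))
          return events

      # hypothetical event definition for a posturography-like pressure series:
      # a "sway burst" is any region where the windowed standard deviation
      # exceeds three times the overall baseline variability
      rng = np.random.default_rng(7)
      pressure = rng.normal(0.0, 1.0, 2000)
      pressure[800:900] += rng.normal(0.0, 5.0, 100)      # injected burst

      baseline = np.std(pressure)

      def sway_burst(window_values):
          return np.std(window_values) > 3.0 * baseline

      print("detected events:", find_events(pressure, sway_burst, window=50))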

  18. When the Sky Falls: Performing Initial Assessments of Bright Atmospheric Events

    NASA Technical Reports Server (NTRS)

    Cooke, William J.; Brown, Peter; Blaauw, Rhiannon; Kingery, Aaron; Moser, Danielle

    2015-01-01

    The 2013 Chelyabinsk super bolide was the first "significant" impact event to occur in the age of social media and 24 hour news. Scientists, used to taking many days or weeks to analyze fireball events, were hard pressed to meet the immediate demands (within hours) for answers from the media, general public, and government officials. Fulfilling these requests forced many researchers to exploit information available from various Internet sources - videos were downloaded from sites like Youtube, geolocated via Google Street View, and quickly analyzed with improvised software; Twitter and Facebook were scoured for eyewitness accounts of the fireball and reports of meteorites. These data, combined with infrasound analyses, enabled a fairly accurate description of the Chelyabinsk event to be formed within a few hours; in particular, any relationship to 2012 DA14 (which passed near Earth later that same day) was eliminated. Results of these analyses were quickly disseminated to members of the NEO community for press conferences and media interviews. Despite a few minor glitches, the rapid initial assessment of Chelyabinsk was a triumph, permitting the timely conveyance of accurate information to the public and the incorporation of social media into fireball analyses. Beginning in 2008, the NASA Meteoroid Environments Office, working in cooperation with Western's Meteor Physics Group, developed processes and software that permit quick characterization - mass, trajectory, and orbital properties - of fireball events. These tools include automated monitoring of Twitter to establish the time of events (the first tweet is usually no more than a few seconds after the fireball), mining of Youtube and all sky camera web archives to locate videos suitable for analyses, use of Google Earth and Street View to geolocate the video locations, and software to determine the fireball trajectory and object orbital parameters, including generation of animations suitable for popular media

  19. Seismicity Along the Endeavour Segment of the Juan de Fuca Ridge: Automated Event Locations for an Ocean-Bottom Seismometer Network

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Wilcock, W. S.; Hooft, E. E.; Toomey, D. R.; McGill, P. R.

    2004-12-01

    From 2003-2006, the W.M. Keck Foundation supported the operation of a network of eight ocean-bottom seismometers (OBSs) that were deployed with a remotely operated vehicle along the central portion of the Endeavour Segment of the Juan de Fuca mid-ocean ridge as part of a multidisciplinary prototype NEPTUNE experiment. Data from 2003-2004 were initially analyzed during a research apprenticeship class at the University of Washington's Friday Harbor Laboratories. Eight student analysts located ~13,000 earthquakes along the Endeavour Segment. Analysis of data from 2004-2005 has to date been limited to locating ~6,000 earthquakes associated with a swarm in February-March 2005 near the northern end of the Endeavour Segment. The remaining data includes several significant swarms and it is anticipated that tens of thousands of earthquakes still need to be located. In order to efficiently obtain a complete catalog of high-quality locations for the 3-year experiment, we are developing an automatic method for earthquake location. We first apply a 5-Hz high-pass filter and identify triggers when the ratio of the root-mean square (RMS) amplitudes in short- and long-term windows exceeds a specified threshold. We search for events that are characterized by triggers within a short time interval on the majority of stations and use the signal spectra to eliminate events that are the result of 20-Hz Fin and Blue whale vocalizations. An autoregressive technique is applied to a short time window centered on the trigger time to pick P-wave times on each station's vertical channel. We locate the earthquake with these picks and either attempt to repick or eliminate arrivals with unacceptable residuals. Preliminary S-wave picks are then made on the horizontal channels by applying a 5-12 Hz bandpass filter, identifying the peak RMS amplitude for a short running window, and making a pick at the time the RMS amplitude rises above 50% of this value. The picks are refined using the

  20. Seismicity Along the Endeavour Segment of the Juan de Fuca Ridge: Automated Event Locations for an Ocean-Bottom Seismometer Network

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Wilcock, W. S.; Hooft, E. E.; Toomey, D. R.; McGill, P. R.

    2007-12-01

    From 2003-2006, the W.M. Keck Foundation supported the operation of a network of eight ocean-bottom seismometers (OBSs) that were deployed with a remotely operated vehicle along the central portion of the Endeavour Segment of the Juan de Fuca mid-ocean ridge as part of a multidisciplinary prototype NEPTUNE experiment. Data from 2003-2004 were initially analyzed during a research apprenticeship class at the University of Washington's Friday Harbor Laboratories. Eight student analysts located ~13,000 earthquakes along the Endeavour Segment. Analysis of data from 2004-2005 has to date been limited to locating ~6,000 earthquakes associated with a swarm in February-March 2005 near the northern end of the Endeavour Segment. The remaining data includes several significant swarms and it is anticipated that tens of thousands of earthquakes still need to be located. In order to efficiently obtain a complete catalog of high-quality locations for the 3-year experiment, we are developing an automatic method for earthquake location. We first apply a 5-Hz high-pass filter and identify triggers when the ratio of the root-mean square (RMS) amplitudes in short- and long-term windows exceeds a specified threshold. We search for events that are characterized by triggers within a short time interval on the majority of stations and use the signal spectra to eliminate events that are the result of 20-Hz Fin and Blue whale vocalizations. An autoregressive technique is applied to a short time window centered on the trigger time to pick P-wave times on each station's vertical channel. We locate the earthquake with these picks and either attempt to repick or eliminate arrivals with unacceptable residuals. Preliminary S-wave picks are then made on the horizontal channels by applying a 5-12 Hz bandpass filter, identifying the peak RMS amplitude for a short running window, and making a pick at the time the RMS amplitude rises above 50% of this value. The picks are refined using the

  1. Earthquake location in transversely isotropic media with a tilted symmetry axis

    NASA Astrophysics Data System (ADS)

    Zhao, Aihua; Ding, Zhifeng

    2009-04-01

    The conventional intersection method for earthquake location in isotropic media is extended to transversely isotropic media with a tilted symmetry axis (TTI media). The hypocenter is determined using its loci, which are calculated through a minimum travel time tree algorithm for ray tracing in TTI media. There are no restrictions on the structural complexity of the model or on the anisotropy strength of the medium. The location method is validated by its application to determine the hypocenter and origin time of an event in a complex TTI structure, in accordance with four hypotheses or study cases: (a) accurate model and arrival times, (b) perturbed model with randomly varied elastic parameters, (c) noisy arrival time data, and (d) incomplete set of observations from the seismic stations. Furthermore, several numerical tests demonstrate that the orientation of the symmetry axis has a significant effect on the hypocenter location when the seismic anisotropy is not very weak. Moreover, if the hypocentral determination is based on an isotropic reference model while the real medium is anisotropic, the resultant location errors can be considerable even though the anisotropy strength does not exceed 6.10%.

  2. Experimental characterization of seasonal variations in infrasonic traveltimes on the Korean Peninsula with implications for infrasound event location

    NASA Astrophysics Data System (ADS)

    Che, Il-Young; Stump, Brian W.; Lee, Hee-Il

    2011-04-01

    The dependence of infrasound propagation on the season and path environment was quantified by the analysis of more than 1000 repetitive infrasonic ground-truth events at an active, open-pit mine over two years. Blast-associated infrasonic signals were analysed from two infrasound arrays (CHNAR and ULDAR) located at similar distances of 181 and 169 km, respectively, from the source but in different azimuthal directions and with different path environments. The CHNAR array is located to the NW of the source area with primarily a continental path, whereas the ULDAR is located east of the source with a path dominated by open ocean. As a result, CHNAR observations were dominated by stratospheric phases with characteristic celerities of 260-289 m s⁻¹ and large seasonal variations in the traveltime, whereas data from ULDAR consisted primarily of tropospheric phases with larger celerities from 322 to 361 m s⁻¹ and larger daily than seasonal variation in the traveltime. The interpretation of these observations is verified by ray tracing using atmospheric models incorporating daily weather balloon data that characterize the shallow atmosphere for the two years of the study. Finally, experimental celerity models that included seasonal path effects were constructed from the long-term data set. These experimental celerity models were used to constrain traveltime variations in infrasonic location algorithms, providing improved location estimates as illustrated with the empirical data set.
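
    As a rough illustration of how such celerity models constrain arrivals (not the authors' location code), the source-array distances and celerity bands quoted above translate directly into expected arrival-time windows:

        # Expected arrival-time windows implied by the celerity bands quoted above
        # (distances and celerities from the abstract; everything else is illustrative).
        paths = {
            "CHNAR, stratospheric": (181.0, (260.0, 289.0)),
            "ULDAR, tropospheric":  (169.0, (322.0, 361.0)),
        }
        for name, (range_km, (c_slow, c_fast)) in paths.items():
            t_early = range_km * 1000.0 / c_fast   # fastest celerity -> earliest arrival
            t_late = range_km * 1000.0 / c_slow    # slowest celerity -> latest arrival
            print(f"{name}: arrival expected {t_early:.0f}-{t_late:.0f} s after the blast")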

  3. Extracting Time-Accurate Acceleration Vectors From Nontrivial Accelerometer Arrangements.

    PubMed

    Franck, Jennifer A; Blume, Janet; Crisco, Joseph J; Franck, Christian

    2015-09-01

    Sports-related concussions are of significant concern in many impact sports, and their detection relies on accurate measurements of the head kinematics during impact. Among the most prevalent recording technologies are videography and, more recently, the use of single-axis accelerometers mounted in a helmet, such as the HIT system. Successful extraction of the linear and angular impact accelerations depends on an accurate analysis methodology governed by the equations of motion. Current algorithms are able to estimate the magnitude of acceleration and hit location, but make assumptions about the hit orientation and are often limited in the position and/or orientation of the accelerometers. The newly formulated algorithm presented in this manuscript accurately extracts the full linear and rotational acceleration vectors from a broad arrangement of six single-axis accelerometers directly from the governing set of kinematic equations. The new formulation linearizes the nonlinear centripetal acceleration term with a finite-difference approximation and provides a fast and accurate solution for all six components of acceleration over long time periods (>250 ms). The approximation of the nonlinear centripetal acceleration term provides an accurate computation of the rotational velocity as a function of time and allows for reconstruction of a multiple-impact signal. Furthermore, the algorithm determines the impact location and orientation and can distinguish between glancing, high-rotational-velocity impacts and direct impacts through the center of mass. Results are shown for ten simulated impact locations on a headform geometry computed with three different accelerometer configurations in varying degrees of signal noise. Since the algorithm does not require simplifications of the actual impacted geometry, the impact vector, or a specific arrangement of accelerometer orientations, it can be easily applied to many impact investigations in which accurate kinematics need
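
    The per-time-step solve implied by such a formulation can be sketched as follows, assuming six single-axis accelerometers with known positions and sensing axes; this is an illustrative reconstruction of the idea, not the published algorithm, and all inputs are hypothetical.

        import numpy as np

        def solve_impact_step(readings, positions, axes, omega_prev):
            # readings:   (6,) accelerations measured along each sensing axis
            # positions:  (6, 3) accelerometer positions relative to the head's center of mass
            # axes:       (6, 3) unit sensing directions
            # omega_prev: (3,) angular velocity carried over from the previous time step,
            #             used to linearize the centripetal term (finite-difference idea)
            A = np.zeros((6, 6))
            b = np.array(readings, dtype=float)
            for i in range(6):
                n, r = np.asarray(axes[i], float), np.asarray(positions[i], float)
                A[i, :3] = n               # projection of the linear acceleration a_cm
                A[i, 3:] = np.cross(r, n)  # n . (alpha x r) == (r x n) . alpha
                # centripetal term evaluated with the previous omega, moved to the right side
                b[i] -= n @ np.cross(omega_prev, np.cross(omega_prev, r))
            x = np.linalg.solve(A, b)      # requires a non-degenerate sensor arrangement
            a_cm, alpha = x[:3], x[3:]
            return a_cm, alpha             # omega is then updated by integrating alpha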

  4. Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events.

    PubMed

    Stekelenburg, Jeroen J; Vroomen, Jean

    2012-01-01

    In many natural audiovisual events (e.g., a clap of the two hands), the visual signal precedes the sound and thus allows observers to predict when, where, and which sound will occur. Previous studies have reported that there are distinct neural correlates of temporal (when) versus phonetic/semantic (which) content on audiovisual integration. Here we examined the effect of visual prediction of auditory location (where) in audiovisual biological motion stimuli by varying the spatial congruency between the auditory and visual parts. Visual stimuli were presented centrally, whereas auditory stimuli were presented either centrally or at 90° azimuth. Typical sub-additive amplitude reductions (AV - V < A) were found for the auditory N1 and P2 for spatially congruent and incongruent conditions. The new finding is that this N1 suppression was greater for the spatially congruent stimuli. A very early audiovisual interaction was also found at 40-60 ms (P50) in the spatially congruent condition, while no effect of congruency was found on the suppression of the P2. This indicates that visual prediction of auditory location can be coded very early in auditory processing.

  5. An iterative matching and locating technique for borehole microseismic monitoring

    NASA Astrophysics Data System (ADS)

    Chen, H.; Meng, X.; Niu, F.; Tang, Y.

    2016-12-01

    Microseismic monitoring has been proven to be an effective and valuable technology to image hydraulic fracture geometry. The success of hydraulic fracturing monitoring relies on the detection and characterization (i.e., location and focal mechanism estimation) of a maximum number of induced microseismic events. All the events are important to quantify the stimulated reservoir volume (SRV) and characterize the newly created fracture network. Detecting and locating low-magnitude events, however, are notoriously difficult, particularly in a noisy production environment. Here we propose an iterative matching and locating technique (iMLT) to obtain a maximum detection of small events and the best determination of their locations from continuous data recorded by a single-azimuth downhole geophone array. As the downhole array is located at a single azimuth, the regular M&L approach using P-wave cross-correlation only is not able to resolve the location of a matched event relative to the template event. We thus introduce the polarization direction into the matching, which significantly improves the lateral resolution of the M&L method based on numerical simulations with synthetic data. Our synthetic tests further indicate that the inclusion of S-wave cross-correlation data can help better constrain the focal depth of the matched events. We apply this method to a dataset recorded during hydraulic fracturing treatment of a pilot horizontal well within the shale play in southwest China. Our approach yields a more than fourfold increase in the number of located events, compared with the original event catalog from traditional downhole processing.
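
    The matched-filter core of an M&L-style detector is a normalized cross-correlation of a template event against continuous data; a minimal sketch is given below. The detection threshold is illustrative, and the polarization-direction and S-wave terms that distinguish iMLT from plain M&L are not included.

        import numpy as np

        def normalized_xcorr(template, trace):
            # sliding, fully normalized cross-correlation (values in [-1, 1]);
            # a plain loop for clarity -- production codes use FFT-based versions
            t = np.asarray(template, float)
            x = np.asarray(trace, float)
            n = len(t)
            t0 = (t - t.mean()) / (t.std() * n)
            cc = np.empty(len(x) - n + 1)
            for k in range(len(cc)):
                w = x[k:k + n]
                cc[k] = np.sum(t0 * (w - w.mean())) / max(w.std(), 1e-12)
            return cc

        def matched_filter_detect(template, trace, threshold=0.7):
            # threshold is illustrative; returns candidate offsets (samples) and the CC trace
            cc = normalized_xcorr(template, trace)
            return np.flatnonzero(cc >= threshold), cc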

  6. Exploring Large-Scale Cross-Correlation for Teleseismic and Regional Seismic Event Characterization

    NASA Astrophysics Data System (ADS)

    Dodge, Doug; Walter, William; Myers, Steve; Ford, Sean; Harris, Dave; Ruppert, Stan; Buttler, Dave; Hauk, Terri

    2013-04-01

    The decrease in costs of both digital storage space and computation power invites new methods of seismic data processing. At Lawrence Livermore National Laboratory (LLNL) we operate a growing research database of seismic events and waveforms for nuclear explosion monitoring and other applications. Currently the LLNL database contains several million events associated with tens of millions of waveforms at thousands of stations. We are making use of this database to explore the power of seismic waveform correlation to quantify signal similarities, to discover new events not in catalogs, and to more accurately locate events and identify source types. Building on the very efficient correlation methodologies of Harris and Dodge (2011) we computed the waveform correlation for event pairs in the LLNL database in two ways. First we performed entire waveform cross-correlation over seven distinct frequency bands. The correlation coefficient exceeds 0.6 for more than 40 million waveform pairs for several hundred thousand events at more than a thousand stations. These correlations reveal clusters of mining events and aftershock sequences, which can be used to readily identify and locate events. Second we determine relative pick times by correlating signals in time windows for distinct seismic phases. These correlated picks are then used to perform very high accuracy event relocations. We are examining the percentage of events that correlate as a function of magnitude and observing station distance in selected high seismicity regions. Combining these empirical results and those using synthetic data, we are working to quantify relationships between correlation and event pair separation (in epicenter and depth) as well as mechanism differences. Our exploration of these techniques on a large seismic database is in process and we will report on our findings in more detail at the meeting.

  7. Exploring Large-Scale Cross-Correlation for Teleseismic and Regional Seismic Event Characterization

    NASA Astrophysics Data System (ADS)

    Dodge, D.; Walter, W. R.; Myers, S. C.; Ford, S. R.; Harris, D.; Ruppert, S.; Buttler, D.; Hauk, T. F.

    2012-12-01

    The decrease in costs of both digital storage space and computation power invites new methods of seismic data processing. At Lawrence Livermore National Laboratory (LLNL) we operate a growing research database of seismic events and waveforms for nuclear explosion monitoring and other applications. Currently the LLNL database contains several million events associated with tens of millions of waveforms at thousands of stations. We are making use of this database to explore the power of seismic waveform correlation to quantify signal similarities, to discover new events not in catalogs, and to more accurately locate events and identify source types. Building on the very efficient correlation methodologies of Harris and Dodge (2011) we computed the waveform correlation for event pairs in the LLNL database in two ways. First we performed entire waveform cross-correlation over seven distinct frequency bands. The correlation coefficient exceeds 0.6 for more than 40 million waveform pairs for several hundred thousand events at more than a thousand stations. These correlations reveal clusters of mining events and aftershock sequences, which can be used to readily identify and locate events. Second we determine relative pick times by correlating signals in time windows for distinct seismic phases. These correlated picks are then used to perform very high accuracy event relocations. We are examining the percentage of events that correlate as a function of magnitude and observing station distance in selected high seismicity regions. Combining these empirical results and those using synthetic data, we are working to quantify relationships between correlation and event pair separation (in epicenter and depth) as well as mechanism differences. Our exploration of these techniques on a large seismic database is in process and we will report on our findings in more detail at the meeting.

  8. Time-reversal imaging techniques applied to tremor waveforms near Cholame, California to locate tectonic tremor

    NASA Astrophysics Data System (ADS)

    Horstmann, T.; Harrington, R. M.; Cochran, E. S.

    2012-12-01

    We use the three-dimensional velocity model of Thurber et al. (2006), interpolated to a grid spacing of 50 m. Such grid spacing corresponds to frequencies of up to 8 Hz, which is suitable to calculate the wave propagation of tremor. Our dataset contains continuous broadband data from 13 STS-2 seismometers deployed from May 2010 to July 2011 along the Cholame segment of the San Andreas Fault as well as data from the HRSN and PBO networks. Initial synthetic results from tests on a 2D plane using a line of 15 receivers suggest that we are able to recover accurate event locations to within 100 m horizontally and 300 m in depth. We conduct additional synthetic tests to determine the influence of signal-to-noise ratio, number of stations used, and the uncertainty in the velocity model on the location result by adding noise to the seismograms and perturbations to the velocity model. Preliminary results show location accuracy to within 400 m with a median signal-to-noise ratio of 3.5 and 5% perturbations in the velocity model. The next steps will entail performing the synthetic tests on the 3D velocity model, and applying the method to tremor waveforms. Furthermore, we will determine the spatial and temporal distribution of the source locations and compare our results to those of Sumy and others.

  9. Leisure and Pleasure: Science Events in Unusual Locations

    ERIC Educational Resources Information Center

    Bultitude, Karen; Sardo, Ana Margarida

    2012-01-01

    Building on concepts relating to informal science education, this work compares science-related activities which successfully engaged public audiences at three different "generic" locations: a garden festival, a public park, and a music festival. The purpose was to identify what factors contribute to the perceived success of science…

  10. Event structure and cognitive control.

    PubMed

    Reimer, Jason F; Radvansky, Gabriel A; Lorsbach, Thomas C; Armendarez, Joseph J

    2015-09-01

    Recently, a great deal of research has demonstrated that although everyday experience is continuous in nature, it is parsed into separate events. The aim of the present study was to examine whether event structure can influence the effectiveness of cognitive control. Across 5 experiments we varied the structure of events within the AX-CPT by shifting the spatial location of cues and probes on a computer screen. When location shifts were present, a pattern of AX-CPT performance consistent with enhanced cognitive control was found. To test whether the location shift effects were caused by the presence of event boundaries per se, other aspects of the AX-CPT were manipulated, such as the color of cues and probes and the inclusion of a distractor task during the cue-probe delay. Changes in cognitive control were not found under these conditions, suggesting that the location shift effects were specifically related to the formation of separate event models. Together, these results can be accounted for by the Event Horizon Model and a representation-based theory of cognitive control, and suggest that cognitive control can be influenced by the surrounding environmental structure. (c) 2015 APA, all rights reserved.

  11. Event Structure and Cognitive Control

    PubMed Central

    Reimer, Jason F.; Radvansky, Gabriel A.; Lorsbach, Thomas C.; Armendarez, Joseph J.

    2017-01-01

    Recently, a great deal of research has demonstrated that although everyday experience is continuous in nature, it is parsed into separate events. The aim of the present study was to examine whether event structure can influence the effectiveness of cognitive control. Across five experiments we varied the structure of events within the AX-CPT by shifting the spatial location of cues and probes on a computer screen. When location shifts were present, a pattern of AX-CPT performance consistent with enhanced cognitive control was found. To test whether the location shift effects were caused by the presence of event boundaries per se, other aspects of the AX-CPT were manipulated, such as the color of cues and probes and the inclusion of a distractor task during the cue-probe delay. Changes in cognitive control were not found under these conditions, suggesting that the location shift effects were specifically related to the formation of separate event models. Together, these results can be accounted for by the Event Horizon Model and a representation-based theory of cognitive control, and suggest that cognitive control can be influenced by the surrounding environmental structure. PMID:25603168

  12. The complex architecture of the 2009 MW 6.1 L'Aquila normal fault system (Central Italy) as imaged by 64,000 high-resolution aftershock locations

    NASA Astrophysics Data System (ADS)

    Valoroso, L.; Chiaraluce, L.; Di Stefano, R.; Piccinini, D.; Schaff, D. P.; Waldhauser, F.

    2011-12-01

    On April 6, 2009, an MW 6.1 normal-faulting earthquake struck the axial area of the Abruzzo region in Central Italy. We present high-precision hypocenter locations of an extraordinary dataset composed of 64,000 earthquakes recorded by a very dense seismic network of 60 stations operating for 9 months after the main event. Events span magnitudes (ML) from -0.9 to 5.9, with a completeness magnitude of 0.7. The dataset has been processed by integrating an accurate automatic picking procedure together with cross-correlation and double-difference relative location methods. The combined use of these procedures results in earthquake relative location uncertainties in the range of a few meters to tens of meters, comparable to or smaller than the spatial dimensions of the earthquakes themselves. This dataset allows us to image the complex inner geometry of individual faults from the kilometre to the meter scale. The aftershock distribution illuminates the anatomy of the en-echelon fault system composed of two major faults. The mainshock breaks the entire upper crust from 10 km depth to the surface along a 14-km-long normal fault. A second segment, located north of the main fault and activated by two Mw>5 events, is completely blind and shows a striking listric geometry. We focus on the analysis of about 300 clusters of co-located events to characterize the mechanical behavior of the different portions of the fault system. The number of events in each cluster ranges from 4 to 24, and the events exhibit strongly correlated seismograms at common stations. They mostly occur where secondary structures join the main fault planes and along unfavorably oriented segments. Moreover, larger clusters nucleate on secondary faults located in the overlapping area between the two main segments, where the rate of earthquake production is very high with a long-lasting seismic decay.

  13. Applications of Location Similarity Measures and Conceptual Spaces to Event Coreference and Classification

    ERIC Educational Resources Information Center

    McConky, Katie Theresa

    2013-01-01

    This work covers topics in event coreference and event classification from spoken conversation. Event coreference is the process of identifying descriptions of the same event across sentences, documents, or structured databases. Existing event coreference work focuses on sentence similarity models or feature based similarity models requiring slot…

  14. Automatic processing of induced events in the geothermal reservoirs Landau and Insheim, Germany

    NASA Astrophysics Data System (ADS)

    Olbert, Kai; Küperkoch, Ludger; Meier, Thomas

    2016-04-01

    Induced events can pose a risk to local infrastructure and need to be understood and evaluated. They also represent an opportunity to learn more about the reservoir behavior and characteristics. Prior to the analysis, the waveform data must be processed consistently and accurately to avoid erroneous interpretations. In the framework of the MAGS2 project, an automatic off-line event detection and a phase onset time determination algorithm are applied to induced seismic events in the geothermal systems in Landau and Insheim, Germany. The off-line detection algorithm works based on a cross-correlation of continuous data taken from the local seismic network with master events. It distinguishes events between different reservoirs and within the individual reservoirs. Furthermore, it provides a location and magnitude estimation. Data from 2007 to 2014 are processed and compared with other detections using the SeisComp3 cross-correlation detector and a STA/LTA detector. The detected events are analyzed for spatial and temporal clustering. Furthermore, the number of events is compared to the existing detection lists. The automatic phase picking algorithm combines an AR-AIC approach with a cost function to find precise P1- and S1-phase onset times which can be used for localization and tomography studies. 800 induced events are processed, determining 5000 P1- and 6000 S1-picks. The phase onset times show a high precision, with mean residuals to manual phase picks of 0 s (P1) to 0.04 s (S1) and standard deviations below ±0.05 s. The resulting automatic picks are used to relocate a selected number of events to evaluate influences on the location precision.
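
    A common building block for this kind of automatic onset estimation is an AIC picker. The sketch below uses the simple variance-based AIC of Maeda (1985) as an illustrative stand-in for the AR-AIC picker with a cost function described in the abstract.

        import numpy as np

        def aic_onset(window):
            # Variance-based AIC of Maeda (1985):
            #   AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:]))
            # The onset estimate is the sample index at the AIC minimum.
            x = np.asarray(window, dtype=float)
            N = len(x)
            aic = np.full(N, np.inf)
            for k in range(2, N - 2):
                v1, v2 = np.var(x[:k]), np.var(x[k:])
                if v1 > 0.0 and v2 > 0.0:
                    aic[k] = k * np.log(v1) + (N - k - 1) * np.log(v2)
            return int(np.argmin(aic))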

  15. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhleh, Luay

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  16. Estimation of anomaly location and size using electrical impedance tomography.

    PubMed

    Kwon, Ohin; Yoon, Jeong Rock; Seo, Jin Keun; Woo, Eung Je; Cho, Young Gu

    2003-01-01

    We developed a new algorithm that estimates locations and sizes of anomalies in electrically conducting medium based on electrical impedance tomography (EIT) technique. When only the boundary current and voltage measurements are available, it is not practically feasible to reconstruct accurate high-resolution cross-sectional conductivity or resistivity images of a subject. In this paper, we focus our attention on the estimation of locations and sizes of anomalies with different conductivity values compared with the background tissues. We showed the performance of the algorithm from experimental results using a 32-channel EIT system and saline phantom. With about 1.73% measurement error in boundary current-voltage data, we found that the minimal size (area) of the detectable anomaly is about 0.72% of the size (area) of the phantom. Potential applications include the monitoring of impedance related physiological events and bubble detection in two-phase flow. Since this new algorithm requires neither any forward solver nor time-consuming minimization process, it is fast enough for various real-time applications in medicine and nondestructive testing.

  17. Intensity, magnitude, location and attenuation in India for felt earthquakes since 1762

    USGS Publications Warehouse

    Szeliga, Walter; Hough, Susan; Martin, Stacey; Bilham, Roger

    2010-01-01

    A comprehensive, consistently interpreted new catalog of felt intensities for India (Martin and Szeliga, 2010, this issue) includes intensities for 570 earthquakes; instrumental magnitudes and locations are available for 100 of these events. We use the intensity values for 29 of the instrumentally recorded events to develop new intensity versus attenuation relations for the Indian subcontinent and the Himalayan region. We then use these relations to determine the locations and magnitudes of 234 historical events, using the method of Bakun and Wentworth (1997). For the remaining 336 events, intensity distributions are too sparse to determine magnitude or location. We evaluate magnitude and location accuracy of newly located events by comparing the instrumental- with the intensity-derived location for 29 calibration events, for which more than 15 intensity observations are available. With few exceptions, most intensity-derived locations lie within a fault length of the instrumentally determined location. For events in which the azimuthal distribution of intensities is limited, we conclude that the formal error bounds from the regression of Bakun and Wentworth (1997) do not reflect the true uncertainties. We also find that the regression underestimates the uncertainties of the location and magnitude of the 1819 Allah Bund earthquake, for which a location has been inferred from mapped surface deformation. Comparing our inferred attenuation relations to those developed for other regions, we find that attenuation for Himalayan events is comparable to intensity attenuation in California (Bakun and Wentworth, 1997), while intensity attenuation for cratonic events is higher than intensity attenuation reported for central/eastern North America (Bakun et al., 2003). Further, we present evidence that intensities of intraplate earthquakes have a nonlinear dependence on magnitude such that attenuation relations based largely on small-to-moderate earthquakes may significantly

  18. Intensity, magnitude, location, and attenuation in India for felt earthquakes since 1762

    USGS Publications Warehouse

    Szeliga, W.; Hough, S.; Martin, S.; Bilham, R.

    2010-01-01

    A comprehensive, consistently interpreted new catalog of felt intensities for India (Martin and Szeliga, 2010, this issue) includes intensities for 570 earthquakes; instrumental magnitudes and locations are available for 100 of these events. We use the intensity values for 29 of the instrumentally recorded events to develop new intensity versus attenuation relations for the Indian subcontinent and the Himalayan region. We then use these relations to determine the locations and magnitudes of 234 historical events, using the method of Bakun and Wentworth (1997). For the remaining 336 events, intensity distributions are too sparse to determine magnitude or location. We evaluate magnitude and location accuracy of newly located events by comparing the instrumental- with the intensity-derived location for 29 calibration events, for which more than 15 intensity observations are available. With few exceptions, most intensity-derived locations lie within a fault length of the instrumentally determined location. For events in which the azimuthal distribution of intensities is limited, we conclude that the formal error bounds from the regression of Bakun and Wentworth (1997) do not reflect the true uncertainties. We also find that the regression underestimates the uncertainties of the location and magnitude of the 1819 Allah Bund earthquake, for which a location has been inferred from mapped surface deformation. Comparing our inferred attenuation relations to those developed for other regions, we find that attenuation for Himalayan events is comparable to intensity attenuation in California (Bakun and Wentworth, 1997), while intensity attenuation for cratonic events is higher than intensity attenuation reported for central/eastern North America (Bakun et al., 2003). Further, we present evidence that intensities of intraplate earthquakes have a nonlinear dependence on magnitude such that attenuation relations based largely on small-to-moderate earthquakes may significantly

  19. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
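
    The detection step can be pictured as a delay-and-stack of station envelopes for each candidate grid point. The sketch below is an illustrative reconstruction that assumes a single constant Rayleigh-wave group velocity rather than the traveltime curves used in the study.

        import numpy as np

        def stack_envelopes(envelopes, dists_km, dt, group_vel_kms=3.9):
            # envelopes:     (n_sta, n_samp) smoothed envelopes on a common time base
            # dists_km:      (n_sta,) distances from one candidate grid point to the stations
            # dt:            sample interval (s)
            # group_vel_kms: assumed Rayleigh group velocity (illustrative constant)
            envelopes = np.asarray(envelopes, float)
            shifts = np.round(np.asarray(dists_km) / group_vel_kms / dt).astype(int)
            n = envelopes.shape[1] - shifts.max()
            stack = np.zeros(n)
            for env, s in zip(envelopes, shifts):
                stack += env[s:s + n]   # align each station on its predicted traveltime
            # stack[i] is the detection function for a candidate origin at sample i;
            # peaks in space and time indicate possible events
            return stack / len(envelopes)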

  20. Reconstructing Spatial Distributions from Anonymized Locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horey, James L; Forrest, Stephanie; Groat, Michael

    2012-01-01

    Devices such as mobile phones, tablets, and sensors are often equipped with GPS that accurately report a person's location. Combined with wireless communication, these devices enable a wide range of new social tools and applications. These same qualities, however, leave location-aware applications vulnerable to privacy violations. This paper introduces the Negative Quad Tree, a privacy protection method for location aware applications. The method is broadly applicable to applications that use spatial density information, such as social applications that measure the popularity of social venues. The method employs a simple anonymization algorithm running on mobile devices, and a more complex reconstructionmore » algorithm on a central server. This strategy is well suited to low-powered mobile devices. The paper analyzes the accuracy of the reconstruction method in a variety of simulated and real-world settings and demonstrates that the method is accurate enough to be used in many real-world scenarios.« less

  1. Event Discrimination Using Seismoacoustic Catalog Probabilities

    NASA Astrophysics Data System (ADS)

    Albert, S.; Arrowsmith, S.; Bowman, D.; Downey, N.; Koch, C.

    2017-12-01

    Presented here are three seismoacoustic catalogs from various years and locations throughout Utah and New Mexico. To create these catalogs, we combine seismic and acoustic events detected and located using different algorithms. Seismoacoustic events are formed based on similarity of origin time and location. Following seismoacoustic fusion, the data is compared against ground truth events. Each catalog contains events originating from both natural and anthropogenic sources. By creating these seismoacoustic catalogs, we show that the fusion of seismic and acoustic data leads to a better understanding of the nature of individual events. The probability of an event being a surface blast given its presence in each seismoacoustic catalog is quantified. We use these probabilities to discriminate between events from natural and anthropogenic sources. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.
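
    Conceptually, the seismoacoustic fusion step pairs events from the two catalogs whose origin times and epicenters agree within tolerances; a toy sketch follows, with tolerances that are illustrative rather than the values used by the authors.

        from math import hypot

        def fuse_catalogs(seismic, acoustic, max_dt=30.0, max_dist_km=10.0):
            # seismic, acoustic: lists of dicts with origin time 't' (s) and epicenter 'x', 'y' (km)
            # max_dt / max_dist_km: association tolerances (illustrative values only)
            fused = []
            for s in seismic:
                for a in acoustic:
                    close_in_time = abs(s["t"] - a["t"]) <= max_dt
                    close_in_space = hypot(s["x"] - a["x"], s["y"] - a["y"]) <= max_dist_km
                    if close_in_time and close_in_space:
                        fused.append((s, a))
            return fused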

  2. Semantic Location Extraction from Crowdsourced Data

    NASA Astrophysics Data System (ADS)

    Koswatte, S.; Mcdougall, K.; Liu, X.

    2016-06-01

    Crowdsourced Data (CSD) has recently received increased attention in many application areas, including disaster management. Convenience of production and use, data currency and abundance are some of the key reasons for attracting this high interest. Conversely, quality issues like incompleteness, credibility and relevancy prevent the direct use of such data in important applications like disaster management. Moreover, the availability of location information in CSD is problematic, as it remains very low in many crowdsourced platforms such as Twitter. Also, this recorded location is mostly related to the mobile device or user location and often does not represent the event location. In CSD, the event location is discussed descriptively in the comments in addition to the recorded location (which is generated by means of the mobile device's GPS or the mobile communication network). This study attempts to semantically extract the CSD location information with the help of an ontological gazetteer and other available resources. 2011 Queensland flood tweets and Ushahidi Crowd Map data were semantically analysed to extract the location information with the support of the Queensland Gazetteer, which is converted to an ontological gazetteer, and a global gazetteer. Some preliminary results show that the use of ontologies and semantics can improve the accuracy of place name identification of CSD and the process of location information extraction.

  3. Solutions for Coding Societal Events

    DTIC Science & Technology

    2016-12-01

    develop a prototype system for civil unrest event extraction, and (3) engineer BBN ACCENT (ACCurate Events from Natural Text) to support broad use by... The work concerns the extraction of a stream of events (e.g. protests, attacks, etc.) from unstructured text (e.g. news, social media). This technical report presents results

  4. Magnetic location of C IV events in the quiet network

    NASA Technical Reports Server (NTRS)

    Porter, Jason G.; Reichmann, Ed J.; Moore, Ronald L.; Harvey, Karen L.

    1986-01-01

    Ultraviolet Spectrograph and Polarimeter (UVSP) observations of C IV intensity in the quiet sun were examined and compared to magnetograms and He I 10830 A spectroheliograms from Kitt Peak National Observatory. The observations were made between 3 and 9 April, 1985. Spatially rastered UVSP intensity measurements were obtained at 11 wavelength positions in the 1548 A line of C IV. It was concluded that the stochastic process whereby convective shuffling of loop footpoints leads to many topically dissipative events in active regions and the larger bipoles treated here continues to operate in regions of fewer, weaker flux loops, but the resulting events above threshold are less frequent.

  5. Surface Properties Associated With Dust Storm Plume's Point-Source Locations In The Border Region Of The US And Mexico

    NASA Astrophysics Data System (ADS)

    Bleiweiss, M. P.; DuBois, D. W.; Flores, M. I.

    2013-12-01

    Dust storms in the border region of the Southwest US and Northern Mexico are a serious problem for air quality (PM10 exceedances), health (Valley Fever is endemic in the region) and transportation (road closures and deadly traffic accidents). In order to better understand the phenomena, we are attempting to identify critical characteristics of dust storm sources so that, possibly, one can perform more accurate predictions of events and, thus, mitigate some of the deleterious effects. Besides the emission mechanisms for dust storm production that are tied to atmospheric dynamics, one must know those locations whose source characteristics can be tied to dust production and, therefore, identify locations where a dust storm is imminent under favorable atmospheric dynamics. During the past 13 years, we have observed, on satellite imagery, more than 500 dust events in the region and are in the process of identifying the source regions for the dust plumes that make up an event. Where satellite imagery exists with high spatial resolution (less than or equal to 250 m), dust 'plumes' appear to be made up of individual and merged plumes that are emitted from a 'point source' (smaller than the resolution of the imagery). In particular, we have observed events from the ASTER sensor, whose spatial resolution is 15 m, as well as Landsat, whose spatial resolution is 30 m. Tying these source locations to surface properties such as NDVI, albedo, and soil properties (percent sand, silt, clay, and gravel; soil moisture; etc.) will identify regions with enhanced capability to produce a dust storm. This, along with atmospheric dynamics, will allow the forecast of dust events. The analysis of 10 events from the period 2004-2013, for which we have identified 1124 individual plumes, will be presented.

  6. Lunar Impact Flash Locations from NASA's Lunar Impact Monitoring Program

    NASA Technical Reports Server (NTRS)

    Moser, D. E.; Suggs, R. M.; Kupferschmidt, L.; Feldman, J.

    2015-01-01

    dependent upon LRO finding a fresh impact crater associated with one of the impact flashes recorded by Earth-based instruments, either the bright event of March 2013 or any other in the database of impact observations. To find the crater, LRO needed an accurate area to search. This Technical Memorandum (TM) describes the geolocation technique developed to accurately determine the impact flash location, and by association, the location of the crater, thought to lie directly beneath the brightest portion of the flash. The workflow and software tools used to geolocate the impact flashes are described in detail, along with sources of error and uncertainty and a case study applying the workflow to the bright impact flash in March 2013. Following the successful geolocation of the March 2013 flash, the technique was applied to all impact flashes detected by the MEO between November 7, 2005, and January 3, 2014.

  7. Awakening the BALROG: BAyesian Location Reconstruction Of GRBs

    NASA Astrophysics Data System (ADS)

    Burgess, J. Michael; Yu, Hoi-Fung; Greiner, Jochen; Mortlock, Daniel J.

    2018-05-01

    The accurate spatial location of gamma-ray bursts (GRBs) is crucial for both accurately characterizing their spectra and follow-up observations by other instruments. The Fermi Gamma-ray Burst Monitor (GBM) has the largest field of view for detecting GRBs as it views the entire unocculted sky, but as a non-imaging instrument it relies on the relative count rates observed in each of its 14 detectors to localize transients. Improving its ability to accurately locate GRBs and other transients is vital to the paradigm of multimessenger astronomy, including the electromagnetic follow-up of gravitational wave signals. Here we present the BAyesian Location Reconstruction Of GRBs (BALROG) method for localizing and characterizing GBM transients. Our approach eliminates the systematics of previous approaches by simultaneously fitting for the location and spectrum of a source. It also correctly incorporates the uncertainties in the location of a transient into the spectral parameters and produces reliable positional uncertainties for both well-localized sources and those for which the GBM data cannot effectively constrain the position. While computationally expensive, BALROG can be implemented to enable quick follow-up of all GBM transient signals. Also, we identify possible response problems that require attention and caution when using standard, public GBM detector response matrices. Finally, we examine the effects of including the uncertainty in location on the spectral parameters of GRB 080916C. We find that spectral parameters change and no extra components are required when these effects are included in contrast to when we use a fixed location. This finding has the potential to alter both the GRB spectral catalogues and the reported spectral composition of some well-known GRBs.

  8. The DPRK event of May 25, 2009: an analysis carried out at INGV from a multidisciplinary perspective

    NASA Astrophysics Data System (ADS)

    Console, R.; Carluccio, R.; Chiappini, M.; Chiappini, S.; D'Ajello Caracciolo, F.; Damiani, K.; Giuntini, A.; Materni, V.; Pignatelli, A.

    2009-12-01

    In the early morning of May 25, 2009, INGV detected an event located in a predefined area of interest that includes the DPRK. This detection triggered the request of raw data from the seismological International Agencies that operate global seismic networks. Around 6:00 UTC of the same morning, the INGV staff started the standard procedures of seismological analysis on the data collected from such Agencies, in order to locate, identify and characterize the event from a national perspective. At the same time, the DPRK Government announced that it had conducted an underground nuclear test in its territory, confirming the suspected explosive nature of the seismic event. The seismological analysis carried out at the INGV included hypocentral location, mb and Ms computation, application of identification criteria developed at the INGV, and estimation of a possible range for the yield. The basic parameters for the event, as obtained at the INGV, are: origin time 2009/05/25 00:54:43.039; latitude 41.286 deg. N +/- 12.214 km; longitude 129.174 deg. E +/- 14.767 km; depth 0 km (fixed by the analyst); mb 4.5 +/- 0.1; Ms 3.2 +/- 0.2. The criteria adopted at the INGV for event screening led us to classify this event as an explosion with high probability. To reach this conclusion, a rigorous statistical method known as “Discriminant Analysis” has been applied. Particular care has been devoted to the comparison with the nuclear test announced by the DPRK on 9 October 2006. The two locations appear very close to each other (within a distance of the order of 10 km), with the respective error ellipses nearly totally overlapping (the error ellipse area of the recent event is smaller due to the better quality and more numerous recordings). A more accurate relative location has been carried out by applying the DDJHD algorithm specifically developed at the INGV for these purposes. In this case the epicentral distance drops to less than 3 km, with an error of 3

  9. BlueDetect: An iBeacon-Enabled Scheme for Accurate and Energy-Efficient Indoor-Outdoor Detection and Seamless Location-Based Service

    PubMed Central

    Zou, Han; Jiang, Hao; Luo, Yiwen; Zhu, Jianjie; Lu, Xiaoxuan; Xie, Lihua

    2016-01-01

    The location and contextual status (indoor or outdoor) is fundamental and critical information for upper-layer applications, such as activity recognition and location-based services (LBS) for individuals. In addition, optimizations of building management systems (BMS), such as the pre-cooling or heating process of the air-conditioning system according to the human traffic entering or exiting a building, can utilize the information as well. The emerging mobile devices, which are equipped with various sensors, have become a feasible and flexible platform to perform indoor-outdoor (IO) detection. However, power-hungry sensors, such as GPS and WiFi, should be used with caution due to the constrained battery storage on mobile devices. We propose BlueDetect: an accurate, fast-response and energy-efficient scheme for IO detection and seamless LBS running on the mobile device based on the emerging low-power iBeacon technology. By leveraging the on-board Bluetooth module and our proposed algorithms, BlueDetect provides a precise IO detection service that can turn on/off on-board power-hungry sensors smartly and automatically, optimize their performance and reduce the power consumption of mobile devices simultaneously. Moreover, seamless positioning and navigation services can be realized by it, especially in a semi-outdoor environment, which cannot be achieved by GPS or an indoor positioning system (IPS) easily. We prototype BlueDetect on Android mobile devices and evaluate its performance comprehensively. The experimental results have validated the superiority of BlueDetect in terms of IO detection accuracy, localization accuracy and energy consumption.

  10. Automatic picker of P & S first arrivals and robust event locator

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Polozov, A.; Hofstetter, A.

    2003-12-01

    We report on further development of an automatic all-distance location procedure designed for a regional network. The procedure generalizes the previous "local" (R < 500 km) and "regional" (500 < R < 2000 km) routines and comprises: a) preliminary data processing (filtering and de-spiking), b) phase identification, c) P and S first-arrival picking, d) preliminary location and e) a robust grid-search optimization procedure. Innovations concern phase identification, automatic picking and teleseismic location. A platform-free, flexible Java interface was recently created, allowing easy parameter tuning and on/off switching to the full-scale manual picking mode. Identification of the regional P and S phases is provided by choosing between the two largest peaks in the envelope curve. For automatic onset-time estimation we now utilize the ratio of two STAs, calculated in two consecutive and equal time windows (instead of the previously used Akaike Information Criterion). "Teleseismic" location is split into two stages: a preliminary and a final one. The preliminary part estimates azimuth and apparent velocity by fitting a plane wave to the automatic P pickings. The apparent velocity criterion is used to decide on the strategy for the following computations: teleseismic or regional. The preliminary estimates of azimuth and apparent velocity provide starting values for the final teleseismic and regional location. Apparent velocity is used to get a first-approximation distance to the source on the basis of the P, Pn, Pg travel-time tables. The distance estimate together with the preliminary azimuth estimate provides first approximations of the source latitude and longitude via sine and cosine theorems formulated for the spherical triangle. Final location is based on a robust grid-search optimization procedure, weighting the number of pickings that simultaneously fit the model travel times. The grid covers the initial location and becomes finer while approaching the true hypocenter. The target function is a sum
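
    The final stage described above, a robust grid search that rewards the number of picks consistent with model travel times, can be sketched as follows; the travel-time function, grid and tolerance are placeholders.

        import numpy as np

        def grid_search_locate(picks, stations, travel_time, grid, tol=1.5):
            # picks:       {station: observed arrival time (s)}
            # stations:    {station: (x, y) coordinates}
            # travel_time: callable(source_xy, station_xy) -> predicted traveltime (s)
            # grid:        iterable of candidate source positions; tol: fit tolerance (s)
            best_node, best_t0, best_fit = None, None, -1
            for node in grid:
                # per-pick origin-time estimates implied by this candidate source
                t0_est = np.array([picks[s] - travel_time(node, stations[s]) for s in picks])
                t0 = np.median(t0_est)                    # robust origin-time estimate
                n_fit = int(np.sum(np.abs(t0_est - t0) <= tol))
                if n_fit > best_fit:
                    best_node, best_t0, best_fit = node, t0, n_fit
            return best_node, best_t0, best_fit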

  11. A single geophone to locate seismic events on Mars

    NASA Astrophysics Data System (ADS)

    Roques, Aurélien; Berenguer, Jean-Luc; Bozdag, Ebru

    2016-04-01

    Knowing the structure of Mars is a key point in understanding the formation of Earth-like planets, as plate tectonics and erosion have erased the Earth's original surface. Installing a seismometer on the Martian surface makes it possible to identify its structure. An important step in identifying the structure of a planet is locating the epicenter of a seismic source, typically a meteoric impact or an earthquake. On Earth, the classical way of locating epicenters is triangulation, which requires at least three stations. The Mars InSight project plans to deploy a single station with three components. We propose software to locate seismic sources on Mars using the three-component simulated data of an earthquake provided by Geoazur (Nice Sophia-Antipolis University, CNRS) researchers. The instrumental response of a sensor is crucial for data interpretation. We study the oscillations of a geophone in several situations so as to awaken students to the meaning of damping in second-order modeling. In physics, car shock absorbers are often used to illustrate the principle of damping, but rarely in practical experiments. We propose the use of a simple seismometer (a string with a mass and a damper) that allows changing several parameters (inductive damping, temperature and pressure) so as to see the effects of these parameters on the impulse response and, in particular, on the damping coefficient. In a second step, we illustrate the effect of damping on a seismogram, and the difficulty of identifying and interpreting the different phase arrival times with low damping.
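
    For a single three-component station, the textbook recipe combines the S-P time (distance) with the P-wave particle-motion azimuth (direction). The sketch below illustrates that idea under an assumed homogeneous velocity model; it is a teaching-level approximation, not the InSight processing chain.

        import numpy as np

        def single_station_estimate(tp, ts, north, east, vp=7.0, vs=4.0):
            # tp, ts:       P and S arrival times (s)
            # north, east:  horizontal samples in a short window around the P arrival
            # vp, vs:       assumed homogeneous velocities (km/s) -- illustrative values
            distance_km = (ts - tp) / (1.0 / vs - 1.0 / vp)   # S-P time -> epicentral distance
            # azimuth of the dominant horizontal P particle motion (principal component);
            # the 180-degree ambiguity is resolved in practice with the vertical component
            cov = np.cov(np.vstack([np.asarray(east, float), np.asarray(north, float)]))
            evals, evecs = np.linalg.eigh(cov)
            e_dom, n_dom = evecs[:, np.argmax(evals)]
            back_azimuth_deg = np.degrees(np.arctan2(e_dom, n_dom)) % 180.0
            return distance_km, back_azimuth_deg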

  12. Modal Acoustic Emission Used at Elevated Temperatures to Detect Damage and Failure Location in Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Morscher, Gregory N.

    1999-01-01

    Ceramic matrix composites are being developed for elevated-temperature engine applications. A leading material system in this class of materials is silicon carbide (SiC) fiber-reinforced SiC matrix composites. Unfortunately, the nonoxide fibers, matrix, and interphase (boron nitride in this system) can react with oxygen or water vapor in the atmosphere, leading to strength degradation of the composite at elevated temperatures. For this study, constant-load stress-rupture tests were performed in air at temperatures ranging from 815 to 960 C until failure. From these data, predictions can be made for the useful life of such composites under similar stressed-oxidation conditions. During these experiments, the sounds of failure events (matrix cracking and fiber breaking) were monitored with a modal acoustic emission (AE) analyzer through transducers that were attached at the ends of the tensile bars. Such failure events, which are caused by applied stress and oxidation reactions, cause these composites to fail prematurely. Because of the nature of acoustic waveform propagation in thin tensile bars, the location of individual source events and the eventual failure event could be detected accurately.

  13. STRIDE: Species Tree Root Inference from Gene Duplication Events.

    PubMed

    Emms, David M; Kelly, Steven

    2017-12-01

    The correct interpretation of any phylogenetic tree is dependent on that tree being correctly rooted. We present STRIDE, a fast, effective, and outgroup-free method for identification of gene duplication events and species tree root inference in large-scale molecular phylogenetic analyses. STRIDE identifies sets of well-supported in-group gene duplication events from a set of unrooted gene trees, and analyses these events to infer a probability distribution over an unrooted species tree for the location of its root. We show that STRIDE correctly identifies the root of the species tree in multiple large-scale molecular phylogenetic data sets spanning a wide range of timescales and taxonomic groups. We demonstrate that the novel probability model implemented in STRIDE can accurately represent the ambiguity in species tree root assignment for data sets where information is limited. Furthermore, application of STRIDE to outgroup-free inference of the origin of the eukaryotic tree resulted in a root probability distribution that provides additional support for leading hypotheses for the origin of the eukaryotes. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  14. Accurately Mapping M31's Microlensing Population

    NASA Astrophysics Data System (ADS)

    Crotts, Arlin

    2004-07-01

    We propose to augment an existing microlensing survey of M31 with source identifications provided by a modest amount of ACS {and WFPC2 parallel} observations to yield an accurate measurement of the masses responsible for microlensing in M31, and presumably much of its dark matter. The main benefit of these data is the determination of the physical {or "einstein"} timescale of each microlensing event, rather than an effective {"FWHM"} timescale, allowing masses to be determined more than twice as accurately as without HST data. The einstein timescale is the ratio of the lensing cross-sectional radius and relative velocities. Velocities are known from kinematics, and the cross-section is directly proportional to the {unknown} lensing mass. We cannot easily measure these quantities without knowing the amplification, hence the baseline magnitude, which requires the resolution of HST to find the source star. This makes a crucial difference because M31 lens mass determinations can be more accurate than those towards the Magellanic Clouds through our Galaxy's halo {for the same number of microlensing events} due to the better constrained geometry in the M31 microlensing situation. Furthermore, our larger survey, just completed, should yield at least 100 M31 microlensing events, more than any Magellanic survey. A small amount of ACS+WFPC2 imaging will deliver the potential of this large database {about 350 nights}. For the whole survey {and a delta-function mass distribution} the mass error should approach only about 15%, or about 6% error in slope for a power-law distribution. These results will better allow us to pinpoint the lens halo fraction, and the shape of the halo lens spatial distribution, and allow generalization/comparison of the nature of halo dark matter in spiral galaxies. In addition, we will be able to establish the baseline magnitude for about 50,000 variable stars, as well as measure an unprecedentedly detailed color-magnitude diagram and luminosity

  15. Location accuracy evaluation of lightning location systems using natural lightning flashes recorded by a network of high-speed cameras

    NASA Astrophysics Data System (ADS)

    Alves, J.; Saraiva, A. C. V.; Campos, L. Z. D. S.; Pinto, O., Jr.; Antunes, L.

    2014-12-01

    This work presents a method for the evaluation of the location accuracy of all Lightning Location Systems (LLS) in operation in southeastern Brazil, using natural cloud-to-ground (CG) lightning flashes. This can be done through a network of multiple high-speed cameras (the RAMMER network) installed in the Paraiba Valley region - SP - Brazil. The RAMMER network (Automated Multi-camera Network for Monitoring and Study of Lightning) is composed of four high-speed cameras operating at 2,500 frames per second. Three stationary black-and-white (B&W) cameras were situated in the cities of São José dos Campos and Caçapava. A fourth color camera was mobile (installed in a car), but operated in a fixed location during the observation period, within the city of São José dos Campos. The average distance among cameras was 13 kilometers. Each RAMMER sensor position was determined so that the network can observe the same lightning flash from different angles, and all recorded videos were GPS (Global Positioning System) time stamped, allowing comparisons of events between cameras and the LLS. The RAMMER sensor is basically composed of a computer, a Phantom high-speed camera version 9.1 and a GPS unit. The lightning cases analyzed in the present work were observed by at least two cameras, their position was visually triangulated and the results compared with the BrasilDAT network, during the summer seasons of 2011/2012 and 2012/2013. The visual triangulation method is presented in detail. The calibration procedure showed an accuracy of 9 meters between the accurate GPS position of the object triangulated and the result from the visual triangulation method. Lightning return stroke positions, estimated with the visual triangulation method, were compared with LLS locations. Differences between solutions were not greater than 1.8 km.
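
    In a local map frame, visual triangulation from two camera sites reduces to intersecting two azimuth bearings; a minimal geometric sketch (camera positions and azimuths are whatever the calibration provides):

        import numpy as np

        def intersect_bearings(p1, az1_deg, p2, az2_deg):
            # p1, p2:           camera positions (x east, y north) in a local frame
            # az1_deg, az2_deg: observed azimuths of the lightning channel, clockwise from north
            d1 = np.array([np.sin(np.radians(az1_deg)), np.cos(np.radians(az1_deg))])
            d2 = np.array([np.sin(np.radians(az2_deg)), np.cos(np.radians(az2_deg))])
            A = np.column_stack([d1, -d2])
            if abs(np.linalg.det(A)) < 1e-9:
                return None            # bearings (nearly) parallel: no usable intersection
            t, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
            return np.asarray(p1, float) + t * d1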

  16. Lightning Location Using Acoustic Signals

    NASA Astrophysics Data System (ADS)

    Badillo, E.; Arechiga, R. O.; Thomas, R. J.

    2013-05-01

    In the summers of 2011 and 2012 a network of acoustic arrays was deployed in the Magdalena mountains of central New Mexico to locate lightning flashes. A Times-Correlation (TC) ray-tracing-based technique was developed in order to obtain the location of lightning flashes near the network. The TC technique locates acoustic sources from lightning. It was developed to complement the lightning location of RF sources detected by the Lightning Mapping Array (LMA) developed at Langmuir Laboratory at New Mexico Tech. The network consisted of four arrays with four microphones each. The microphones on each array were placed in a triangular configuration with one of the microphones in the center of the array. The distance between the central microphone and the rest of them was about 30 m. The distance between centers of the arrays ranged from 500 m to 1500 m. The TC technique uses times of arrival (TOA) of acoustic waves to trace back the location of thunder sources. In order to obtain the times of arrival, the signals were filtered in a frequency band of 2 to 20 hertz and cross-correlated. Once the times of arrival were obtained, the Levenberg-Marquardt algorithm was applied to locate the spatial coordinates (x, y, and z) of thunder sources. Two techniques were used and contrasted to compute the accuracy of the TC method: Nearest-Neighbors (NN), between acoustic and LMA located sources, and the standard deviation from the curvature matrix of the system as a measure of dispersion of the results. For the best-case scenario, a triggered lightning event, the TC method, applied with four microphones, located sources with a median error of 152 m and 142.9 m using nearest-neighbors and standard deviation, respectively. [Figure caption: results of the TC method for the lightning event recorded at 18:47:35 UTC, August 6, 2012; black dots represent the computed locations and light-colored dots the LMA data for the same event, obtained with the MGTM station (four channels).]
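
    The TOA inversion step can be sketched as a Levenberg-Marquardt solve for source position and origin time, as below; a constant sound speed is assumed here, whereas the study traces rays through a realistic atmosphere.

        import numpy as np
        from scipy.optimize import least_squares

        def locate_thunder_source(mic_xyz, toa, c=343.0):
            # mic_xyz: (n, 3) microphone coordinates (m); toa: (n,) arrival times (s)
            # c: assumed constant sound speed (m/s) -- the study instead uses ray tracing
            mic_xyz = np.asarray(mic_xyz, float)
            toa = np.asarray(toa, float)

            def residuals(m):
                # m = (x, y, z, t0): predicted minus observed arrival times
                return m[3] + np.linalg.norm(mic_xyz - m[:3], axis=1) / c - toa

            # crude starting point: above the network center, shortly before the first arrival
            x0 = np.r_[mic_xyz.mean(axis=0) + [0.0, 0.0, 3000.0], toa.min() - 10.0]
            result = least_squares(residuals, x0, method="lm")   # Levenberg-Marquardt
            return result.x                                      # source x, y, z and origin time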

  17. High-precision source location of the 1978 November 19 gamma-ray burst

    NASA Technical Reports Server (NTRS)

    Cline, T. L.; Desai, U. D.; Teegarden, B. J.; Pizzichini, G.; Evans, W. D.; Klebesadel, R. W.; Laros, J. G.; Barat, C.; Hurley, K.; Niel, M.

    1981-01-01

    The celestial source location of the November 19, 1978, intense gamma ray burst has been determined from data obtained with the interplanetary gamma-ray sensor network by means of long-baseline wavefront timing. Each of the instruments was designed for studying events with observable spectra above approximately 100 keV, and each provides accurate event profile timing in the several-millisecond range. The data analysis includes the following: the triangulated region is centered at (alpha, delta) 1950 = (1h16m32s, -28 deg 53 arcmin), at -84 deg galactic latitude, where the star density is very low and the obscuration negligible. The gamma-ray burst source region, consistent with that of a highly polarized radio source described by Hjellming and Ewald (1981), may assist in the source modeling and may facilitate the understanding of the source process. A marginally identifiable X-ray source was also found by an Einstein Observatory investigation. It is concluded that the burst contains redshifted positron annihilation and nuclear first-excited iron lines, which is consistent with a neutron star origin.
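
    Long-baseline wavefront timing constrains the source to an annulus on the sky: the arrival-time difference between two widely separated spacecraft fixes the angle between the source direction and the baseline through cos(theta) = c*dt/|B|. The sketch below evaluates that relation for a made-up pair of spacecraft positions and a made-up delay; it illustrates the principle, not the network's actual processing.

        import numpy as np

        C = 299_792.458  # speed of light, km/s

        def annulus_half_angle(r1_km, r2_km, dt_s):
            """Angle (deg) between the burst direction and the baseline r2 - r1.

            dt_s = t1 - t2 is the wavefront arrival-time difference; it is
            positive when the burst reaches spacecraft 2 first.
            """
            baseline = np.asarray(r2_km, float) - np.asarray(r1_km, float)
            cos_theta = C * dt_s / np.linalg.norm(baseline)
            return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

        # Hypothetical pair: a near-Earth sensor and a probe ~1 au away along +x,
        # with the wavefront reaching the distant probe 200 s earlier.
        print(annulus_half_angle([0, 0, 0], [1.496e8, 0, 0], 200.0))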

  18. The 2008 Wells, Nevada Earthquake Sequence: Application of Subspace Detection and Multiple Event Relocation Techniques

    NASA Astrophysics Data System (ADS)

    Nealy, J. L.; Benz, H.; Hayes, G. P.; Bergman, E.; Barnhart, W. D.

    2016-12-01

    On February 21, 2008 at 14:16:02 (UTC), Wells, Nevada experienced a Mw 6.0 earthquake, the largest earthquake in the state within the past 50 years. Here, we re-analyze in detail the spatiotemporal variations of the foreshock and aftershock sequence and compare the distribution of seismicity to a recent slip model based on inversion of InSAR observations. A catalog of earthquakes for the time period of February 1, 2008 through August 31, 2008 was derived from a combination of arrival time picks using a kurtosis detector (primarily P arrival times), subspace detector (primarily S arrival times), associating the combined pick dataset, and applying multiple event relocation techniques using the 19 closest USArray Transportable Array stations, permanent regional seismic monitoring stations in Nevada and Utah, and temporary stations deployed for an aftershock study. We were able to detect several thousand earthquakes in the months following the mainshock as well as several foreshocks in the days leading up to the event. We reviewed the picks for the largest 986 earthquakes and relocated them using the Hypocentroidal Decomposition (HD) method. The HD technique provides both relative locations for the individual earthquakes and an absolute location for the earthquake cluster, resulting in absolute locations of the events in the cluster having minimal bias from unknown Earth structure. A subset of these "calibrated" earthquake locations that spanned the duration of the sequence and had small uncertainties in location were used as prior constraints within a second relocation effort using the entire dataset and the Bayesloc approach. Accurate locations (to within 2 km) were obtained using Bayesloc for 1,952 of the 2,157 events associated over the seven-month period of the study. The final catalog of earthquake hypocenters indicates that the aftershocks extend for about 20 km along the strike of the ruptured fault. The aftershocks occur primarily updip and along the
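
    A kurtosis detector flags impulsive onsets because the running kurtosis of a seismogram jumps when a spiky arrival enters the analysis window. The sketch below is a generic version of such a characteristic-function picker, not the specific detector used in this study; the window length, test waveform, and picking rule (maximum rise of the characteristic function) are illustrative choices.

        import numpy as np
        from scipy.stats import kurtosis

        def kurtosis_pick(trace, fs, win_s=1.0):
            """Pick an impulsive onset from a running-kurtosis characteristic function.

            trace : 1-D waveform samples
            fs    : sampling rate (Hz)
            win_s : sliding-window length (s)
            Returns the sample index where the characteristic function rises fastest.
            """
            n = int(win_s * fs)
            cf = np.zeros(len(trace))
            for i in range(n, len(trace)):
                cf[i] = kurtosis(trace[i - n:i])
            return int(np.argmax(np.diff(cf)))

        # Hypothetical test trace: noise with a decaying 8 Hz "arrival" at 5.0 s
        fs = 100.0
        rng = np.random.default_rng(1)
        t = np.arange(0, 10, 1 / fs)
        sig = 0.1 * rng.standard_normal(t.size)
        onset = int(5.0 * fs)
        sig[onset:] += np.exp(-2 * (t[onset:] - 5.0)) * np.sin(2 * np.pi * 8 * (t[onset:] - 5.0))
        print(kurtosis_pick(sig, fs) / fs, "s (true onset at 5.0 s)")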

  19. Seismic and Aseismic Behavior of the Altotiberina Low-angle Normal Fault System (Northern Apennines, Italy) through High-resolution Earthquake Locations and Repeating Events

    NASA Astrophysics Data System (ADS)

    Valoroso, L.; Chiaraluce, L.

    2017-12-01

    Low-angle normal faults (dip < 30°) are geologically widely documented and considered responsible for accommodating the crustal extension within the brittle crust, although their mechanical behavior and seismogenic potential are enigmatic. We study the anatomy and slip behavior of the actively slipping Altotiberina low-angle (ATF) normal fault system using a high-resolution 5-year-long (2010-2014) earthquake catalogue composed of 37k events (ML<3.9 and completeness magnitude MC=0.5), recorded by a dense permanent seismic network of the Altotiberina Near Fault Observatory (TABOO). The seismic activity defines the fault system dominated at depth by the low-angle ATF surface (15-20°), coinciding with the ATF geometry imaged through seismic reflection data. The ATF extends for 50 km along-strike and between 4-5 to 16 km of depth. Seismicity also images the geometry of a set of higher angle faults (35-50°) located in the ATF hanging-wall (HW). The ATF-related seismicity accounts for 10% of the whole seismicity (3,700 events with ML<2.4), occurring at a remarkably constant rate of 2.2 events/day. This seismicity describes a roughly 1.5-km-thick fault zone composed of multiple sub-parallel slipping planes. The remaining events are instead organized in multiple-mainshock (MW>3) seismic sequences lasting from weeks to months, activating a contiguous network of 3-5-km-long syn- and antithetic fault segments within the ATF-HW. The space-time evolution of these minor sequences is consistent with subsequent failures promoted by fluid flow. The ATF-seismicity pattern includes 97 clusters of repeating events (RE) made of 299 events with ML<1.9. RE are located around locked patches identified by geodetic modeling, suggesting a mixed-mode (stick-slip and stable-sliding) slip behavior along the fault plane in accommodating most of the NE-trending tectonic deformation, with creeping dominating below 5 km depth. Consistently, the seismic moment released by the ATF-seismicity accounts

  20. Low frequency events on Montserrat

    NASA Astrophysics Data System (ADS)

    Visser, K.; Neuberg, J.

    2003-04-01

    Earthquake swarms observed on volcanoes consist generally of low frequency events. The low frequency content of these events indicates the presence of interface waves at the boundary of the magma filled conduit and the surrounding country rock. The observed seismic signal at the surface shows therefore a complicated interference pattern of waves originating at various parts of the magma filled conduit, interacting with the free surface and interfaces in the volcanic edifice. This research investigates the applicability of conventional seismic tools on these low frequency events, focusing on hypocenter location analysis using arrival times and particle motion analysis for the Soufrière Hills Volcano on Montserrat. Both single low frequency events and swarms are observed on this volcano. Synthetic low frequency events are used for comparison. Results show that reliable hypocenter locations and particle motions can only be obtained if the low frequency events are single events with an identifiable P wave onset, for example the single events preceding swarms on Montserrat or the first low frequency event of a swarm. Consecutive events of the same swarm are dominated by interface waves which are converted at the top of the conduit into weak secondary P waves and surface waves. Conventional seismic tools fail to correctly analyse these events.

  1. Heat Retreat Locations in Cities - The Survey-Based Location Analysis of Heat Relief

    NASA Astrophysics Data System (ADS)

    Neht, Alice; Maximini, Claudia; Prenger-Berninghoff, Kathrin

    2017-12-01

    The adaptation of cities to climate change effects is one of the major strategies in urban planning to meet the challenges of climate change (IPCC 2014). One of the fields of climate change adaptation is dealing with heat events that occur more frequently and with greater intensity. Cities in particular are vulnerable to these events due to high population and infrastructure density. Ongoing urbanization calls for the existence of sufficient heat retreat locations (HRL) to enable relief for the population from heat in summer. This is why an extensive analysis of HRL is needed. This paper aims at the development of a survey-based location analysis of heat relief by identifying user groups, locations and characteristics of HRL based on a home survey that was conducted in three German cities. Key results of the study show that the majority of the participants of the survey are users of existing HRL, are affected by heat, and perceive heat as a burden in summer. Moreover, HRL that are located in close proximity are preferred by most users, while their effect depends on the regional context that has to be considered in the analysis. Hence, this research presents an approach to heat relief that underlines the importance of HRL in cities by referring to selected examples of HRL types in densely populated areas of cities. HRL should especially be established and secured in densely built-up areas of cities. According to results of the survey, most HRL are located in public spaces, and the overall accessibility of HRL turned out to be an issue.

  2. TrigDB back-filling method in EEW for the regional earthquake for reducing false location of the deep focus earthquake event by considering neighborhood triggers and forced association.

    NASA Astrophysics Data System (ADS)

    Park, J. H.; Chi, H. C.; Lim, I. S.; Seong, Y. J.; Pak, J.

    2017-12-01

    During the first phase of EEW (Earthquake Early Warning) service to the public by KMA (Korea Meteorological Administration), beginning in 2015 in Korea, KIGAM (Korea Institute of Geoscience and Mineral Resources) adopted ElarmS2 of UC Berkeley BSL and modified the local magnitude relation, travel-time curves, and association procedures, adding the so-called TrigDB back-filling method. The TrigDB back-filling method uses a database of sorted lists of stations based on epicentral distances of pre-defined events located on a grid of 1,401 × 1,601 = 2,243,001 points around the Korean Peninsula at a grid spacing of 0.05 degrees. When the version of an event is updated, the TrigDB back-filling method is invoked. First, the grid point closest to the epicenter of an event is chosen from the database and candidate stations, which are stations corresponding to the chosen grid point and also adjacent to the already-associated stations, are selected. Second, the directions from the chosen grid point to the associated stations are averaged to represent the direction of wave propagation, which is used as a reference for computing apparent travel times. The apparent travel times for the associated stations are computed using a P wave velocity of 5.5 km/s from the grid point to the projected points in the reference direction. The travel times for the triggered candidate stations are also computed and used to obtain the difference between the apparent travel times of the associated stations and the triggered candidates. Finally, if the difference in the apparent travel times is less than that of the arrival times, the method forces the triggered candidate station to be associated with the event and updates the event location. This method is useful for reducing false locations of events that could be generated by deep (> 500 km) and regional-distance earthquakes occurring on the subducting Pacific plate boundaries. In a case-study comparison between the system with TrigDB back-filling applied and the others, we could get
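
    The decision rule at the heart of the back-filling step can be summarized as: project the associated and candidate stations onto the mean propagation direction from the chosen grid point, convert the projected distances to apparent travel times at 5.5 km/s, and force-associate the candidate when the apparent travel-time difference is smaller than the observed arrival-time difference. The sketch below is a simplified flat-geometry reading of that description, not KIGAM's implementation; the station coordinates, times, and the averaging of per-station differences are assumptions.

        import numpy as np

        VP = 5.5  # km/s, P-wave speed used by the method

        def should_force_associate(grid_xy, assoc_xy, assoc_t, cand_xy, cand_t):
            """Decide whether a triggered candidate station should be force-associated.

            grid_xy  : (x, y) of the grid point nearest the current epicenter (km)
            assoc_xy : (N, 2) positions of already-associated stations (km)
            assoc_t  : (N,) their picked arrival times (s)
            cand_xy  : (2,) candidate station position (km)
            cand_t   : candidate trigger time (s)
            """
            grid = np.asarray(grid_xy, float)
            assoc = np.asarray(assoc_xy, float)
            # Mean direction from the grid point to the associated stations
            vecs = assoc - grid
            ref = vecs.mean(axis=0)
            ref /= np.linalg.norm(ref)
            # Apparent travel times: station offsets projected onto the reference
            # direction, divided by the assumed P velocity.
            tt_assoc = (vecs @ ref) / VP
            tt_cand = (np.asarray(cand_xy, float) - grid) @ ref / VP
            # Compare apparent travel-time differences with observed arrival-time
            # differences, averaged over the associated stations (a simplification).
            dt_apparent = np.abs(tt_cand - tt_assoc).mean()
            dt_observed = np.abs(cand_t - np.asarray(assoc_t, float)).mean()
            return dt_apparent < dt_observed

        # Hypothetical example: two associated stations east of the grid point and a
        # candidate trigger slightly farther out along the same direction.
        print(should_force_associate((0, 0), [(40, 5), (55, -4)], [7.3, 10.0],
                                     (80, 2), 14.6))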

  3. Manually locating physical and virtual reality objects.

    PubMed

    Chen, Karen B; Kimmel, Ryan A; Bartholomew, Aaron; Ponto, Kevin; Gleicher, Michael L; Radwin, Robert G

    2014-09-01

    In this study, we compared how users locate physical and equivalent three-dimensional images of virtual objects in a cave automatic virtual environment (CAVE) using the hand to examine how human performance (accuracy, time, and approach) is affected by object size, location, and distance. Virtual reality (VR) offers the promise of flexibly simulating arbitrary environments for studying human performance. Previously, VR researchers primarily considered differences between virtual and physical distance estimation rather than reaching for close-up objects. Fourteen participants completed manual targeting tasks that involved reaching for corners on equivalent physical and virtual boxes of three different sizes. Predicted errors were calculated from a geometric model based on user interpupillary distance, eye location, distance from the eyes to the projector screen, and object. Users were 1.64 times less accurate (p < .001) and spent 1.49 times more time (p = .01) targeting virtual versus physical box corners using the hands. Predicted virtual targeting errors were on average 1.53 times (p < .05) greater than the observed errors for farther virtual targets but not significantly different for close-up virtual targets. Target size, location, and distance, in addition to binocular disparity, affected virtual object targeting inaccuracy. Observed virtual box inaccuracy was less than predicted for farther locations, suggesting possible influence of cues other than binocular vision. Human physical interaction with objects in VR for simulation, training, and prototyping involving reaching and manually handling virtual objects in a CAVE is more accurate than predicted when locating farther objects.

  4. Multi-spacecraft observations and transport simulations of solar energetic particles for the May 17th 2012 event

    NASA Astrophysics Data System (ADS)

    Battarbee, M.; Guo, J.; Dalla, S.; Wimmer-Schweingruber, R.; Swalwell, B.; Lawrence, D. J.

    2018-05-01

    Context. The injection, propagation and arrival of solar energetic particles (SEPs) during eruptive solar events are an important and current research topic of heliospheric physics. During the largest solar events, particles may have energies up to a few GeV and sometimes even trigger ground-level enhancements (GLEs) at Earth. These large SEP events are best investigated through multi-spacecraft observations. Aims: We aim to study the first GLE event of solar cycle 24, from 17th May 2012, using data from multiple spacecraft (SOHO, GOES, MSL, STEREO-A, STEREO-B and MESSENGER). These spacecraft are located throughout the inner heliosphere, at heliocentric distances between 0.34 and 1.5 astronomical units (au), covering nearly the whole range of heliospheric longitudes. Methods: We present and investigate sub-GeV proton time profiles for the event at several energy channels, obtained via different instruments aboard the above spacecraft. We investigated issues caused by magnetic connectivity, and present results of three-dimensional SEP propagation simulations. We gathered virtual time profiles and performed qualitative and quantitative comparisons with observations, assessing longitudinal injection and transport effects as well as peak intensities. Results: We distinguish different time profile shapes for well-connected and weakly connected observers, and find our onset time analysis to agree with this distinction. At select observers, we identify an additional low-energy component of Energetic Storm Particles (ESPs). Using well-connected observers for normalisation, our simulations are able to accurately recreate both time profile shapes and peak intensities at multiple observer locations. Conclusions: This synergetic approach combining numerical modelling with multi-spacecraft observations is crucial for understanding the propagation of SEPs within the interplanetary magnetic field. Our novel analysis provides valuable proof of the ability to simulate SEP propagation

  5. Development of a Method to Obtain More Accurate General and Oral Health Related Information Retrospectively

    PubMed Central

    A, Golkari; A, Sabokseir; D, Blane; A, Sheiham; RG, Watt

    2017-01-01

    Statement of Problem: Early childhood is a crucial period of life as it affects one’s future health. However, precise data on adverse events during this period is usually hard to access or collect, especially in developing countries. Objectives: This paper first reviews the existing methods for retrospective data collection in health and social sciences, and then introduces a new method/tool for obtaining more accurate general and oral health related information from early childhood retrospectively. Materials and Methods: The Early Childhood Events Life-Grid (ECEL) was developed to collect information on the type and time of health-related adverse events during the early years of life, by questioning the parents. The validity of ECEL and the accuracy of information obtained by this method were assessed in a pilot study and in a main study of 30 parents of 8 to 11 year old children from Shiraz (Iran). Responses obtained from parents using the final ECEL were compared with the recorded health insurance documents. Results: There was an almost perfect agreement between the health insurance and ECEL data sets (Kappa value=0.95 and p < 0.001). Interviewees remembered the important events more accurately (100% exact timing match in case of hospitalization). Conclusions: The Early Childhood Events Life-Grid method proved to be highly accurate when compared with recorded medical documents. PMID:28959773
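
    The reported agreement statistic is Cohen's kappa, which compares the observed agreement between the two data sets with the agreement expected by chance. The sketch below shows how such a kappa value is computed from two parallel yes/no codings; the example data are made up and are not the study's records.

        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            """Cohen's kappa for two equal-length sequences of categorical codes."""
            assert len(rater_a) == len(rater_b)
            n = len(rater_a)
            observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            pa, pb = Counter(rater_a), Counter(rater_b)
            expected = sum(pa[c] * pb[c] for c in set(pa) | set(pb)) / n**2
            return (observed - expected) / (1 - expected)

        # Hypothetical parent-recall vs. insurance-record codings of an adverse event
        ecel      = ["yes", "yes", "no", "no", "yes", "no", "no", "yes"]
        insurance = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes"]
        print(round(cohens_kappa(ecel, insurance), 2))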

  6. Human Rights Event Detection from Heterogeneous Social Media Graphs.

    PubMed

    Chen, Feng; Neill, Daniel B

    2015-03-01

    Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions.

  7. Multi-ball and one-ball geolocation and location verification

    NASA Astrophysics Data System (ADS)

    Nelson, D. J.; Townsend, J. L.

    2017-05-01

    We present analysis methods that may be used to geolocate emitters using one or more moving receivers. While some of the methods we present may apply to a broader class of signals, our primary interest is locating and tracking ships from short pulsed transmissions, such as the maritime Automatic Identification System (AIS). The AIS signal is difficult to process and track since the pulse duration is only 25 milliseconds, and the pulses may only be transmitted every six to ten seconds. Several fundamental problems are addressed, including demodulation of AIS/GMSK signals, verification of the emitter location, accurate frequency and delay estimation, and identification of pulse trains from the same emitter. In particular, we present several new correlation methods, including cross-cross correlation that greatly improves correlation accuracy over conventional methods, and cross-TDOA and cross-FDOA functions that make it possible to estimate time and frequency delay without the need to compute a two-dimensional cross-ambiguity surface. By isolating pulses from the same emitter and accurately tracking the received signal frequency, we are able to accurately estimate the emitter location from the received Doppler characteristics.
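
    A basic building block of such processing is estimating the differential delay between two receivers from the lag that maximizes their cross-correlation. The sketch below illustrates that plain TDOA estimate (not the paper's cross-cross correlation or cross-TDOA/cross-FDOA functions); the sampling rate, burst length, and delay are hypothetical.

        import numpy as np

        def tdoa_by_xcorr(sig1, sig2, fs):
            """Estimate the time-difference-of-arrival (s) of sig2 relative to sig1
            from the lag that maximizes their full cross-correlation."""
            corr = np.correlate(sig2, sig1, mode="full")
            lag = np.argmax(corr) - (len(sig1) - 1)
            return lag / fs

        # Hypothetical AIS-like burst received at two platforms with a ~0.4 ms
        # differential delay (applied here as an integer-sample shift).
        fs = 48_000.0
        rng = np.random.default_rng(0)
        pulse = rng.standard_normal(int(0.025 * fs))   # 25 ms burst
        delay = int(round(0.0004 * fs))
        rx1 = np.concatenate([pulse, np.zeros(200)]) \
              + 0.05 * rng.standard_normal(pulse.size + 200)
        rx2 = np.concatenate([np.zeros(delay), pulse, np.zeros(200 - delay)]) \
              + 0.05 * rng.standard_normal(pulse.size + 200)
        print(tdoa_by_xcorr(rx1, rx2, fs) * 1e3, "ms")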

  8. Seismicity in Pennsylvania: Evidence for Anthropogenic Events?

    NASA Astrophysics Data System (ADS)

    Homman, K.; Nyblade, A.

    2015-12-01

    The deployment and operation of the USArray Transportable Array (TA) and the PASEIS (XY) seismic networks in Pennsylvania during 2013 and 2014 provide a unique opportunity for investigating the seismicity of Pennsylvania. These networks, along with several permanent stations in Pennsylvania, resulted in a total of 104 seismometers in and around Pennsylvania that have been used in this study. Event locations were first obtained with Antelope Environmental Monitoring Software using P-wave arrival times. Arrival times were hand picked using a 1-5 Hz bandpass filter to within 0.1 seconds. Events were then relocated using a velocity model developed for Pennsylvania and the HYPOELLIPSE location code. In this study, 1,593 seismic events occurred between February 2013 and December 2014 in Pennsylvania. These events ranged between magnitude (ML) 1.04 and 2.89 with an average ML of 1.90. Locations of the events occur across the state in many areas where no seismicity has been previously reported. Preliminary results indicate that most of these events are related to mining activity. Additional work using cross-correlation techniques is underway to examine a number of event clusters for evidence of hydraulic fracturing or wastewater injection sources.

  9. Search for gamma-ray events in the BATSE data base

    NASA Technical Reports Server (NTRS)

    Lewin, Walter

    1994-01-01

    We find large location errors and error radii in the locations of channel 1 Cygnus X-1 events. These errors and their associated uncertainties are a result of low signal-to-noise ratios (a few sigma) in the two brightest detectors for each event. The untriggered events suffer from similarly low signal-to-noise ratios, and their location errors are expected to be at least as large as those found for Cygnus X-1 with a given signal-to-noise ratio. The statistical error radii are consistent with those found for Cygnus X-1 and with the published estimates. We therefore expect approximately 20 - 30 deg location errors for the untriggered events. Hence, many of the untriggered events occurring within a few months of the triggered activity from SGR 1900 plus 14 are indeed consistent with the SGR source location, although Cygnus X-1 is also a good candidate.

  10. Challenges in Understanding and Predicting Greenland Lake Drainage Events

    NASA Astrophysics Data System (ADS)

    Poinar, K.; Andrews, L. C.; Moon, T. A.; Nowicki, S.

    2017-12-01

    To accurately predict ice flow, an ice-sheet model must resolve the complex spatio-temporal variability of the ice-sheet hydrologic system. For Greenland, this requires understanding rapid lake drainage events, by which moulins deliver water from supraglacial lakes to the ice-sheet base. Critical metrics include the drainage event location and its timing during the melt season. Here, we use multiple remote sensing datasets to investigate whether local principal strain rates control the dates of rapid supraglacial lake drainage events. We identify 359 rapid lake drainage events through a semi-automated analysis of MODIS and Landsat imagery, which we apply to Pâkitsoq, western Greenland, over nine summers (2006-2010 and 2013-2016). We compare these drainage dates to principal strain rates derived from InSAR (MEaSUREs and other products) and Landsat (GoLIVE and other products) satellite data over the same years. The InSAR-derived strain rates have lower uncertainties (~0.01 yr-1) but capture only a wintertime average; the Landsat-derived strain rates have larger uncertainties (~0.1 yr-1) but feature higher temporal resolution (≥16 days) and span the entire year, including the melt season. We find that locations with more-tensile wintertime strain rates are associated with earlier draining of supraglacial lakes in the subsequent summer. This is consistent with observations of lake drainage "clusters" or "cascades", where the perturbation from an initial lake drainage event is thought to trigger other lake drainages in the area. Our relation is not statistically significant, however, and any causality is complicated by a stronger correlation with more traditional metrics such as surface elevation and cumulative melt days. We also find that the Landsat-derived summertime strain rates, despite their higher temporal resolution, do not resolve the transient extensional strain rates known from GPS observations to accompany and/or incite rapid lake drainages. Our results

  11. Does the Nature of the Experience Influence Suggestibility? A Study of Children's Event Memory.

    ERIC Educational Resources Information Center

    Gobbo, Camilla; Mega, Carolina; Pipe, Margaret-Ellen

    2002-01-01

    Two experiments examined effects of event modality on young children's memory and suggestibility. Findings indicated that 5-year-olds were more accurate than 3-year-olds and those participating in the event were more accurate than those either observing or listening to a narrative. Assessment method, level of event learning, delay to testing, and…

  12. Enhancements to the Bayesian Infrasound Source Location Method

    DTIC Science & Technology

    2012-09-01

    We report on R&D that is enabling enhancements to the Bayesian Infrasound Source Location (BISL) method for infrasound event location. (Authors: Marcillo, Omar E.; Arrowsmith, Stephen J.; Whitaker, Rod W.; Anderson, Dale N.)

  13. Assessment of User Home Location Geoinference Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Joshua J.; Bell, Eric B.; Corley, Courtney D.

    2015-05-29

    This study presents an assessment of multiple approaches to determine the home and/or other important locations to a Twitter user. In this study, we present a unique approach to the problem of geotagged data sparsity in social media when performing geoinferencing tasks. Given the sparsity of explicitly geotagged Twitter data, the ability to perform accurate and reliable user geolocation from a limited number of geotagged posts has proven to be quite useful. In our survey, we have achieved accuracy rates of over 86% in matching Twitter user profile locations with their inferred home locations derived from geotagged posts.

  14. Verb Aspect and the Activation of Event Knowledge

    PubMed Central

    Ferretti, Todd R.; Kutas, Marta; McRae, Ken

    2011-01-01

    The authors show that verb aspect influences the activation of event knowledge with 4 novel results. First, common locations of events (e.g., arena) are primed following verbs with imperfective aspect (e.g., was skating) but not verbs with perfect aspect (e.g., had skated). Second, people generate more locative prepositional phrases as completions to sentence fragments with imperfective than those with perfect aspect. Third, the amplitude of the N400 component to location nouns varies as a function of aspect and typicality, being smallest for imperfective sentences with highly expected locations and largest for imperfective sentences with less expected locations. Fourth, the amplitude of a sustained frontal negativity spanning prepositional phrases is larger following perfect than following imperfective aspect. Taken together, these findings suggest a dynamic interplay between event knowledge and the linguistic stream. PMID:17201561

  15. Microseismic Event Grouping Based on PageRank Linkage at the Newberry Volcano Geothermal Site

    NASA Astrophysics Data System (ADS)

    Aguiar, A. C.; Myers, S. C.

    2016-12-01

    The Newberry Volcano DOE FORGE site in Central Oregon has been stimulated two times using high-pressure fluid injection to study the Enhanced Geothermal Systems (EGS) technology. Several hundred microseismic events were generated during the first stimulation in the fall of 2012. Initial locations of this microseismicity do not show well defined subsurface structure in part because event location uncertainties are large (Foulger and Julian, 2013). We focus on this stimulation to explore the spatial and temporal development of microseismicity, which is key to understanding how subsurface stimulation modifies stress, fractures rock, and increases permeability. We use PageRank, Google's initial search algorithm, to determine connectivity within the events (Aguiar and Beroza, 2014) and assess signal-correlation topology for the micro-earthquakes. We then use this information to create signal families and compare these to the spatial and temporal proximity of associated earthquakes. We relocate events within families (identified by PageRank linkage) using the Bayesloc approach (Myers et al., 2007). Preliminary relocations show tight spatial clustering of event families as well as evidence of events relocating to a different cluster than originally reported. We also find that signal similarity (linkage) at several stations, not just one or two, is needed in order to determine that events are in close proximity to one another. We show that indirect linkage of signals using PageRank is a reliable way to increase the number of events that are confidently determined to be similar to one another, which may lead to efficient and effective grouping of earthquakes with similar physical characteristics, such as focal mechanisms and stress drop. Our ultimate goal is to determine whether changes in the state of stress and/or changes in the generation of subsurface fracture networks can be detected using PageRank topology as well as aid in the event relocation to obtain more accurate
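
    The linkage idea can be illustrated with a small correlation graph: events are nodes, edges connect pairs whose waveform correlation exceeds a threshold, and PageRank scores highlight events that are well connected within a signal family. The sketch below uses networkx with made-up correlation values and a hypothetical threshold; it mirrors the spirit of the cited approach rather than the authors' processing chain.

        import networkx as nx

        # Hypothetical pairwise waveform correlation coefficients between events
        correlations = {
            ("ev1", "ev2"): 0.92, ("ev1", "ev3"): 0.88, ("ev2", "ev3"): 0.90,
            ("ev3", "ev4"): 0.45, ("ev4", "ev5"): 0.93, ("ev5", "ev6"): 0.91,
            ("ev4", "ev6"): 0.89,
        }
        THRESHOLD = 0.8  # keep only strongly similar pairs

        G = nx.Graph()
        G.add_edges_from((a, b, {"weight": w})
                         for (a, b), w in correlations.items() if w >= THRESHOLD)

        # PageRank scores indicate how strongly each event is linked into a family;
        # connected components of the thresholded graph give the families themselves.
        scores = nx.pagerank(G, weight="weight")
        families = list(nx.connected_components(G))
        print(sorted(scores, key=scores.get, reverse=True))
        print(families)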

  16. Cartan invariants and event horizon detection

    NASA Astrophysics Data System (ADS)

    Brooks, D.; Chavy-Waddy, P. C.; Coley, A. A.; Forget, A.; Gregoris, D.; MacCallum, M. A. H.; McNutt, D. D.

    2018-04-01

    We show that it is possible to locate the event horizon of a black hole (in arbitrary dimensions) by the zeros of certain Cartan invariants. This approach accounts for the recent results on the detection of stationary horizons using scalar polynomial curvature invariants, and improves upon them since the proposed method is computationally less expensive. As an application, we produce Cartan invariants that locate the event horizons for various exact four-dimensional and five-dimensional stationary, asymptotically flat (or (anti) de Sitter), black hole solutions and compare the Cartan invariants with the corresponding scalar curvature invariants that detect the event horizon.

  17. Detection performance of three different lightning location networks in Beijing area based on accurate fast antenna records

    NASA Astrophysics Data System (ADS)

    Srivastava, A.; Tian, Y.; Wang, D.; Yuan, S.; Chen, Z.; Sun, Z.; Qie, X.

    2016-12-01

    Regional and worldwide lightning location networks have been developed to study lightning physics and to locate lightning strokes. A key issue for any such network is characterizing its performance, which differs with regional geographic conditions and instrumental limitations. To improve a network's performance, it is necessary to establish its ground truth and to quantify its detection efficiency (DE) and location accuracy (LA). A comparative study is presented of the World Wide Lightning Location Network (WWLLN), the ADvanced TOA and Direction system (ADTD), and the Beijing Lightning NETwork (BLNET) in the Beijing area. WWLLN locates cloud-to-ground (CG) strokes and strong inter-cloud (IC) discharges globally without distinguishing between them. ADTD locates CG strokes across China as a regional network. Both of these are long-range detection systems that do not provide focused details of a thunderstorm, whereas BLNET can locate both CG and IC discharges and is focused on thunderstorm detection. Fast-antenna waveforms were checked manually, and the relative DE of the three networks was obtained based on the CG strokes. The relative LA was obtained using flashes matched among the networks, as well as from strikes on an instrumented tower. The relative DE of BLNET is much higher than that of ADTD and WWLLN, which have approximately similar relative DE. The WWLLN and ADTD locations are offset eastward and northward, respectively, from the BLNET locations, and the tower-based LA favors BLNET. The ground truth of WWLLN, ADTD and BLNET was obtained, and the performance of BLNET was found to be much better. This study is helpful for improving the performance of the networks and for providing confidence in LA sufficient to follow the thunderstorm path for the prediction and forecasting of thunderstorm and

  18. Hydro-fractured reservoirs: A study using double-difference location techniques

    NASA Astrophysics Data System (ADS)

    Kahn, Dan Scott

    The mapping of induced seismicity in enhanced geothermal systems presents the best tool available for understanding the resulting hydro-fractured reservoir. In this thesis, two geothermal systems are studied: one in Krafla, Iceland, and the other in Basel, Switzerland. The purpose of the Krafla survey was to determine the relation between water injection into the fault system and the resulting earthquakes and fluid pressure in the subsurface crack system. The epicenters obtained from analyzing the seismic data gave a set of locations that are aligned along the border of a high resistivity zone ~2500 meters below the injection well. Further magneto-telluric/seismic-data correlation was seen in the polarity of the cracks through shear wave splitting. The purpose of the Basel project was to examine the creation of a reservoir by the initial stimulation, using an injection well bored to 5000 meters. This stimulation triggered a M3.4 event, extending the normal range of event sizes commonly incurred in hydro-fractured reservoirs. To monitor the seismic activity, six seismometer sondes were deployed at depths from 317 to 2740 meters below the ground surface. During the seven-day period, over 13,000 events were recorded and approximately 3,300 located. These events were first located by single-difference techniques. Subsequently, after calculating their cross-correlation coefficients, clusters of events were relocated using a double-difference algorithm. The event locations support the existence of a narrow reservoir spreading from the injection well. Analysis of the seismic data indicates that the reservoir grew at a uniform rate punctuated by fluctuations which occurred at times of larger events, which were perhaps caused by sudden changes in pressure. The orientation and size of the main fracture plane was found by determining focal mechanisms and locating events that were similar to the M3.4 event. To address the question of whether smaller quakes are simply larger quakes

  19. Intelligent navigation and accurate positioning of an assist robot in indoor environments

    NASA Astrophysics Data System (ADS)

    Hua, Bin; Rama, Endri; Capi, Genci; Jindai, Mitsuru; Tsuri, Yosuke

    2017-12-01

    Robot navigation and accurate positioning in indoor environments are still challenging tasks, especially in robot applications that assist disabled and/or elderly people in museum and art gallery environments. In this paper, we present a human-like navigation method in which neural networks control a wheelchair robot so that it reaches the goal location safely, by imitating the supervisor's motions, and positions itself in the intended location. In a museum-like environment, the mobile robot starts navigation from various positions and uses a low-cost camera to track the target picture and a laser range finder for safe navigation. Results show that the neural controller with the Conjugate Gradient Backpropagation training algorithm gives a robust response and guides the mobile robot accurately to the goal position.

  20. The Challenges of On-Campus Recruitment Events

    ERIC Educational Resources Information Center

    McCoy, Amy

    2012-01-01

    On-campus admissions events are the secret weapon that colleges and universities use to convince students to apply and enroll. On-campus events vary depending on the size, location, and type of institution; they include campus visitations, open houses, preview days, scholarship events, admitted student events, and summer yield events. These events…

  1. Trust index based fault tolerant multiple event localization algorithm for WSNs.

    PubMed

    Xu, Xianghua; Gao, Xueyong; Wan, Jian; Xiong, Naixue

    2011-01-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated in inverse proportion to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experimental results show that when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy compared with other algorithms.
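
    The trust-index idea can be illustrated without the full likelihood-matrix machinery: weight each node's binary report by its trust index when fusing a decision, then raise the trust of nodes that agreed with the final decision and lower the trust of nodes that disagreed. The sketch below is such a simplified cycle; the update rule, weights, and node data are illustrative assumptions, not the TISNAP algorithm itself.

        def update_trust(trust, reports, truth, delta=0.1):
            """One reporting cycle of a simple trust-index update (illustrative only).

            trust   : dict node -> trust index in [0, 1]
            reports : dict node -> binary alarm reported by that node
            truth   : the decision the sink settled on for that neighborhood
            """
            for node, alarm in reports.items():
                if alarm == truth:
                    trust[node] = min(1.0, trust[node] + delta)   # reward fidelity
                else:
                    trust[node] = max(0.0, trust[node] - delta)   # penalize faults
            return trust

        def trust_weighted_alarm(trust, reports):
            """Fuse binary reports by weighting each node with its trust index."""
            weight = sum(trust[n] for n, alarm in reports.items() if alarm)
            total = sum(trust[n] for n in reports)
            return weight / total > 0.5 if total else False

        # Hypothetical cycle: node C is faulty and disagrees with its neighbors
        trust = {"A": 0.9, "B": 0.8, "C": 0.7}
        reports = {"A": 1, "B": 1, "C": 0}
        decision = trust_weighted_alarm(trust, reports)
        print(decision, update_trust(trust, reports, int(decision)))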

  2. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    PubMed Central

    Xu, Xianghua; Gao, Xueyong; Wan, Jian; Xiong, Naixue

    2011-01-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated in inverse proportion to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experimental results show that when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy compared with other algorithms. PMID:22163972

  3. Invariance of wearing location of Omron-BI pedometers: a validation study.

    PubMed

    Zhu, Weimo; Lee, Miyoung

    2010-11-01

    The purpose of this study was to investigate the validity and reliability evidence of the Omron BI pedometer, which could count steps taken even when worn at different locations on the body. Forty (20 males and 20 females) adults were recruited to walk wearing 5 sets of 10 BI pedometers, 1 set at a time, with 1 pedometer at each of 10 different locations. For comparison, they also wore 2 Yamax Digi-Walker SW-200 pedometers and a Dynastream AMP 331 activity monitor. The subjects walked in 3 free-living conditions: a flat sidewalk, stairs, and mixed conditions. Except for a slight decrease in accuracy in the pant pocket locations, Omron BI pedometers counted steps accurately across other locations when subjects walked on the flat sidewalk, and the performance was consistent across devices and trials. When the subjects climbed up stairs, however, the absolute error % of the pant pocket locations increased significantly (P < .05), and similar or higher error rates were found in the AMP 331 and SW-200s. The Omron BI pedometer can accurately count steps when worn at various locations on the body in free-living conditions except for front pant pocket locations, especially when climbing stairs.

  4. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.

  5. Accurate Land Company, Inc., Acadia Subdivision, Plat 1 and Plat 2

    EPA Pesticide Factsheets

    The EPA is providing notice of an Administrative Penalty Assessment in the form of an Expedited Storm Water Settlement Agreement against Accurate Land Company, Inc., a business located at 12035 University Ave., Suite 100, Clive, IA 50235, for alleged violations.

  6. System and Method of Locating Lightning Strikes

    NASA Technical Reports Server (NTRS)

    Medelius, Pedro J. (Inventor); Starr, Stanley O. (Inventor)

    2002-01-01

    A system and method of determining locations of lightning strikes has been described. The system includes multiple receivers located around an area of interest, such as a space center or airport. Each receiver monitors both sound and electric fields. Detection of an electric field pulse and a sound wave is used to calculate an area around each receiver in which the lightning is detected. A processor is coupled to the receivers to accurately determine the location of the lightning strike. The processor can manipulate the receiver data to compensate for environmental variables such as wind, temperature, and humidity. Further, each receiver processor can discriminate between distant and local lightning strikes.
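
    The underlying geometry is simple: because the electric-field pulse arrives essentially instantaneously while thunder travels at the speed of sound, the delay between the two at each receiver gives a range, and the strike lies near the intersection of the range circles. The sketch below works through that idea in two dimensions with a fixed sound speed and made-up receiver data; the real system additionally corrects for wind, temperature, and humidity.

        import numpy as np
        from scipy.optimize import least_squares

        C_SOUND = 343.0  # m/s; the real system adjusts this for weather conditions

        def range_from_delay(delay_s):
            """Distance (m) to the strike from the E-field-to-thunder delay."""
            return C_SOUND * delay_s

        def locate_strike(receivers, delays, x0=(0.0, 0.0)):
            """Least-squares intersection of the range circles (2-D sketch).

            receivers : (N, 2) receiver coordinates (m)
            delays    : (N,) E-field-to-thunder delays (s)
            """
            ranges = range_from_delay(np.asarray(delays, float))
            def residuals(p):
                return np.linalg.norm(np.asarray(receivers, float) - p, axis=1) - ranges
            return least_squares(residuals, x0).x

        # Hypothetical: three receivers around a pad, strike at roughly (1200, 800) m
        receivers = [(0, 0), (2000, 0), (0, 2000)]
        delays = [np.hypot(1200, 800) / C_SOUND,
                  np.hypot(1200 - 2000, 800) / C_SOUND,
                  np.hypot(1200, 800 - 2000) / C_SOUND]
        print(locate_strike(receivers, delays))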

  7. Locating waterfowl observations on aerial surveys

    USGS Publications Warehouse

    Butler, W.I.; Hodges, J.I.; Stehn, R.A.

    1995-01-01

    We modified standard aerial survey data collection to obtain the geographic location for each waterfowl observation on surveys in Alaska during 1987-1993. Using transect navigation with GPS (Global Positioning System), data recording on continuously running tapes, and a computer data input program, we located observations with an average deviation along transects of 214 m. The method provided flexibility in survey design and data analysis. Although developed for geese nesting near the coast of the Yukon-Kuskokwim Delta, the methods are widely applicable and were used on other waterfowl surveys in Alaska to map distribution and relative abundance of waterfowl. Accurate location data with GIS analysis and display may improve precision and usefulness of data from any aerial transect survey.

  8. Intrusion-Tolerant Location Information Services in Intelligent Vehicular Networks

    NASA Astrophysics Data System (ADS)

    Yan, Gongjun; Yang, Weiming; Shaner, Earl F.; Rawat, Danda B.

    Intelligent Vehicular Networks, known as Vehicle-to-Vehicle and Vehicle-to-Roadside wireless communications (also called Vehicular Ad hoc Networks), are revolutionizing our daily driving with better safety and more infotainment. Most, if not all, applications will depend on accurate location information. Thus, it is of importance to provide intrusion-tolerant location information services. In this paper, we describe an adaptive algorithm that detects and filters the false location information injected by intruders. Given a noisy environment of mobile vehicles, the algorithm estimates the high resolution location of a vehicle by refining low resolution location input. We also investigate results of simulations and evaluate the quality of the intrusion-tolerant location service.

  9. Verb Aspect and the Activation of Event Knowledge

    ERIC Educational Resources Information Center

    Ferretti, Todd R.; Kutas, Marta; McRae, Ken

    2007-01-01

    The authors show that verb aspect influences the activation of event knowledge with 4 novel results. First, common locations of events (e.g., arena) are primed following verbs with imperfective aspect (e.g., was skating) but not verbs with perfect aspect (e.g., had skated). Second, people generate more locative prepositional phrases as…

  10. Sudden Event Recognition: A Survey

    PubMed Central

    Suriani, Nor Surayahani; Hussain, Aini; Zulkifley, Mohd Asyraf

    2013-01-01

    Event recognition is one of the most active research areas in video surveillance fields. Advancement in event recognition systems mainly aims to provide convenience, safety and an efficient lifestyle for humanity. A precise, accurate and robust approach is necessary to enable event recognition systems to respond to sudden changes in various uncontrolled environments, such as the case of an emergency, physical threat and a fire or bomb alert. The performance of sudden event recognition systems depends heavily on the accuracy of low level processing, like detection, recognition, tracking and machine learning algorithms. This survey aims to detect and characterize a sudden event, which is a subset of an abnormal event in several video surveillance applications. This paper discusses the following in detail: (1) the importance of a sudden event over a general anomalous event; (2) frameworks used in sudden event recognition; (3) the requirements and comparative studies of a sudden event recognition system and (4) various decision-making approaches for sudden event recognition. The advantages and drawbacks of using 3D images from multiple cameras for real-time application are also discussed. The paper concludes with suggestions for future research directions in sudden event recognition. PMID:23921828

  11. Swift Gamma-Ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2004-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow-field instruments capable of multi-wavelength (UV, optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating nature and brief-duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements. The Swift mission is managed by the GSFC and includes an international team of contributors, each bringing a unique perspective that has proven invaluable to the mission. The spacecraft bus, provided by Spectrum Astro, Inc., was procured through a Rapid Spacecraft Development Office (RSDO) contract by the GSFC. There are three instruments: the Burst Alert Telescope (BAT) provided by the GSFC; the X-Ray Telescope (XRT) provided by a team led by the Pennsylvania State University (PSU); and the Ultra-Violet Optical Telescope (UVOT), again managed by PSU. The Mission Operations Center (MOC) was developed by and is located at PSU. Science archiving and data analysis centers are located at the GSFC, in the UK and in Italy.

  12. Mixed-Mode Slip Behavior of the Altotiberina Low-Angle Normal Fault System (Northern Apennines, Italy) through High-Resolution Earthquake Locations and Repeating Events

    NASA Astrophysics Data System (ADS)

    Valoroso, Luisa; Chiaraluce, Lauro; Di Stefano, Raffaele; Monachesi, Giancarlo

    2017-12-01

    We generated a 4.5-year-long (2010-2014) high-resolution earthquake catalogue, composed of 37,000 events with ML < 3.9 and a completeness magnitude MC = 0.5, to report on the seismic activity of the Altotiberina (ATF) low-angle normal fault system and to shed light on the mechanical behavior and seismic potential of this fault, which is capable of generating a M7 event. Seismicity defines the geometry of the fault system composed of the low-angle (15°-20°) ATF, extending for 50 km along strike and between 4 and 16 km at depth, showing a 1.5 km thick fault zone made of multiple subparallel slipping planes, and a complex network of synthetic/antithetic higher-angle segments located in the ATF hanging wall (HW) that can be traced along strike for up to 35 km. Ninety percent of the recorded seismicity occurs along the high-angle HW faults during a series of minor, sometimes long-lasting (months) seismic sequences with multiple MW3+ mainshocks. Remaining earthquakes (ML < 2.4) are released instead along the low-angle ATF at a constant rate of 2.2 events per day. Within the ATF-related seismicity, we found 97 clusters of repeating earthquakes (RE), mostly consisting of doublets occurring within short interevent times (hours). RE are located within the geodetically recognized creeping portions of the ATF, around the main locked asperity. The rate of occurrence of RE seems quite synchronous with the ATF-HW seismic release, suggesting that creeping may guide the strain partitioning in the ATF system. The seismic moment released by the ATF seismicity accounts for 30% of the geodetic one, implying aseismic deformation. The ATF-seismicity pattern is thus consistent with a mixed-mode (seismic and aseismic) slip behavior.

  13. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-12-15

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.
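
    The source-determination step can be pictured as finding the barycenter of the coverage graph, i.e. the node that minimizes the sum of weighted shortest-path distances to all other alarmed nodes. The sketch below builds such a weighted graph with networkx and queries its barycenter; the node names and edge weights are made up and only stand in for the deviation-based weighting described above.

        import networkx as nx

        # Hypothetical event-coverage graph: nodes are alarmed sensor nodes, edge
        # weights reflect how strongly neighboring readings deviate together.
        G = nx.Graph()
        G.add_weighted_edges_from([
            ("s1", "s2", 0.3), ("s2", "s3", 0.2), ("s3", "s4", 0.4),
            ("s2", "s5", 0.5), ("s3", "s5", 0.3), ("s5", "s6", 0.7),
        ])

        # The barycenter minimizes the sum of weighted shortest-path distances to
        # all other nodes and serves here as the candidate event source.
        print(nx.barycenter(G, weight="weight"))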

  14. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    PubMed Central

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394

  15. Magnitudes and locations of the 1811-1812 New Madrid, Missouri, and the 1886 Charleston, South Carolina, earthquakes

    USGS Publications Warehouse

    Bakun, W.H.; Hopper, M.G.

    2004-01-01

    We estimate locations and moment magnitudes M and their uncertainties for the three largest events in the 1811-1812 sequence near New Madrid, Missouri, and for the 1 September 1886 event near Charleston, South Carolina. The intensity magnitude M1, our preferred estimate of M, is 7.6 for the 16 December 1811 event that occurred in the New Madrid seismic zone (NMSZ) on the Bootheel lineament or on the Blytheville seismic zone. M1, is 7.5 for the 23 January 1812 event for a location on the New Madrid north zone of the NMSZ and 7.8 for the 7 February 1812 event that occurred on the Reelfoot blind thrust of the NMSZ. Our preferred locations for these events are located on those NMSZ segments preferred by Johnston and Schweig (1996). Our estimates of M are 0.1-0.4 M units less than those of Johnston (1996b) and 0.3-0.5 M units greater than those of Hough et al. (2000). M1 is 6.9 for the 1 September 1886 event for a location at the Summerville-Middleton Place cluster of recent small earthquakes located about 30 km northwest of Charleston.

  16. Observing Triggered Earthquakes Across Iran with Calibrated Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Karasozen, E.; Bergman, E.; Ghods, A.; Nissen, E.

    2016-12-01

    We investigate earthquake triggering phenomena in Iran by analyzing patterns of aftershock activity around mapped surface ruptures. Iran has an intense level of seismicity (> 40,000 events listed in the ISC Bulletin since 1960) due to it accommodating a significant portion of the continental collision between Arabia and Eurasia. There are nearly thirty mapped surface ruptures associated with earthquakes of M 6-7.5, mostly in eastern and northwestern Iran, offering a rich potential to study the kinematics of earthquake nucleation, rupture propagation, and subsequent triggering. However, catalog earthquake locations are subject to up to 50 km of location bias from the combination of unknown Earth structure and unbalanced station coverage, making it challenging to assess both the rupture directivity of larger events and the spatial patterns of their aftershocks. To overcome this limitation, we developed a new two-tiered multiple-event relocation approach to obtain hypocentral parameters that are minimally biased and have realistic uncertainties. In the first stage, locations of small clusters of well-recorded earthquakes at local spatial scales (100s of events across 100 km length scales) are calibrated either by using near-source arrival times or independent location constraints (e.g. local aftershock studies, InSAR solutions), using an implementation of the Hypocentroidal Decomposition relocation technique called MLOC. Epicentral uncertainties are typically less than 5 km. Then, these events are used as prior constraints in the code BayesLoc, a Bayesian relocation technique that can handle larger datasets, to yield region-wide calibrated hypocenters (1000s of events over 1000 km length scales). With locations and errors both calibrated, the pattern of aftershock activity can reveal the type of the earthquake triggering: dynamic stress changes promote an increase in the seismicity rate in the direction of unilateral propagation, whereas static stress changes should

  17. Chronology: MSFC Space Station program, 1982 - present. Major events

    NASA Technical Reports Server (NTRS)

    Whalen, Jessie E. (Compiler); Mckinley, Sarah L. (Compiler); Gates, Thomas G. (Compiler)

    1988-01-01

    The Marshall Space Flight Center (MSFC) maintains an active program to capture historical information and documentation on the MSFC's roles regarding Space Shuttle and Space Station. Marshall History Report 12, called Chronology: MSFC Space Station Program, 1982-Present, is presented. It contains synopses of major events listed according to the dates of their occurrence. Indices follow the synopses and provide additional data concerning the events listed. The Event Index provides a brief listing of all the events without synopses. The Element Index lists the specific elements of the Space Station Program under consideration in the events. The Location Index lists the locations where the events took place. The indices and synopses may be cross-referenced by using dates.

  18. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.

    2005-12-01

    Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on a method, Pattern Informatics (PI), that locates likely sites for future large earthquakes based on large changes in the activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represent a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
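
    The verification machinery described here reduces to a 2x2 contingency table per threshold and a hit-rate/false-alarm-rate pair plotted on an ROC diagram. A generic sketch under those definitions (the synthetic grid and threshold sweep are invented for illustration; this is not the PI/RI code):

```python
import numpy as np

def contingency_scores(forecast_prob, observed, threshold):
    """Binary forecast verification over grid cells: observed is 1 where a
    target event occurred, 0 otherwise; a cell is 'forecast' when its
    probability (or score) meets the threshold."""
    warn = forecast_prob >= threshold
    hits = np.sum(warn & (observed == 1))
    misses = np.sum(~warn & (observed == 1))
    false_alarms = np.sum(warn & (observed == 0))
    correct_negatives = np.sum(~warn & (observed == 0))
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate, false_alarm_rate

# Sweeping the threshold and plotting (false_alarm_rate, hit_rate) traces the
# ROC curve used to compare two competing forecasts (e.g. PI versus RI).
rng = np.random.default_rng(0)
score = rng.random(1000)
truth = (rng.random(1000) < 0.3 * score).astype(int)   # synthetic "observations"
print(contingency_scores(score, truth, threshold=0.5))
```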

  19. Researchermap: a tool for visualizing author locations using Google maps.

    PubMed

    Rastegar-Mojarad, Majid; Bales, Michael E; Yu, Hong

    2013-01-01

    We hereby present ResearcherMap, a tool to visualize the locations of authors of scholarly papers. In response to a query, the system returns a map of author locations. To develop the system we first populated a database of author locations, geocoding institution locations for all available institutional affiliation data in our database. The database includes all authors of Medline papers from 1990 to 2012. We conducted a formative heuristic usability evaluation of the system and measured the system's accuracy and performance. The accuracy of identifying the correct institutional address is 97.5% in our system.

  20. Formation events of shoreline sand waves on a gravel beach

    NASA Astrophysics Data System (ADS)

    Arriaga, Jaime; Falqués, Albert; Ribas, Francesca; Crews, Eddie

    2018-06-01

    Kilometric-scale shoreline sand waves (KSSW) have been observed in the north-east flank of the Dungeness Cuspate Foreland (southeastern coast of the UK). They consist of two bumps separated by embayments with a 350-450-m spacing. We have analysed 36 shoreline surveys of 2-km length using the Discrete Fourier Transformation (DFT), from 2005 to 2016, and seven topographic surveys encompassing the intertidal zone, from 2010 to 2016. The data set shows two clear formation events. In order to test the role of high-angle waves on the KSSW formation, the 10-year wave series is propagated from the wave buoy located at 43 m depth up to a location in front of the undulations at 4 m depth using the SWAN wave model. The dominating SW waves arrive with a very high incidence angle (˜ 80°) while the NE waves arrive almost shore normal. The ratio R, which measures the degree of dominance of high-angle waves with respect to low-angle waves, correlates well with the shoreline DFT magnitude values of the observed wavelength undulations. In particular, the highest R values coincide with the formation events. Finally, a linear stability model based on the one-line approximation is applied to the Dungeness profile and the 10-year propagated wave series. It predicts accurately the formation moments, with positive growth rates in the correct order of magnitude for wavelengths similar to the observed ones. All these results confirm that the shoreline undulations in Dungeness are self-organized and that the underlying formation mechanism is the high-angle wave instability. The two detected formation events provide a unique opportunity to validate the existing morphodynamic models that include such instability.
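
    The spectral step in this record (a DFT of each shoreline survey to track the 350-450 m undulations) can be sketched generically; the 10 m sampling, synthetic survey, and normalization below are assumptions for illustration, not the authors' processing chain:

```python
import numpy as np

def shoreline_spectrum(shoreline, dx):
    """Amplitude spectrum of a detrended shoreline-position signal sampled
    every dx metres alongshore; returns (wavelengths, amplitudes)."""
    n = len(shoreline)
    idx = np.arange(n)
    trend = np.polyval(np.polyfit(idx, shoreline, 1), idx)   # remove linear trend
    amplitudes = np.abs(np.fft.rfft(shoreline - trend)) / n
    freqs = np.fft.rfftfreq(n, d=dx)
    return 1.0 / freqs[1:], amplitudes[1:]                   # drop the zero-frequency term

# toy survey: 2 km sampled every 10 m with a 400 m undulation plus noise
x = np.arange(0.0, 2000.0, 10.0)
survey = 2.0 * np.sin(2 * np.pi * x / 400.0) + np.random.default_rng(1).normal(0.0, 0.5, x.size)
wavelengths, amplitudes = shoreline_spectrum(survey, dx=10.0)
band = (wavelengths >= 350.0) & (wavelengths <= 450.0)
print("peak DFT magnitude in the 350-450 m band:", amplitudes[band].max())
```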

  1. Formation events of shoreline sand waves on a gravel beach

    NASA Astrophysics Data System (ADS)

    Arriaga, Jaime; Falqués, Albert; Ribas, Francesca; Crews, Eddie

    2018-05-01

    Kilometric-scale shoreline sand waves (KSSW) have been observed in the north-east flank of the Dungeness Cuspate Foreland (southeastern coast of the UK). They consist of two bumps separated by embayments with a 350-450-m spacing. We have analysed 36 shoreline surveys of 2-km length using the Discrete Fourier Transformation (DFT), from 2005 to 2016, and seven topographic surveys encompassing the intertidal zone, from 2010 to 2016. The data set shows two clear formation events. In order to test the role of high-angle waves on the KSSW formation, the 10-year wave series is propagated from the wave buoy located at 43 m depth up to a location in front of the undulations at 4 m depth using the SWAN wave model. The dominating SW waves arrive with a very high incidence angle (˜ 80°) while the NE waves arrive almost shore normal. The ratio R, which measures the degree of dominance of high-angle waves with respect to low-angle waves, correlates well with the shoreline DFT magnitude values of the observed wavelength undulations. In particular, the highest R values coincide with the formation events. Finally, a linear stability model based on the one-line approximation is applied to the Dungeness profile and the 10-year propagated wave series. It predicts accurately the formation moments, with positive growth rates in the correct order of magnitude for wavelengths similar to the observed ones. All these results confirm that the shoreline undulations in Dungeness are self-organized and that the underlying formation mechanism is the high-angle wave instability. The two detected formation events provide a unique opportunity to validate the existing morphodynamic models that include such instability.

  2. Seismic and Infrasound Location

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arrowsmith, Stephen J.; Begnaud, Michael L.

    2014-03-19

    This presentation includes slides on Signal Propagation Through the Earth/Atmosphere Varies at Different Scales; 3D Seismic Models: RSTT; Ray Coverage (Pn); Source-Specific Station Corrections (SSSCs); RSTT Conclusions; SALSA3D (SAndia LoS Alamos) Global 3D Earth Model for Travel Time; Comparison of IDC SSSCs to RSTT Predictions; SALSA3D; Validation and Model Comparison; DSS Lines in the Siberian Platform; DSS Line CRA-4 Comparison; Travel Time Δak135; Travel Time Prediction Uncertainty; SALSA3D Conclusions; Infrasound Data Processing: An example event; Infrasound Data Processing: An example event; Infrasound Location; How does BISL work?; BISL: Application to the 2013 DPRK Test; and BISL: Ongoing Research.

  3. An Improved Source-Scanning Algorithm for Locating Earthquake Clusters or Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Liao, Y.; Kao, H.; Hsu, S.

    2010-12-01

    The Source-scanning Algorithm (SSA) was originally introduced in 2004 to locate non-volcanic tremors. Its application was later expanded to the identification of earthquake rupture planes and the near-real-time detection and monitoring of landslides and mud/debris flows. In this study, we further improve SSA for the purpose of locating earthquake clusters or aftershock sequences when only a limited number of waveform observations are available. The main improvements include the application of a ground motion analyzer to separate P and S waves, the automatic determination of resolution based on the grid size and time step of the scanning process, and a modified brightness function to utilize constraints from multiple phases. Specifically, the improved SSA (named ISSA) addresses two major issues related to locating earthquake clusters/aftershocks. The first is the massive amount of time and labour needed to locate a large number of seismic events manually; the second is to efficiently and correctly identify the same phase across the entire recording array when multiple events occur closely in time and space. To test the robustness of ISSA, we generate synthetic waveforms consisting of 3 separated events such that individual P and S phases arrive at different stations in different order, thus making correct phase picking nearly impossible. Using these very complicated waveforms as the input, the ISSA scans the entire model space for possible combinations of time and location for the existence of seismic sources. The scanning results successfully associate the various phases from each event at all stations, and correctly recover the input. To further demonstrate the advantage of ISSA, we apply it to the waveform data collected by a temporary OBS array for the aftershock sequence of an offshore earthquake southwest of Taiwan. The overall signal-to-noise ratio is inadequate for locating small events; and the precise arrival times of P and S phases are difficult to
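
    The scanning idea at the heart of SSA/ISSA, stacking station waveforms at the arrival times predicted for every trial source position and origin time and keeping the brightest combination, can be illustrated with a deliberately simplified sketch; the constant velocity, single phase, and brightness definition here are assumptions for illustration only:

```python
import numpy as np

def scan_sources(envelopes, stations, grid, origin_times, dt, v=5.0):
    """Toy source scanning. envelopes[i] is a characteristic function (e.g. a
    normalized envelope) for station i sampled at interval dt; stations and
    grid hold Cartesian coordinates (km); v is a uniform velocity (km/s)."""
    best = (None, None, -np.inf)                     # (origin time, point, brightness)
    for t0 in origin_times:
        for point in grid:
            brightness = 0.0
            for env, sta in zip(envelopes, stations):
                tt = np.linalg.norm(np.asarray(point) - np.asarray(sta)) / v
                idx = int(round((t0 + tt) / dt))     # sample index of predicted arrival
                if 0 <= idx < len(env):
                    brightness += env[idx]
            if brightness > best[2]:
                best = (t0, point, brightness)
    return best
```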

  4. Accurate beacon positioning method for satellite-to-ground optical communication.

    PubMed

    Wang, Qiang; Tong, Ling; Yu, Siyuan; Tan, Liying; Ma, Jing

    2017-12-11

    In satellite laser communication systems, accurate positioning of the beacon is essential for establishing a steady laser communication link. For satellite-to-ground optical communication, the main influencing factors on the acquisition of the beacon are background noise and atmospheric turbulence. In this paper, we consider the influence of background noise and atmospheric turbulence on the beacon in satellite-to-ground optical communication, and propose a new locating algorithm for the beacon, which takes the correlation coefficient obtained by curve fitting for image data as weights. By performing a long distance laser communication experiment (11.16 km), we verified the feasibility of this method. Both simulation and experiment showed that the new algorithm can accurately obtain the position of the centroid of the beacon. Furthermore, for the distortion of the light spot through atmospheric turbulence, the locating accuracy of the new algorithm was 50% higher than that of the conventional gray centroid algorithm. This new approach will be beneficial for the design of satellite-to-ground optical communication systems.
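
    The conventional gray-centroid baseline mentioned in this record is just an intensity-weighted mean of pixel coordinates. A minimal sketch of that baseline (the background level and synthetic spot are illustrative; the paper's correlation-coefficient weighting from curve fitting is not reproduced here):

```python
import numpy as np

def gray_centroid(image, background=0.0):
    """Intensity-weighted centroid (row, column) of a beacon-spot image."""
    img = np.clip(image.astype(float) - background, 0.0, None)   # subtract background level
    total = img.sum()
    if total == 0:
        raise ValueError("no signal above background")
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

# toy spot: Gaussian beacon centred near (12.3, 20.7) plus weak noise
yy, xx = np.indices((40, 40))
spot = np.exp(-((yy - 12.3) ** 2 + (xx - 20.7) ** 2) / 8.0)
spot += np.random.default_rng(2).normal(0.0, 0.02, spot.shape)
print(gray_centroid(spot, background=0.05))
```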

  5. A new method to estimate location and slip of simulated rock failure events

    NASA Astrophysics Data System (ADS)

    Heinze, Thomas; Galvan, Boris; Miller, Stephen Andrew

    2015-05-01

    At the laboratory scale, identifying and locating acoustic emissions (AEs) is a common method for short term prediction of failure in geomaterials. Above-average AE typically precedes the failure process and is easily measured. At larger scales, an increase in micro-seismic activity sometimes precedes large earthquakes (e.g. Tohoku, L'Aquila, oceanic transforms), and can be used to assess seismic risk. The goal of this work is to develop a methodology and numerical algorithms for extracting a measurable quantity analogous to AE arising from the solution of equations governing rock deformation. Since there is no physical property to quantify AE derivable from the governing equations, an appropriate rock-mechanical analog needs to be found. In this work, we identify a general behavior of the AE generation process preceding rock failure. This behavior includes arbitrary localization of low magnitude events during the pre-failure stage, followed by an increase in number and amplitude, and finally localization around the incipient failure plane during macroscopic failure. We propose the deviatoric strain rate as the numerical analog that mimics this behavior, and develop two different algorithms designed to detect rapid increases in deviatoric strain using moving averages. The numerical model solves a fully poro-elasto-plastic continuum model and is coupled to a two-phase flow model. We test our model by comparing simulation results with experimental data of drained compression and of fluid injection experiments. We find for both cases that the occurrence and amplitude of our AE analog mimic the observed general behavior of the AE generation process. Our technique can be extended to modeling at the field scale, possibly providing a mechanistic basis for seismic hazard assessment from seismicity that occasionally precedes large earthquakes.
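
    The detection step described here, flagging rapid increases in the deviatoric strain rate with moving averages, can be sketched as a short-window/long-window trigger in the spirit of an STA/LTA picker; the window lengths and trigger ratio below are illustrative assumptions, not the authors' values:

```python
import numpy as np

def moving_average(x, n):
    return np.convolve(x, np.ones(n) / n, mode="same")

def ae_analog_events(deviatoric_strain_rate, short_win=5, long_win=50, ratio=3.0):
    """Return sample indices where the short-window mean of the deviatoric
    strain rate exceeds `ratio` times its long-window mean."""
    short = moving_average(deviatoric_strain_rate, short_win)
    long_ = moving_average(deviatoric_strain_rate, long_win)
    eps = 1e-12                                   # avoid division by zero in quiet intervals
    return np.where(short / (long_ + eps) > ratio)[0]
```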

  6. Contribution of Infrasound to IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif Mohamed; Medinskaya, Tatiana; Mialle, Pierrick

    2016-04-01

    Until 2003, two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003, but infrasound detections could not be used by the Global Association (GA) software due to the unmanageably high number of spurious associations. Offline improvements of the automatic processing took place to reduce the number of false detections to a reasonable level. In February 2010 the infrasound technology was reintroduced to IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which will significantly improve event location. Examples of REB events which were detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile 2015) and large surface explosions (e.g. Tianjin, China 2015). Quarry blasts and large earthquakes belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. The presence of an infrasound detection associated with an event from a mining area indicates a surface explosion. Satellite imaging and a database of active mines can be used to confirm the origin of such events. This presentation will summarize the contribution of 6 years of infrasound data to IDC bulletins and provide examples of events recorded at the IMS infrasound network. Results of this study may help to improve the location of small events with observations on infrasound stations.

  7. Multi-Array Detection, Association and Location of Infrasound and Seismo-Acoustic Events in Utah

    DTIC Science & Technology

    2008-09-30

    techniques for detecting, associating, and locating infrasound signals at single and multiple arrays and then combining the processed results with ... was detected and located by both infrasound and seismic instruments (Figure 3). Infrasound signals at all three arrays, from one of the explosions, are ...

  8. A novel method to accurately locate and count large numbers of steps by photobleaching

    PubMed Central

    Tsekouras, Konstantinos; Custer, Thomas C.; Jashnsaz, Hossein; Walter, Nils G.; Pressé, Steve

    2016-01-01

    Photobleaching event counting is a single-molecule fluorescence technique that is increasingly being used to determine the stoichiometry of protein and RNA complexes composed of many subunits in vivo as well as in vitro. By tagging protein or RNA subunits with fluorophores, activating them, and subsequently observing as the fluorophores photobleach, one obtains information on the number of subunits in a complex. The noise properties in a photobleaching time trace depend on the number of active fluorescent subunits. Thus, as fluorophores stochastically photobleach, noise properties of the time trace change stochastically, and these varying noise properties have created a challenge in identifying photobleaching steps in a time trace. Although photobleaching steps are often detected by eye, this method only works for high individual fluorophore emission signal-to-noise ratios and small numbers of fluorophores. With filtering methods or currently available algorithms, it is possible to reliably identify photobleaching steps for up to 20–30 fluorophores and signal-to-noise ratios down to ∼1. Here we present a new Bayesian method of counting steps in photobleaching time traces that takes into account stochastic noise variation in addition to complications such as overlapping photobleaching events that may arise from fluorophore interactions, as well as on-off blinking. Our method is capable of detecting ≥50 photobleaching steps even for signal-to-noise ratios as low as 0.1, can find up to ≥500 steps for more favorable noise profiles, and is computationally inexpensive. PMID:27654946
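
    For contrast with the Bayesian approach of this record, the naive filtered-derivative strategy it improves upon can be sketched in a few lines: smooth the trace, difference it, and call a step wherever the drop is roughly one fluorophore's worth of intensity. The window, unit-step size, and merging rule are assumptions for illustration, and this is emphatically not the authors' method:

```python
import numpy as np

def naive_step_detect(trace, unit_step, win=25):
    """Locate candidate photobleaching steps by thresholding the differenced,
    moving-average-smoothed trace and merging detections closer than one window."""
    smooth = np.convolve(trace, np.ones(win) / win, mode="valid")
    diff = np.diff(smooth)
    # a step of size unit_step is spread over ~win samples of the smoothed trace
    candidates = np.where(diff < -0.5 * unit_step / win)[0]
    steps = []
    for idx in candidates:
        if not steps or idx - steps[-1] > win:
            steps.append(idx)
    return steps          # len(steps) is the naive step count
```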

  9. Enhancement and identification of dust events in the south-west region of Iran using satellite observations

    NASA Astrophysics Data System (ADS)

    Taghavi, F.; Owlad, E.; Ackerman, S. A.

    2017-03-01

    South-west Asia, including the Middle East, is one of the regions most prone to dust storm events. In recent years, there has been an increase in the occurrence of these environmental and meteorological phenomena. Remote sensing can serve as an applicable method to detect and also characterise these events. In this study, two dust enhancement algorithms were used to investigate the behaviour of dust events using satellite data, compare with numerical model output and other satellite products, and finally validate with in-situ measurements. The results show that the use of the thermal infrared algorithm enhances dust more accurately. The aerosol optical depth from MODIS and output of the Dust Regional Atmospheric Model (DREAM8b) are applied for comparing the results. Ground-based observations of synoptic stations and sun photometers are used for validating the satellite products. To find the transport direction, the locations of the dust sources, and the synoptic situations during these events, model outputs (HYSPLIT and NCEP/NCAR) are presented. Comparing the results with synoptic maps and the model outputs showed that using enhancement algorithms is a more reliable way than any other MODIS products or model outputs to enhance the dust.

  10. Locating and Modeling Regional Earthquakes with Broadband Waveform Data

    NASA Astrophysics Data System (ADS)

    Tan, Y.; Zhu, L.; Helmberger, D.

    2003-12-01

    Retrieving source parameters of small earthquakes (Mw < 4.5), including mechanism, depth, location and origin time, relies on local and regional seismic data. Although source characterization for such small events has reached a satisfactory stage in some places with a dense seismic network, such as TriNet, Southern California, a worthwhile revisit of the historical events in these places, or an effective, real-time investigation of small events in many other places, where normally only a few local waveforms plus some short-period recordings are available, remains a problem. To address this issue, we introduce a new type of approach that estimates location, depth, origin time and fault parameters based on 3-component waveform matching in terms of separated Pnl, Rayleigh and Love waves. We show that most local waveforms can be well modeled by a regionalized 1-D model plus different timing corrections for Pnl, Rayleigh and Love waves at relatively long periods, i.e., 4-100 sec for Pnl, and 8-100 sec for surface waves, except for a few anomalous paths involving greater structural complexity; meanwhile, these timing corrections reveal similar azimuthal patterns for well-located cluster events, despite their different focal mechanisms. Thus, we can calibrate the paths separately for Pnl, Rayleigh and Love waves with the timing corrections from well-determined events widely recorded by a dense modern seismic network or a temporary PASSCAL experiment. In return, we can locate events and extract their fault parameters by waveform matching for the available waveform data, which could be from as few as two stations, assuming timing corrections from the calibration. The accuracy of the obtained source parameters is subject to the error carried by the events used for the calibration. The detailed method requires a Green's function library constructed from a regionalized 1-D model together with necessary calibration information, and adopts a grid search strategy for both hypocenter and

  11. Hance_WFSR flasher locations

    EPA Pesticide Factsheets

    This entry contains two files. The first file, Hance_WFSR Flasher locations.xlxs, contains information describing the location of installed landmark 'flashers' consisting of 2 square aluminum metal tags. Each tag was inscribed with a number to aid field personnel in the identification of landmark location within the West Fork Smith River watershed in southern coastal Oregon. These landmarks were used to calculate stream distances between points in the watershed, including distances between tagging locations and detection events for tagged fish. A second file, named Hance_fish_detection_data1.xlxs, contains information on the detection of tagged fish within the West Fork Smith River stream network. The file includes both the location where the fish were tagged and where they were subsequently detected. Together with the information in the WFSR flasher location dataset, these data allow estimation of the minimum distances and directions moved by juvenile coho salmon during the fall transition period. A map locator is provided in Figure 1 in the accompanying manuscript: Dalton J. Hance, Lisa M. Ganio, Kelly M. Burnett & Joseph L. Ebersole (2016) Basin-Scale Variation in the Spatial Pattern of Fall Movement of Juvenile Coho Salmon in the West Fork Smith River, Oregon, Transactions of the American Fisheries Society, 145:5, 1018-1034, DOI: 10.1080/00028487.2016.1194892. This dataset is associated with the following publication: Hance, D.J., L.M. Ganio, K.M. Burnett, an

  12. Holographic Location of Distant Points (PREPRINT)

    DTIC Science & Technology

    2010-06-01

    ... respects and the nonimaging systems we have discussed earlier (1, 2) have significant advantages. This paper shows how to use holograms to construct a flat, solid, small, accurate nonimaging point location system.

  13. Underwater hydrophone location survey

    NASA Technical Reports Server (NTRS)

    Cecil, Jack B.

    1993-01-01

    The Atlantic Undersea Test and Evaluation Center (AUTEC) is a U.S. Navy test range located on Andros Island, Bahamas, and a Division of the Naval Undersea Warfare Center (NUWC), Newport, RI. The Headquarters of AUTEC is located at a facility in West Palm Beach, FL. AUTEC's primary mission is to provide the U.S. Navy with a deep-water test and evaluation facility for making underwater acoustic measurements, testing and calibrating sonars, and providing accurate underwater, surface, and in-air tracking data on surface ships, submarines, aircraft, and weapon systems. Many of these programs are in support of Antisubmarine Warfare (ASW), undersea research and development programs, and Fleet assessment and operational readiness trials. Most tests conducted at AUTEC require precise underwater tracking (plus or minus 3 yards) of multiple acoustic signals emitted with the correct waveshape and repetition criteria from either a surface craft or underwater vehicle.

  14. Hunting for shallow slow-slip events at Cascadia

    NASA Astrophysics Data System (ADS)

    Tan, Y. J.; Bletery, Q.; Fan, W.; Janiszewski, H. A.; Lynch, E.; McCormack, K. A.; Phillips, N. J.; Rousset, B.; Seyler, C.; French, M. E.; Gaherty, J. B.; Regalla, C.

    2017-12-01

    The discovery of slow earthquakes at subduction zones is one of the major breakthroughs of Earth science in the last two decades. Slow earthquakes involve a wide spectrum of fault slip behaviors and seismic radiation patterns, such as tremor, low-frequency earthquakes, and slow-slip events. The last of these is particularly interesting due to the large moment release accompanied by minimal ground shaking. Slow-slip events have been reported at various subduction zones; most of these slow-slip events are located down-dip of the megathrust seismogenic zone, while a few up-dip cases have recently been observed at Nankai and New Zealand. Up-dip slow-slip events illuminate the structure of faulting environments and rupture mechanisms of tsunami earthquakes. Their possible presence and location at a particular subduction zone can help assess earthquake and tsunami hazard for that region. However, their typical location distant from the coast requires the development of techniques using offshore instrumentation. Here, we investigate the absolute pressure gauges (APG) of the Cascadia Initiative, a four-year amphibious seismic experiment, to search for possible shallow up-dip slow-slip events in the Cascadia subduction zone. These instruments are collocated with ocean bottom seismometers (OBS) and located close to buoys and onshore GPS stations, offering the opportunity to investigate the utility of multiple datasets. Ultimately, we aim to develop a protocol to analyze APG data for offshore shallow slow-slip event detections and quantify uncertainties, with direct applications to understanding the up-dip subduction interface system in Cascadia.

  15. The role of suspension events in cross-shore and longshore suspended sediment transport in the surf zone

    USGS Publications Warehouse

    Jaffe, Bruce E.

    2015-01-01

    Suspension of sand in the surf zone is intermittent. Especially striking in a time series of concentration are periods of intense suspension, suspension events, when the water column suspended sediment concentration is an order of magnitude greater than the mean concentration. The prevalence, timing, and contribution of suspension events to cross-shore and longshore suspended sediment transport are explored using field data collected in the inner half of the surf zone during a large storm at Duck, NC. Suspension events are defined as periods when the concentration is above a threshold. Events tended to occur during onshore flow under the wave crest, resulting in an onshore contribution to the suspended sediment transport. Even though large events occurred less than 10 percent of the total time, at some locations onshore transport associated with suspension events was greater than mean-current driven offshore-directed transport during non-event periods, causing the net suspended sediment transport to be onshore. Events and fluctuations in longshore velocity were not correlated. However, events did increase the longshore suspended sediment transport by approximately the amount they increase the mean concentration, which can be up to 35%. Because of the lack of correlation, the longshore suspended sediment transport can be modeled without considering the details of the intensity and time of events as the vertical integration of the product of the time-averaged longshore velocity and an event-augmented time-averaged concentration. However, to accurately model cross-shore suspended sediment transport, the timing and intensity of suspension events must be reproduced.
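
    The event definition and transport decomposition described in this record lend themselves to a compact calculation: threshold the concentration series, then split the mean flux into event and non-event parts. A generic sketch (variable names and the threshold rule are assumptions, not the study's exact processing):

```python
import numpy as np

def event_decomposition(u, c, threshold):
    """Split the mean suspended-sediment flux <u*c> into contributions from
    suspension events (c above threshold) and non-event periods.
    u: cross-shore velocity (onshore positive); c: concentration; same sampling."""
    in_event = c > threshold
    total_flux = np.mean(u * c)
    event_flux = np.mean(np.where(in_event, u * c, 0.0))
    background_flux = total_flux - event_flux
    return total_flux, event_flux, background_flux, in_event.mean()   # last: fraction of time in events
```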

  16. The status of accurately locating forest inventory and analysis plots using the Global Positioning System

    Treesearch

    Michael Hoppus; Andrew Lister

    2007-01-01

    Historically, field crews used Global Positioning System (GPS) coordinates to establish and relocate plots, as well as document their general location. During the past 5 years, the increase in Geographic Information System (GIS) capabilities and in customer requests to use the spatial relationships between Forest Inventory and Analysis (FIA) plot data and other GIS...

  17. Financial impact of inaccurate Adverse Event recording post Hip Fracture surgery: Addendum to 'Adverse event recording post hip fracture surgery'.

    PubMed

    Lee, Matthew J; Doody, Kevin; Mohamed, Khalid M S; Butler, Audrey; Street, John; Lenehan, Brian

    2018-02-15

    A 2011 study (Doody et al., Ir Med J 106(10):300-302, 2013) compared inpatient adverse events recorded prospectively at the point of care with adverse events recorded by the national Hospital In-Patient Enquiry (HIPE) system. In that study, a single-centre university hospital in Ireland treating acute hip fractures in an orthopaedic unit recorded 39 patients over a 2-month (August-September 2011) period, with 55 adverse events recorded prospectively, in contrast to the HIPE record of 13 (23.6%) adverse events. With the recent change in the Irish hospital funding model from a block grant to 'activity-based funding' on the basis of case load and case complexity, the hospital financial allocation is dependent on accurate case complexity coding. A retrospective assessment of the financial implications of the two methods of adverse incident recording was carried out. A total of €39,899 in 'missed funding' over 2 months was calculated when the ward-based, prospectively collected data were compared to the national HIPE data. Accurate data collection is paramount in facilitating activity-based funding, to improve patient care and ensure the appropriate allocation of resources.

  18. Close binding of identity and location in visual feature perception

    NASA Technical Reports Server (NTRS)

    Johnston, J. C.; Pashler, H.

    1990-01-01

    The binding of identity and location information in disjunctive feature search was studied. Ss searched a heterogeneous display for a color or a form target, and reported both target identity and location. To avoid better than chance guessing of target identity (by choosing the target less likely to have been seen), the difficulty of the two targets was equalized adaptively; a mathematical model was used to quantify residual effects. A spatial layout was used that minimized postperceptual errors in reporting location. Results showed strong binding of identity and location perception. After correction for guessing, no perception of identity without location was found. A weak trend was found for accurate perception of target location without identity. We propose that activated features generate attention-calling "interrupt" signals, specifying only location; attention then retrieves the properties at that location.

  19. Joint Inversion of Source Location and Source Mechanism of Induced Microseismics

    NASA Astrophysics Data System (ADS)

    Liang, C.

    2014-12-01

    The seismic source mechanism is a useful property to indicate the source physics and the stress and strain distribution at regional, local and micro scales. In this study we jointly invert source mechanisms and locations for microseismic events induced by fluid fracturing treatments in the oil and gas industry. For events that are big enough to see waveforms, quite a few techniques can be applied to invert the source mechanism, including waveform inversion, first-polarity inversion and many other methods and variants based on these methods. However, for events that are too small to identify in seismic traces, such as the microseismic events induced by fluid fracturing in the oil and gas industry, a source scanning algorithm (SSA for short) with waveform stacking is usually applied. At the same time, a joint inversion of location and source mechanism is possible, but at the cost of a high computation budget. The algorithm is thereby called the Source Location and Mechanism Scanning Algorithm, SLMSA for short. In this case, for a given velocity structure, all possible combinations of source locations (X, Y and Z) and source mechanisms (strike, dip and rake) are used to compute travel times and polarities of waveforms. Correcting normal-moveout times and polarities, and stacking all waveforms, the (X, Y, Z, strike, dip, rake) combination that gives the strongest stacked waveform is identified as the solution. To address the high computational cost, CPU-GPU programming is applied. Numerical datasets are used to test the algorithm. The SLMSA has also been applied to a fluid fracturing dataset and reveals several advantages over the location-only method: (1) for shear sources, the location-only program can hardly locate them because of the canceling out of positively and negatively polarized traces, but the SLMSA method can successfully pick up those events; (2) microseismic locations alone may not be enough to indicate the directionality of micro-fractures. The statistics of

  20. Automatic detection and notification of "wrong patient-wrong location'' errors in the operating room.

    PubMed

    Sandberg, Warren S; Häkkinen, Matti; Egan, Marie; Curran, Paige K; Fairbrother, Pamela; Choquette, Ken; Daily, Bethany; Sarkka, Jukka-Pekka; Rattner, David

    2005-09-01

    When procedures and processes to assure patient location based on human performance do not work as expected, patients are brought incrementally closer to a possible "wrong patient-wrong procedure'' error. We developed a system for automated patient location monitoring and management. Real-time data from an active infrared/radio frequency identification tracking system provides patient location data that are robust and can be compared with an "expected process'' model to automatically flag wrong-location events as soon as they occur. The system also generates messages that are automatically sent to process managers via the hospital paging system, thus creating an active alerting function to annunciate errors. We deployed the system to detect and annunciate "patient-in-wrong-OR'' events. The system detected all "wrong-operating room (OR)'' events, and all "wrong-OR'' locations were correctly assigned within 0.50+/-0.28 minutes (mean+/-SD). This corresponded to the measured latency of the tracking system. All wrong-OR events were correctly annunciated via the paging function. This experiment demonstrates that current technology can automatically collect sufficient data to remotely monitor patient flow through a hospital, provide decision support based on predefined rules, and automatically notify stakeholders of errors.
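
    The decision-support core described here, comparing a real-time location feed against an "expected process" model and paging on disagreement, reduces to a simple rule check. A hypothetical sketch (the schedule structure, identifiers, and notify hook are invented, not the deployed hospital system):

```python
from dataclasses import dataclass

@dataclass
class ScheduledCase:
    patient_id: str
    expected_or: str          # operating room the patient is scheduled to enter

def check_location_event(patient_id, observed_room, schedule, notify):
    """Flag a wrong-OR event when the tracked room differs from the schedule."""
    case = schedule.get(patient_id)
    if case is None or not observed_room.startswith("OR"):
        return False          # not an OR entry, nothing to check
    if observed_room != case.expected_or:
        notify(f"WRONG-OR ALERT: patient {patient_id} entered {observed_room}, "
               f"expected {case.expected_or}")
        return True
    return False

# usage: check_location_event("p42", "OR-7", {"p42": ScheduledCase("p42", "OR-3")}, print)
```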

  1. A PC-based computer package for automatic detection and location of earthquakes: Application to a seismic network in eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano

    Few automated data acquisition and processing systems operate on mainframes, some run on UNIX-based workstations and others on personal computers, equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and a real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure MSA (multi-station-analysis) for signal detection, phase grouping and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetered analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded by the present network at Mt. Etna during the last eruption (July 2001). For the former data set, a comparison of the

  2. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

    The design and performance of an accurate and reliable prototype earth sensor head (ARPESH) are described. The ARPESH employs a detection logic 'locator' concept and horizon sensor mechanization which should lead to high accuracy horizon sensing that is minimally degraded by spatial or temporal variations in sensing attitude from a satellite in orbit around the earth at altitudes in the 500 km environs. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions. This corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; and then the performance of the sensor is reported under laboratory conditions, in which the sensor is installed in a simulator that permits it to scan over a blackbody source against a background representing the earth-space interface for various equivalent planet temperatures.

  3. Prosthetic Complications and Maintenance Requirements in Locator-attached Implant-Supported Overdentures: A Retrospective Study.

    PubMed

    Engelhardt, Frank; Zeman, Florian; Behr, Michael; Hahmel, Sebastian

    2016-03-01

    Retrospective data of 32 patients supplied with implant-supported, Locator-attached overdentures were screened for prosthetic complications and maintenance requirements, which were recorded and statistically analyzed. The mean observation time was 4.78 (± 1.72) years. Loss of retention was the most frequently observed event (n = 22). Damage and exchange of the insert holders (n = 4), loosening of Locator attachments (n = 2), and fracture of the insert holder (n = 2) were uncommon events; no loss of Locator attachments was observed. Loss of retention in Locator-attached overdentures is frequent; correlating patient-specific parameters with prosthetic complications is necessary to define recommendations for the use of Locator attachments.

  4. Grading dermatologic adverse events of cancer treatments: the Common Terminology Criteria for Adverse Events Version 4.0.

    PubMed

    Chen, Alice P; Setser, Ann; Anadkat, Milan J; Cotliar, Jonathan; Olsen, Elise A; Garden, Benjamin C; Lacouture, Mario E

    2012-11-01

    Dermatologic adverse events to cancer therapies have become more prevalent and may lead to dose modifications or discontinuation of life-saving or life-prolonging treatments. This has resulted in a new collaboration between oncologists and dermatologists, which requires accurate cataloging and grading of side effects. The Common Terminology Criteria for Adverse Events Version 4.0 is a descriptive terminology and grading system that can be used for uniform reporting of adverse events. A proper understanding of this standardized classification system is essential for dermatologists to properly communicate with all physicians caring for patients with cancer. Copyright © 2012 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.

  5. Muver, a computational framework for accurately calling accumulated mutations.

    PubMed

    Burkholder, Adam B; Lujan, Scott A; Lavender, Christopher A; Grimm, Sara A; Kunkel, Thomas A; Fargo, David C

    2018-05-09

    Identification of mutations from next-generation sequencing data typically requires a balance between sensitivity and accuracy. This is particularly true of DNA insertions and deletions (indels), that can impart significant phenotypic consequences on cells but are harder to call than substitution mutations from whole genome mutation accumulation experiments. To overcome these difficulties, we present muver, a computational framework that integrates established bioinformatics tools with novel analytical methods to generate mutation calls with the extremely low false positive rates and high sensitivity required for accurate mutation rate determination and comparison. Muver uses statistical comparison of ancestral and descendant allelic frequencies to identify variant loci and assigns genotypes with models that include per-sample assessments of sequencing errors by mutation type and repeat context. Muver identifies maximally parsimonious mutation pathways that connect these genotypes, differentiating potential allelic conversion events and delineating ambiguities in mutation location, type, and size. Benchmarking with a human gold standard father-son pair demonstrates muver's sensitivity and low false positive rates. In DNA mismatch repair (MMR) deficient Saccharomyces cerevisiae, muver detects multi-base deletions in homopolymers longer than the replicative polymerase footprint at rates greater than predicted for sequential single-base deletions, implying a novel multi-repeat-unit slippage mechanism. Benchmarking results demonstrate the high accuracy and sensitivity achieved with muver, particularly for indels, relative to available tools. Applied to an MMR-deficient Saccharomyces cerevisiae system, muver mutation calls facilitate mechanistic insights into DNA replication fidelity.

  6. A novel method to accurately locate and count large numbers of steps by photobleaching.

    PubMed

    Tsekouras, Konstantinos; Custer, Thomas C; Jashnsaz, Hossein; Walter, Nils G; Pressé, Steve

    2016-11-07

    Photobleaching event counting is a single-molecule fluorescence technique that is increasingly being used to determine the stoichiometry of protein and RNA complexes composed of many subunits in vivo as well as in vitro. By tagging protein or RNA subunits with fluorophores, activating them, and subsequently observing as the fluorophores photobleach, one obtains information on the number of subunits in a complex. The noise properties in a photobleaching time trace depend on the number of active fluorescent subunits. Thus, as fluorophores stochastically photobleach, noise properties of the time trace change stochastically, and these varying noise properties have created a challenge in identifying photobleaching steps in a time trace. Although photobleaching steps are often detected by eye, this method only works for high individual fluorophore emission signal-to-noise ratios and small numbers of fluorophores. With filtering methods or currently available algorithms, it is possible to reliably identify photobleaching steps for up to 20-30 fluorophores and signal-to-noise ratios down to ∼1. Here we present a new Bayesian method of counting steps in photobleaching time traces that takes into account stochastic noise variation in addition to complications such as overlapping photobleaching events that may arise from fluorophore interactions, as well as on-off blinking. Our method is capable of detecting ≥50 photobleaching steps even for signal-to-noise ratios as low as 0.1, can find up to ≥500 steps for more favorable noise profiles, and is computationally inexpensive. © 2016 Tsekouras et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  7. Skylab short-lived event alert program

    NASA Technical Reports Server (NTRS)

    Citron, R. A.

    1974-01-01

    During the three manned Skylab missions, the Center for Short-Lived Phenomena (CSLP) reported a total of 39 significant events to the Johnson Space Center (JSC) as part of the Skylab Short-Lived Event Alert Program. The telegraphed daily status reports included the names and locations of the events, the track number and revolution number during which the event could be observed, the time (GMT) to within plus or minus 2 sec when Skylab was closest to the event area, and the light condition (daylight or darkness) at that time and place. The messages sent to JSC during the Skylab 4 mission also included information pertaining to ground-truth studies and observations being conducted on the events. Photographic priorities were assigned for each event.

  8. Sensor Fusion to Infer Locations of Standing and Reaching Within the Home in Incomplete Spinal Cord Injury.

    PubMed

    Lonini, Luca; Reissman, Timothy; Ochoa, Jose M; Mummidisetty, Chaithanya K; Kording, Konrad; Jayaraman, Arun

    2017-10-01

    The objective of rehabilitation after spinal cord injury is to enable successful function in everyday life and independence at home. Clinical tests can assess whether patients are able to execute functional movements but are limited in assessing such information at home. A prototype system is developed that detects stand-to-reach activities, a movement with important functional implications, at multiple locations within a mock kitchen. Ten individuals with incomplete spinal cord injuries performed a sequence of standing and reaching tasks. The system monitored their movements by combining two sources of information: a triaxial accelerometer, placed on the subject's thigh, detected sitting or standing, and a network of radio frequency tags, wirelessly connected to a wrist-worn device, detected reaching at three locations. A threshold-based algorithm detected execution of the combined tasks and accuracy was measured by the number of correctly identified events. The system was shown to have an average accuracy of 98% for inferring when individuals performed stand-to-reach activities at each tag location within the same room. The combination of accelerometry and tags yielded accurate assessments of functional stand-to-reach activities within a home environment. Optimization of this technology could simplify patient compliance and allow clinicians to assess functional home activities.
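
    The fusion logic of this record, a thigh accelerometer classifying sit/stand plus a wrist-worn reader reporting which RFID tag is in range, can be sketched as a simple conjunction of the two streams; the posture threshold, dwell time, and record formats below are illustrative assumptions:

```python
import numpy as np

def detect_stand_to_reach(thigh_axis_g, tag_in_range, fs, min_stand_s=1.0):
    """Toy fusion: declare a stand-to-reach event at a tag location when the
    thigh is near-vertical (standing) for at least min_stand_s seconds while
    that tag is in range. thigh_axis_g: gravity component along the thigh's
    long axis (in g), sampled at fs Hz; tag_in_range: tag ID or None per sample."""
    standing = np.abs(thigh_axis_g) > 0.8          # thigh roughly vertical => standing
    events, run = [], 0
    for i, (stand, tag) in enumerate(zip(standing, tag_in_range)):
        run = run + 1 if stand else 0
        if run >= int(min_stand_s * fs) and tag is not None:
            # avoid re-reporting the same reach within one second at the same tag
            if not events or events[-1][1] != tag or i - events[-1][0] > fs:
                events.append((i, tag))
    return events                                   # list of (sample index, reach location)
```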

  9. Differentiating location- and distance-based processes in memory for time: an ERP study.

    PubMed

    Curran, Tim; Friedman, William J

    2003-09-01

    Memory for the time of events may benefit from reconstructive, location-based, and distance-based processes, but these processes are difficult to dissociate with behavioral methods. Neuropsychological research has emphasized the contribution of prefrontal brain mechanisms to memory for time but has not clearly differentiated location- from distance-based processing. The present experiment recorded event-related brain potentials (ERPs) while subjects completed two different temporal memory tests, designed to emphasize either location- or distance-based processing. The subjects' reports of location-based versus distance-based strategies and the reaction time pattern validated our experimental manipulation. Late (800-1,800 msec) frontal ERP effects were related to location-based processing. The results provide support for a two-process theory of memory for time and suggest that frontal memory mechanisms are specifically related to reconstructive, location-based processing.

  10. Location and Venue | The Metastatic Niche: Models, Mechanisms and Targeting Targets into Therapeutics

    Cancer.gov

    Location and Venue. EVENT CHANGE OF LOCATION: Building 10 (Clinical Center), Masur Auditorium. Helpful links to locate the Masur Auditorium on the NIH campus: https://www.ors.od.nih.gov/maps/Pages/NIH-Visitor-Map.aspx

  11. A passive RFID-based location system for personnel and asset monitoring.

    PubMed

    Hsiao, Rong-Shue; Kao, Chun-Hao; Chen, Tian-Xiang; Chen, Jui-Lun

    2018-01-01

    A typical radio frequency identification (RFID) access control system can be ineffective if an unauthorized person tailgates an authorized person through an access area. This work proposes a system using indoor locating and tracking techniques to address this problem, namely to prevent unauthorized Alzheimer's and dementia patients from getting lost, including by tailgating. To achieve accurate target location, a passive RFID deployment strategy is studied and a fingerprinting-based passive RFID localization algorithm is proposed. The proposed system was evaluated in a building environment to simulate the performance of access control. An RFID reader was installed on the ceiling near the access area and tags were stitched onto both shoulders of the experiment subject's garments. The probability of the error distance being within 0.3 m reached 97% in the warning area; the location precision reached 97% within 0.4 m in the monitoring area. The results showed that if an unauthorized person enters the restricted area, the system can initiate an alert signal accurately. Therefore, the proposed system is well suited for use in nursing homes or hospitals to prevent unauthorized personnel and assets from entering or exiting a confined location.
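
    A fingerprinting-based localization step of the general kind mentioned in this record compares a live RSSI vector against a surveyed database and averages the closest reference points. A generic k-nearest-neighbour sketch (the database layout, weighting, and k are assumptions; the paper's exact algorithm is not reproduced):

```python
import numpy as np

def knn_fingerprint_locate(rssi_vector, fingerprint_db, k=3):
    """fingerprint_db: list of (reference_position, reference_rssi_vector) pairs,
    positions in metres, RSSI vectors sharing the same reader/antenna ordering."""
    positions = np.array([p for p, _ in fingerprint_db], dtype=float)
    references = np.array([r for _, r in fingerprint_db], dtype=float)
    dists = np.linalg.norm(references - np.asarray(rssi_vector, dtype=float), axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-6)        # closer fingerprints count more
    return (positions[nearest] * weights[:, None]).sum(axis=0) / weights.sum()
```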

  12. Event Classification and Identification Based on the Characteristic Ellipsoid of Phasor Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Jian; Diao, Ruisheng; Makarov, Yuri V.

    2011-09-23

    In this paper, a method to classify and identify power system events based on the characteristic ellipsoid of phasor measurements is presented. The decision tree technique is used to perform the event classification and identification. Event types, event locations and clearance times are identified by decision trees based on the indices of the characteristic ellipsoid. A sufficiently large number of transient events were simulated on the New England 10-machine 39-bus system based on different system configurations. Transient simulations taking into account different event types, clearance times and various locations are conducted to simulate phasor measurements. Bus voltage magnitudes and recorded reactive and active power flows are used to build the characteristic ellipsoid. The volume, eccentricity, center and projection of the longest axis in the parameter space coordinates of the characteristic ellipsoids are used to classify and identify events. Results demonstrate that the characteristic ellipsoid and the decision tree are capable of detecting the event type, location, and clearance time with very high accuracy.
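
    The ellipsoid indices used as decision-tree features (volume, eccentricity, center, and longest-axis projection) can be computed from the second moments of a window of phasor-derived quantities. A minimal sketch under that covariance-based reading of the method, which is an assumption of this illustration:

```python
import numpy as np

def ellipsoid_indices(window):
    """window: (n_samples, n_channels) array of phasor-derived quantities
    (e.g. bus voltage magnitudes and active/reactive flows) over an event window."""
    center = window.mean(axis=0)
    cov = np.cov(window, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)               # eigenvalues in ascending order
    semi_axes = np.sqrt(np.maximum(eigvals, 0.0))
    volume = np.prod(semi_axes)                          # up to the usual constant factor
    # one common eccentricity definition: shortest versus longest semi-axis
    eccentricity = np.sqrt(1.0 - (semi_axes[0] / semi_axes[-1]) ** 2) if semi_axes[-1] > 0 else 0.0
    longest_axis = eigvecs[:, -1]                        # direction of the longest axis
    return center, volume, eccentricity, longest_axis
```

    Per-event feature vectors assembled this way would then be fed to a decision-tree classifier trained on the simulated event labels.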

  13. Method of fan sound mode structure determination computer program user's manual: Microphone location program

    NASA Technical Reports Server (NTRS)

    Pickett, G. F.; Wells, R. A.; Love, R. A.

    1977-01-01

    A computer user's manual describing the operation and the essential features of the microphone location program is presented. The Microphone Location Program determines microphone locations that ensure accurate and stable results from the equation system used to calculate modal structures. As part of the computational procedure for the Microphone Location Program, a first-order measure of the stability of the equation system was indicated by a matrix 'conditioning' number.
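
    The 'conditioning' number referred to here is the standard matrix condition number; microphone layouts that keep it small keep the modal inversion stable. A tiny illustration of screening candidate layouts (the mode-matrix construction is left as a placeholder and is not the program's formulation):

```python
import numpy as np

def layout_condition_number(mode_matrix):
    """Condition number of the system relating measured pressures to modal
    amplitudes; large values mean the microphone layout poorly separates modes."""
    return np.linalg.cond(mode_matrix)

# e.g. evaluate several candidate microphone layouts and keep the one whose
# mode matrix has the smallest condition number.
```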

  14. Nuclear Emulsion Analysis Methods of Locating Neutrino Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Carolyn Lee

    2006-12-01

    The Fermilab experiment 872 (DONUT) was the first to directly observe tau neutrinos in the charged current interaction ντ + N → τ + X. The observation was made using a hybrid emulsion-spectrometer detector to identify the signature kink or trident decay of the tau particle. Although nuclear emulsion has the benefit of sub-micron resolution, its use incorporates difficulties such as significant distortions and a high density of data resulting from its continuously active state. Finding events and achieving sub-micron resolution in emulsion requires a multi-pronged strategy of tracking and vertex location to deal with these inherent difficulties. By applying the methods developed in this thesis, event location efficiency can be improved from a value of 58% to 87%.

  15. Deep space target location with Hubble Space Telescope (HST) and Hipparcos data

    NASA Technical Reports Server (NTRS)

    Null, George W.

    1988-01-01

    Interplanetary spacecraft navigation requires accurate a priori knowledge of target positions. A concept is presented for attaining improved target ephemeris accuracy using two future Earth-orbiting optical observatories, the European Space Agency (ESA) Hipparcos observatory and the NASA Hubble Space Telescope (HST). Assuming nominal observatory performance, the Hipparcos data reduction will provide an accurate global star catalog, and HST will provide a capability for accurate angular measurements of stars and solar system bodies. The target location concept employs HST to observe solar system bodies relative to Hipparcos catalog stars and to determine the orientation (frame tie) of these stars to compact extragalactic radio sources. The target location process is described, the major error sources discussed, the potential target ephemeris error predicted, and mission applications identified. Preliminary results indicate that ephemeris accuracy comparable to the errors in individual Hipparcos catalog stars may be possible with a more extensive HST observing program. Possible future ground- and space-based replacements for Hipparcos and HST astrometric capabilities are also discussed.

  16. Teleconnection Locator: TeleLoc

    NASA Astrophysics Data System (ADS)

    Bowen, M. K.; Duffy, D.

    2016-12-01

    Extreme climate events, such as tropical storms, droughts, and floods, have an enormous impact on all aspects of society. Being able to detect the causes of such events on a global scale is paramount to being able to predict when and where these events will occur. These teleconnections, where a small change in a closed, complex system creates drastic disturbances elsewhere in the system, are generally represented by an index, one of the most famous being the El Nino Southern Oscillation (ENSO). However, due to the enormity, complexity, and technical challenges surrounding climate and its data, it is hypothesized that many of these teleconnections have as of yet gone undiscovered. TeleLoc (Teleconnection Locator) is a machine-learning framework combining a number of techniques for finding correlations between weather trends and extreme climate events. The current focus is on connecting global trends with tropical cyclones. A combination of two data sets, The International Best Track Archive for Climate Stewardship (IBTrACS) and the Modern-Era Retrospective analysis for Research and Applications (MERRA2), are being utilized. PostGIS is used for raw data storage, and a Python API has been developed as the core of the framework. Cyclones are first clustered using a combination of Symbolic Aggregate ApproXimation (this allows for a symbolic, sequential representation of the various time-series variables of interest) and DBSCAN. This serves to break the events into subcategories, which alleviates computational load for the next step. Events which are clustered together (those with similar characteristics) are compared against global climate variables of interest, which are also converted to a symbolic form, leading up to the event using Association Rule Mining. Results will be shown where cyclones have been clustered, specifically in the West Pacific storm basin, as well as the global variable symbolic subsections with a high support that have been singled out for

  17. Identification and analysis of long duration low frequency events from microseismic data

    NASA Astrophysics Data System (ADS)

    Hu, H.; Li, A.

    2016-12-01

    Long duration low frequency (LDLF) earthquakes, which are commonly present in volcanic fields and subduction zones, have been observed from microseismic data. In this research, we have identified and located several LDLF events from a microseismic dataset acquired by surface receivers in the Eagle Ford Shale. The LDLF events are clearly identified on frequency-time plots, with central frequencies of 5-25 Hz and durations from tens of seconds up to 100 seconds. We pick the arrival times of the events using the envelopes of the filtered data and apply a grid search method to find the source locations. These events are located at depths of around 1500 m, close to the horizontal treatment well for hydraulic fracturing. The associated phase arrivals show typical P-wave moveout trends. In addition, these events tend to migrate away from the horizontal well with time. Furthermore, these events are recorded only during the time when the rock is breaking according to the treating pressure records. Considering all these observations, we conclude that the observed LDLF events are caused by the pressure change related to fluid flow in fractures. The time-dependent source locations could have an important application in characterizing the fluid path inside fractures.
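
    The grid search described above can be sketched as follows; for simplicity the sketch assumes a homogeneous velocity model (the study itself would use a layered model) and removes the unknown origin time by demeaning the residuals.

        import numpy as np

        def grid_search_location(stations, picks, grid, velocity):
            # stations: (n_sta, 3) receiver coordinates in metres
            # picks:    (n_sta,)  arrival times picked from the signal envelopes, s
            # grid:     (n_grid, 3) candidate source positions
            # velocity: assumed homogeneous propagation speed, m/s
            best, best_misfit = None, np.inf
            for src in grid:
                tt = np.linalg.norm(stations - src, axis=1) / velocity
                resid = (picks - tt) - np.mean(picks - tt)   # unknown origin time removed
                misfit = float(np.sum(resid ** 2))
                if misfit < best_misfit:
                    best, best_misfit = src, misfit
            return best, best_misfit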

  18. Determining SAFOD area microearthquake locations solely with the Pilot Hole seismic array data

    NASA Astrophysics Data System (ADS)

    Oye, Volker; Chavarria, J. Andres; Malin, Peter E.

    2004-05-01

    In August 2002, an array of 32 three-component geophones was installed in the San Andreas Fault Observatory at Depth (SAFOD) Pilot Hole (PH) at Parkfield, CA. As an independent test of surface-observation-based microearthquake locations, we have located such events using only data recorded on the PH array. We then compared these locations with locations from a combined set of PH and Parkfield High Resolution Seismic Network (HRSN) observations. We determined the uncertainties in the locations as they relate to errors in the travel time picks and the velocity model by the bootstrap method. Based on the PH and combined locations, we find that the "C2" cluster to the northeast of the PH has the smallest location uncertainties. Events in this cluster also have the most similar waveforms and largest magnitudes. This confirms earlier suggestions that the C2 cluster is a promising target for the SAFOD Main Hole.
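
    One common way to implement the bootstrap step mentioned above is to perturb the travel-time picks within an assumed uncertainty, relocate repeatedly, and take the scatter of the solutions as the location error. The sketch below does exactly that; it omits the velocity-model component of the uncertainty that the study also considers, and the pick uncertainty is an assumed parameter.

        import numpy as np

        def bootstrap_location_uncertainty(locate, stations, picks, pick_sigma=0.01,
                                           n_boot=200, seed=0):
            # locate: user-supplied function (stations, picks) -> hypocenter estimate
            # pick_sigma: assumed standard deviation of the travel-time picks, s
            rng = np.random.default_rng(seed)
            solutions = []
            for _ in range(n_boot):
                perturbed = picks + rng.normal(0.0, pick_sigma, size=len(picks))
                solutions.append(locate(stations, perturbed))
            solutions = np.asarray(solutions)
            # mean relocation and its standard deviation along each coordinate
            return solutions.mean(axis=0), solutions.std(axis=0)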

  19. TE-Tracker: systematic identification of transposition events through whole-genome resequencing.

    PubMed

    Gilly, Arthur; Etcheverry, Mathilde; Madoui, Mohammed-Amin; Guy, Julie; Quadrana, Leandro; Alberti, Adriana; Martin, Antoine; Heitkam, Tony; Engelen, Stefan; Labadie, Karine; Le Pen, Jeremie; Wincker, Patrick; Colot, Vincent; Aury, Jean-Marc

    2014-11-19

    Transposable elements (TEs) are DNA sequences that are able to move from their location in the genome by cutting or copying themselves to another locus. As such, they are increasingly recognized as impacting all aspects of genome function. With the dramatic reduction in cost of DNA sequencing, it is now possible to resequence whole genomes in order to systematically characterize novel TE mobilization in a particular individual. However, this task is made difficult by the inherently repetitive nature of TE sequences, which in some eukaryotes compose over half of the genome sequence. Currently, only a few software tools dedicated to the detection of TE mobilization using next-generation-sequencing are described in the literature. They often target specific TEs for which annotation is available, and are only able to identify families of closely related TEs, rather than individual elements. We present TE-Tracker, a general and accurate computational method for the de-novo detection of germ line TE mobilization from re-sequenced genomes, as well as the identification of both their source and destination sequences. We compare our method with the two classes of existing software: specialized TE-detection tools and generic structural variant (SV) detection tools. We show that TE-Tracker, while working independently of any prior annotation, bridges the gap between these two approaches in terms of detection power. Indeed, its positive predictive value (PPV) is comparable to that of dedicated TE software while its sensitivity is typical of a generic SV detection tool. TE-Tracker demonstrates the benefit of adopting an annotation-independent, de novo approach for the detection of TE mobilization events. We use TE-Tracker to provide a comprehensive view of transposition events induced by loss of DNA methylation in Arabidopsis. TE-Tracker is freely available at http://www.genoscope.cns.fr/TE-Tracker . We show that TE-Tracker accurately detects both the source and destination of

  20. Distinct regions of the hippocampus are associated with memory for different spatial locations.

    PubMed

    Jeye, Brittany M; MacEvoy, Sean P; Karanian, Jessica M; Slotnick, Scott D

    2018-05-15

    In the present functional magnetic resonance imaging (fMRI) study, we aimed to evaluate whether distinct regions of the hippocampus were associated with spatial memory for items presented in different locations of the visual field. In Experiment 1, during the study phase, participants viewed abstract shapes in the left or right visual field while maintaining central fixation. At test, old shapes were presented at fixation and participants classified each shape as previously in the "left" or "right" visual field followed by an "unsure"-"sure"-"very sure" confidence rating. Accurate spatial memory for shapes in the left visual field was isolated by contrasting accurate versus inaccurate spatial location responses. This contrast produced one hippocampal activation in which the interaction between item type and accuracy was significant. The analogous contrast for right visual field shapes did not produce activity in the hippocampus; however, the contrast of high confidence versus low confidence right-hits produced one hippocampal activation in which the interaction between item type and confidence was significant. In Experiment 2, the same paradigm was used but shapes were presented in each quadrant of the visual field during the study phase. Accurate memory for shapes in each quadrant, exclusively masked by accurate memory for shapes in the other quadrants, produced a distinct activation in the hippocampus. A multi-voxel pattern analysis (MVPA) of hippocampal activity revealed a significant correlation between behavioral spatial location accuracy and hippocampal MVPA accuracy across participants. The findings of both experiments indicate that distinct hippocampal regions are associated with memory for different visual field locations. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Precisely locating the Klamath Falls, Oregon, earthquakes

    USGS Publications Warehouse

    Qamar, A.; Meagher, K.L.

    1993-01-01

    In this article we present preliminary results of a close-in, instrumental study of the Klamath Falls earthquake sequence, carried out as a cooperative effort by scientists from the U.S. Geological Survey (USGS) and universities in Washington, Oregon, and California. In addition to obtaining much more accurate earthquake locations, this study has improved our understanding of the relationship between seismicity and mapped faults in the region.

  2. Calculating the Motion and Direction of Flux Transfer Events with Cluster

    NASA Technical Reports Server (NTRS)

    Collado-Vega, Yaireska M.; Sibeck, David Gary

    2011-01-01

    We use multi-point timing analysis to determine the orientation and motion of flux transfer events (FTEs) detected by the four Cluster spacecraft on the high-latitude dayside and flank magnetopause during 2002 and 2003. During these years, the distances between the Cluster spacecraft were greater than 1000 km, providing the tetrahedral configuration needed to select events and determine velocities. Each velocity and location will be examined in detail and compared to the velocities and locations determined by the predictions of the component and antiparallel reconnection models for event formation, orientation, motion, and acceleration for a wide range of spacecraft locations and solar wind conditions.

  3. Associative Symmetry versus Independent Associations in the Memory for Object-Location Associations

    ERIC Educational Resources Information Center

    Sommer, Tobias; Rose, Michael; Buchel, Christian

    2007-01-01

    The formation of associations between objects and locations is a vital aspect of episodic memory. More specifically, remembering the location where one experienced an object and, vice versa, the object one encountered at a specific location are both important elements for the memory of an event. Whether episodic associations are holistic…

  4. Infrasound ray tracing models for real events

    NASA Astrophysics Data System (ADS)

    Averbuch, Gil; Applbaum, David; Price, Colin; Ben Horin, Yochai

    2015-04-01

    Ray tracing models for infrasound propagation require two atmospheric parameters: the speed of sound profile and the wind profile. The use of global atmospheric models for the speed of sound and wind profiles raises a fundamental question: can these models provide accurate results when modeling real events that have been detected by infrasound arrays? Moreover, can they provide accurate results for events that occurred during extreme weather conditions? We use 2D and 3D ray tracing models based on a modified Hamiltonian for a moving medium. Radiosonde measurements enable us to update the first 20 km of both the speed of sound and wind profiles. The 2009 and 2011 Sayarim calibration experiments in Israel served as a test for the models. To address the accuracy of the model during extreme weather conditions, we simulate infrasound sprite signals that were detected by the infrasound array at Mt. Meron, Israel. The results from modeling the Sayarim experiments provided sufficient insight to conclude that ray tracing can provide accurate results for real events that occurred during fair weather conditions. We conclude that the time delay in the model of the 2009 experiment is due to a lack of accuracy in the wind and speed of sound profiles; perturbed profiles provide accurate results. Earlier arrivals in 2011 are a result of the assumption that the Earth is flat (no topography) and of using local radiosonde measurements for the entire model. Using local radiosonde measurements only for part of the model and neglecting them in other parts prevents the early arrivals. We were able to determine which sprite was detected by the infrasound array, as well as to provide a height range for the sprite

  5. Alignment of leading-edge and peak-picking time of arrival methods to obtain accurate source locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roussel-Dupre, R.; Symbalisty, E.; Fox, C.

    2009-08-01

    The location of a radiating source can be determined by time-tagging the arrival of the radiated signal at a network of spatially distributed sensors. The accuracy of this approach depends strongly on the particular time-tagging algorithm employed at each of the sensors. If different techniques are used across the network, then the time tags must be referenced to a common fiducial for maximum location accuracy. In this report we derive the time corrections needed to temporally align leading-edge, time-tagging techniques with peak-picking algorithms. We focus on broadband radio frequency (RF) sources, an ionospheric propagation channel, and narrowband receivers, but the final results can be generalized to apply to any source, propagation environment, and sensor. Our analytic results are checked against numerical simulations for a number of representative cases and agree with the specific leading-edge algorithm studied independently by Kim and Eng (1995) and Pongratz (2005 and 2007).
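
    As a generic illustration of the two time-tagging families being aligned (not the report's derivation), the snippet below measures the offset between a leading-edge pick and a peak pick on a synthetic narrowband envelope; an offset of this kind is the correction that must be applied before mixing the two techniques in one network.

        import numpy as np

        def toa_leading_edge(sig, t, frac=0.1):
            # first sample exceeding a fixed fraction of the peak amplitude
            return t[np.argmax(sig >= frac * sig.max())]

        def toa_peak(sig, t):
            # time of the maximum sample
            return t[np.argmax(sig)]

        t = np.linspace(0.0, 1.0, 2001)
        pulse = np.exp(-0.5 * ((t - 0.4) / 0.05) ** 2)   # synthetic received envelope
        correction = toa_peak(pulse, t) - toa_leading_edge(pulse, t)
        # 'correction' would be added to leading-edge tags to refer them to the peak fiducial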

  6. Applying time-reverse-imaging techniques to locate individual low-frequency earthquakes on the San Andreas fault near Cholame, California

    NASA Astrophysics Data System (ADS)

    Horstmann, T.; Harrington, R. M.; Cochran, E.; Shelly, D. R.

    2013-12-01

    Observations of non-volcanic tremor have become ubiquitous in recent years. In spite of the abundance of observations, locating tremor remains a difficult task because of the lack of distinctive phase arrivals. Here we use time-reverse-imaging techniques that do not require identifying phase arrivals to locate individual low-frequency earthquakes (LFEs) within tremor episodes on the San Andreas fault near Cholame, California. Time windows of 1.5-second duration containing LFEs are selected from continuously recorded waveforms of the local seismic network filtered between 1-5 Hz. We propagate the time-reversed seismic signal back through the subsurface using a staggered-grid finite-difference code. Assuming all rebroadcasted waveforms result from similar wave fields at the source origin, we search for wave field coherence in time and space to obtain the source location and origin time where the constructive interference is a maximum. We use an interpolated velocity model with a grid spacing of 100 m and a 5 ms time step to calculate the relative curl field energy amplitudes for each rebroadcasted seismogram every 50 ms for each grid point in the model. Finally, we perform a grid search for coherency in the curl field using a sliding time window, taking the absolute value of the correlation coefficient to account for differences in radiation pattern. The highest median cross-correlation coefficient value at a given grid point indicates the source location for the rebroadcasted event. Horizontal location errors based on the spatial extent of the highest 10% of cross-correlation coefficients are on the order of 4 km, and vertical errors on the order of 3 km. Furthermore, a test of the method using earthquake data shows that the method produces a hypocentral location identical (within errors) to that obtained by standard ray-tracing methods. We also compare the event locations to a LFE catalog that locates the LFEs from stacked waveforms of repeated LFEs
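
    A heavily simplified stand-in for the procedure above, a delay-and-correlate coherence scan on waveform windows rather than full finite-difference time reversal of the curl field, is sketched below. It keeps the abstract's idea of using the absolute correlation coefficient to ignore radiation-pattern polarity, and assumes a homogeneous velocity purely for illustration.

        import numpy as np

        def coherence_scan(waveforms, stations, grid, velocity, dt, win=300, step=10):
            # waveforms: (n_sta, n_samp) filtered traces; stations: (n_sta, 3) in metres
            # grid: (n_grid, 3) candidate sources; velocity: assumed homogeneous, m/s
            n_sta, n_samp = waveforms.shape
            best = (None, None, -np.inf)          # (source, origin sample, coherence)
            for src in grid:
                delays = np.rint(np.linalg.norm(stations - src, axis=1)
                                 / velocity / dt).astype(int)
                for t0 in range(0, n_samp - delays.max() - win, step):
                    segs = np.array([waveforms[i, t0 + delays[i]: t0 + delays[i] + win]
                                     for i in range(n_sta)])
                    cc = np.corrcoef(segs)
                    coh = np.median(np.abs(cc[np.triu_indices(n_sta, k=1)]))
                    if coh > best[2]:
                        best = (src, t0, coh)
            return best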

  7. Seismic Source Locations and Parameters for Sparse Networks by Matching Observed Seismograms to Semi-Empirical Synthetic Seismograms

    NASA Astrophysics Data System (ADS)

    Marshall, M. E.; Salzberg, D. H.

    2006-05-01

    The purpose of this study is to further demonstrate the accuracy of a full-waveform earthquake location method using semi-empirical synthetic waveforms and data received from two or more regional stations. To test the method, well-constrained events from southern and central California are being used as a testbed. A suite of regional California events is being processed. Our focus is on aftershocks of the Parkfield event, the Hector Mine event, and the San Simeon event. In all three cases, the aftershock locations are known to within 1 km. For Parkfield, with its extremely dense local network, the events are located to within 300 m or better. We are processing the data using a grid spacing of 0.5 km in three dimensions. Often, the minimum in residual from the semi-empirical waveform matching is within one grid point of the 'ground truth' location, which is as good as can be expected. We will present the results and compare them to the event locations reported in catalogs using the dense local seismic networks present in California. The preliminary results indicate that matched-waveform locations are able to resolve the locations with accuracies better than GT5, and possibly approaching GT1. These results require only two stations at regional distances and differing azimuths. One of the disadvantages of the California testbed is that all of the earthquakes in a particular region typically have very similar focal mechanisms. In theory, the semi-empirical approach should allow us to generate well-matched synthetic waveforms regardless of the varying mechanisms. To verify this aspect, we apply the technique to relocate and simulate the JUNCTION nuclear test (March 26, 1992) using waveforms from the Little Skull Mountain mainshock.

  8. Knowing what, where, and when: event comprehension in language processing.

    PubMed

    Kukona, Anuenue; Altmann, Gerry T M; Kamide, Yuki

    2014-10-01

    We investigated the retrieval of location information, and the deployment of attention to these locations, following (described) event-related location changes. In two visual world experiments, listeners viewed arrays with containers like a bowl, jar, pan, and jug, while hearing sentences like "The boy will pour the sweetcorn from the bowl into the jar, and he will pour the gravy from the pan into the jug. And then, he will taste the sweetcorn". At the discourse-final "sweetcorn", listeners fixated context-relevant "Target" containers most (jar). Crucially, we also observed two forms of competition: listeners fixated containers that were not directly referred to but associated with "sweetcorn" (bowl), and containers that played the same role as Targets (goals of moving events; jug), more than distractors (pan). These results suggest that event-related location changes are encoded across representations that compete for comprehenders' attention, such that listeners retrieve, and fixate, locations that are not referred to in the unfolding language, but related to them via object or role information. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Carbene footprinting accurately maps binding sites in protein-ligand and protein-protein interactions

    NASA Astrophysics Data System (ADS)

    Manzi, Lucio; Barrow, Andrew S.; Scott, Daniel; Layfield, Robert; Wright, Timothy G.; Moses, John E.; Oldham, Neil J.

    2016-11-01

    Specific interactions between proteins and their binding partners are fundamental to life processes. The ability to detect protein complexes, and map their sites of binding, is crucial to understanding basic biology at the molecular level. Methods that employ sensitive analytical techniques such as mass spectrometry have the potential to provide valuable insights with very little material and on short time scales. Here we present a differential protein footprinting technique employing an efficient photo-activated probe for use with mass spectrometry. Using this methodology the location of a carbohydrate substrate was accurately mapped to the binding cleft of lysozyme, and in a more complex example, the interactions between a 100 kDa, multi-domain deubiquitinating enzyme, USP5 and a diubiquitin substrate were located to different functional domains. The much improved properties of this probe make carbene footprinting a viable method for rapid and accurate identification of protein binding sites utilizing benign, near-UV photoactivation.

  10. Funnel metadynamics as accurate binding free-energy method

    PubMed Central

    Limongelli, Vittorio; Bonomi, Massimiliano; Parrinello, Michele

    2013-01-01

    A detailed description of the events ruling ligand/protein interaction and an accurate estimation of the drug affinity to its target is of great help in speeding drug discovery strategies. We have developed a metadynamics-based approach, named funnel metadynamics, that allows the ligand to enhance the sampling of the target binding sites and its solvated states. This method leads to an efficient characterization of the binding free-energy surface and an accurate calculation of the absolute protein–ligand binding free energy. We illustrate our protocol in two systems, benzamidine/trypsin and SC-558/cyclooxygenase 2. In both cases, the X-ray conformation has been found as the lowest free-energy pose, and the computed protein–ligand binding free energy in good agreement with experiments. Furthermore, funnel metadynamics unveils important information about the binding process, such as the presence of alternative binding modes and the role of waters. The results achieved at an affordable computational cost make funnel metadynamics a valuable method for drug discovery and for dealing with a variety of problems in chemistry, physics, and material science. PMID:23553839

  11. Earthquakes triggered by silent slip events on Kīlauea volcano, Hawaii

    USGS Publications Warehouse

    Segall, Paul; Desmarais, Emily K.; Shelly, David; Miklius, Asta; Cervelli, Peter F.

    2006-01-01

    Slow-slip events, or ‘silent earthquakes’, have recently been discovered in a number of subduction zones including the Nankai trough1, 2, 3 in Japan, Cascadia4, 5, and Guerrero6 in Mexico, but the depths of these events have been difficult to determine from surface deformation measurements. Although it is assumed that these silent earthquakes are located along the plate megathrust, this has not been proved. Slow slip in some subduction zones is associated with non-volcanic tremor7, 8, but tremor is difficult to locate and may be distributed over a broad depth range9. Except for some events on the San Andreas fault10, slow-slip events have not yet been associated with high-frequency earthquakes, which are easily located. Here we report on swarms of high-frequency earthquakes that accompany otherwise silent slips on Kīlauea volcano, Hawaii. For the most energetic event, in January 2005, the slow slip began before the increase in seismicity. The temporal evolution of earthquakes is well explained by increased stressing caused by slow slip, implying that the earthquakes are triggered. The earthquakes, located at depths of 7–8 km, constrain the slow slip to be at comparable depths, because they must fall in zones of positive Coulomb stress change. Triggered earthquakes accompanying slow-slip events elsewhere might go undetected if background seismicity rates are low. Detection of such events would help constrain the depth of slow slip, and could lead to a method for quantifying the increased hazard during slow-slip events, because triggered events have the potential to grow into destructive earthquakes.

  12. Low-cost asset tracking using location-aware camera phones

    NASA Astrophysics Data System (ADS)

    Chen, David; Tsai, Sam; Kim, Kyu-Han; Hsu, Cheng-Hsin; Singh, Jatinder Pal; Girod, Bernd

    2010-08-01

    Maintaining an accurate and up-to-date inventory of one's assets is a labor-intensive, tedious, and costly operation. To ease this difficult but important task, we design and implement a mobile asset tracking system for automatically generating an inventory by snapping photos of the assets with a smartphone. Since smartphones are becoming ubiquitous, construction and deployment of our inventory management solution is simple and cost-effective. Automatic asset recognition is achieved by first segmenting individual assets out of the query photo and then performing bag-of-visual-features (BoVF) image matching on the segmented regions. The smartphone's sensor readings, such as digital compass and accelerometer measurements, can be used to determine the location of each asset, and this location information is stored in the inventory for each recognized asset. As a special case study, we demonstrate a mobile book tracking system, where users snap photos of books stacked on bookshelves to generate a location-aware book inventory. It is shown that segmenting the book spines is very important for accurate feature-based image matching into a database of book spines. Segmentation also provides the exact orientation of each book spine, so more discriminative upright local features can be employed for improved recognition. This system's mobile client has been implemented for smartphones running the Symbian or Android operating systems. The client enables a user to snap a picture of a bookshelf and to subsequently view the recognized spines in the smartphone's viewfinder. Two different pose estimates, one from BoVF geometric matching and the other from segmentation boundaries, are both utilized to accurately draw the boundary of each spine in the viewfinder for easy visualization. The BoVF representation also allows matching each photo of a bookshelf rack against a photo of the entire bookshelf, and the resulting feature matches are used in conjunction with the smartphone

  13. Event Reconstruction Techniques in NOvA

    NASA Astrophysics Data System (ADS)

    Baird, M.; Bian, J.; Messier, M.; Niner, E.; Rocco, D.; Sachdev, K.

    2015-12-01

    The NOvA experiment is a long-baseline neutrino oscillation experiment utilizing the NuMI beam generated at Fermilab. The experiment will measure the oscillations within a muon neutrino beam in a 300 ton Near Detector located underground at Fermilab and a functionally-identical 14 kiloton Far Detector placed 810 km away. The detectors are liquid scintillator tracking calorimeters with a fine-grained cellular structure that provides a wealth of information for separating the different particle track and shower topologies. Each detector has its own challenges with the Near Detector seeing multiple overlapping neutrino interactions in each event and the Far Detector having a large background of cosmic rays due to being located on the surface. A series of pattern recognition techniques have been developed to go from event records, to spatially and temporally separating individual interactions, to vertexing and tracking, and particle identification. This combination of methods to achieve the full event reconstruction will be discussed.

  14. Improved integrated sniper location system

    NASA Astrophysics Data System (ADS)

    Figler, Burton D.; Spera, Timothy J.

    1999-01-01

    In July of 1995, Lockheed Martin IR Imaging Systems of Lexington, Massachusetts, began the development of an integrated sniper location system (I-SLS) for the Defense Advanced Research Projects Agency and for the Department of the Navy's Naval Command Control & Ocean Surveillance Center, RDTE Division, in San Diego, California. The I-SLS integrates acoustic and uncooled infrared sensing technologies to provide an affordable and highly effective sniper detection and location capability. This system, its performance, and results from field tests at Camp Pendleton, California, in October 1996 were described in a paper presented at the November 1996 SPIE Photonics East Symposium on Enabling Technologies for Law Enforcement and Security. The I-SLS combines an acoustic warning system with an uncooled infrared warning system. The acoustic warning system was developed by SenTech, Inc., of Lexington, Massachusetts. This acoustic warning system provides sniper detection and coarse location information based upon the muzzle blast of the sniper's weapon and/or upon the shock wave produced by the sniper's bullet, if the bullet is supersonic. The uncooled infrared warning system provides sniper detection and fine location information based upon the weapon's muzzle flash. In addition, the uncooled infrared warning system can provide thermal imagery that can be used to accurately locate and identify the sniper. Combining these two technologies improves detection probability, reduces false alarm rate, and increases utility. In the two years since the last report on the integrated sniper location system, improvements have been made and a second field demonstration was planned. In this paper, we describe the integrated sniper location system modifications in preparation for the new field demonstration. In addition, fundamental improvements in the uncooled infrared sensor technology continue to be made. These improvements include higher sensitivity (lower minimum resolvable temperature

  15. Explosion Source Location Study Using Collocated Acoustic and Seismic Networks in Israel

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Gitterman, Y.; Arrowsmith, S.; Ben-Horin, Y.

    2013-12-01

    We explore a joint analysis of seismic and infrasonic signals to improve automatic monitoring of small local/regional events, such as construction and quarry blasts, military chemical explosions, sonic booms, etc., using collocated seismic and infrasonic networks recently built in Israel (ISIN) in the frame of a project sponsored by the Bi-national USA-Israel Science Foundation (BSF). The general target is to create an automatic system which will provide detection, location and identification of explosions in real time or close to real time. At the moment the network comprises 15 stations hosting a microphone and a seismometer (or accelerometer), operated by the Geophysical Institute of Israel (GII), plus two infrasonic arrays operated by the National Data Center, Soreq: IOB in the south (Negev desert) and IMA in the north of Israel (Upper Galilee), collocated with the IMS seismic array MMAI. The study utilizes a ground-truth database of numerous Rotem phosphate quarry blasts, a number of controlled explosions for demolition of outdated ammunition, and experimental surface explosions for structure protection research at the Sayarim Military Range. A special event, comprising four military explosions in a neighboring country that provided both strong seismic (up to 400 km) and infrasound waves (up to 300 km), is also analyzed. For all of these events the ground-truth coordinates and/or the results of seismic location by the Israel Seismic Network (ISN) have been provided. For automatic event detection and phase picking we tested a new recursive picker based on a statistically optimal detector. The results were compared to the manual picks. Several location techniques have been tested using the ground-truth event recordings, and the preliminary results obtained have been compared to the ground-truth locations: 1) a number of events have been located as the intersection of azimuths estimated using the wide-band F-K analysis technique applied to the

  16. Identification of 4th intercostal space using sternal notch to xiphoid length for accurate electrocardiogram lead placement.

    PubMed

    Day, Kevin; Oliva, Isabel; Krupinski, Elizabeth; Marcus, Frank

    2015-01-01

    Precordial ECG lead placement is difficult in obese patients with increased chest wall soft tissues due to inaccurate palpation of the intercostal spaces. We investigated whether the length of the sternum (distance between the sternal notch and xiphoid process) can accurately predict the location of the 4th intercostal space, which is the traditional location for V1 lead position. Fifty-five consecutive adult chest computed tomography examinations were reviewed for measurements. The sternal notch to right 4th intercostal space distance was 67% of the sternal notch to xiphoid process length with an overall correlation of r=0.600 (p<0.001). The above measurement may be utilized to locate the 4th intercostal space for accurate placement of the precordial electrodes in adults in whom the 4th intercostal space cannot be found by physical exam. Copyright © 2015 Elsevier Inc. All rights reserved.
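
    The reported ratio translates into a one-line bedside estimate; for example, assuming a measured sternal notch to xiphoid length of 18 cm, the right 4th intercostal space would be expected roughly 0.67 × 18 ≈ 12 cm below the sternal notch.

        def fourth_intercostal_distance_cm(sternal_notch_to_xiphoid_cm):
            # distance from the sternal notch down to the right 4th intercostal
            # space, using the 67% ratio reported above
            return 0.67 * sternal_notch_to_xiphoid_cm

        print(fourth_intercostal_distance_cm(18.0))   # ~12.1 cm for an 18 cm sternum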

  17. hypoDD-A Program to Compute Double-Difference Hypocenter Locations

    USGS Publications Warehouse

    Waldhauser, Felix

    2001-01-01

    HypoDD is a Fortran computer program package for relocating earthquakes with the double-difference algorithm of Waldhauser and Ellsworth (2000). This document provides a brief introduction into how to run and use the programs ph2dt and hypoDD to compute double-difference (DD) hypocenter locations. It gives a short overview of the DD technique, discusses the data preprocessing using ph2dt, and leads through the earthquake relocation process using hypoDD. The appendices include the reference manuals for the two programs and a short description of auxiliary programs and example data. Some minor subroutines are presently in the C language, and future releases will be in C. Earthquake location algorithms are usually based on some form of Geiger's method, the linearization of the travel time equation in a first-order Taylor series that relates the difference between the observed and predicted travel time to unknown adjustments in the hypocentral coordinates through the partial derivatives of travel time with respect to the unknowns. Earthquakes can be located individually with this algorithm, or jointly when other unknowns link together the solutions to individual earthquakes, such as station corrections in the joint hypocenter determination (JHD) method, or the earth model in seismic tomography. The DD technique (described in detail in Waldhauser and Ellsworth, 2000) takes advantage of the fact that if the hypocentral separation between two earthquakes is small compared to the event-station distance and the scale length of velocity heterogeneity, then the ray paths between the source region and a common station are similar along almost the entire ray path (Fréchet, 1985; Got et al., 1994). In this case, the difference in travel times for two events observed at one station can be attributed to the spatial offset between the events with high accuracy. DD equations are built by differencing Geiger's equation for earthquake location. In this way, the residual between
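
    The central relation of the technique, written here in the notation of Waldhauser and Ellsworth (2000), differences Geiger's linearized equation for a pair of events i and j observed at a common station k:

        dr_k^{ij} \;=\; \left(t_k^{i}-t_k^{j}\right)^{\mathrm{obs}} - \left(t_k^{i}-t_k^{j}\right)^{\mathrm{cal}}
                   \;=\; \frac{\partial t_k^{i}}{\partial \mathbf{m}^{i}}\,\Delta\mathbf{m}^{i}
                       - \frac{\partial t_k^{j}}{\partial \mathbf{m}^{j}}\,\Delta\mathbf{m}^{j},
        \qquad \Delta\mathbf{m} = (\Delta x, \Delta y, \Delta z, \Delta\tau),

    so the double-difference residuals constrain the relative offsets between nearby events while common path effects largely cancel.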

  18. Locating Local Earthquakes Using Single 3-Component Broadband Seismological Data

    NASA Astrophysics Data System (ADS)

    Das, S. B.; Mitra, S.

    2015-12-01

    We devised a technique to locate local earthquakes using a single 3-component broadband seismograph and analyze the factors governing the accuracy of our results. The need for devising such a technique arises in regions with a sparse seismic network. In state-of-the-art location algorithms, a minimum of three station recordings is required to obtain well resolved locations. However, the problem arises when an event is recorded by fewer than three stations. This may happen for the following reasons: (a) down time of stations in a sparse network; (b) geographically isolated regions with limited logistic support to set up a large network; (c) regions with insufficient economy for financing a multi-station network; and (d) poor signal-to-noise ratio for smaller events at most stations, except the one in its closest vicinity. Our technique provides a workable solution to the above problematic scenarios. However, our methodology is strongly dependent on the velocity model of the region. Our method uses a three-step processing: (a) ascertain the back-azimuth of the event from the P-wave particle motion recorded on the horizontal components; (b) estimate the hypocentral distance using the S-P time; and (c) ascertain the emergent angle from the vertical and radial components. Once this is obtained, one can ray-trace through the 1-D velocity model to estimate the hypocentral location. We test our method on synthetic data, which produces results with 99% precision. With observed data, our results are very encouraging. The precision of our results depends on the signal-to-noise ratio (SNR) and the choice of the right band-pass filter to isolate the P-wave signal. We used our method on minor aftershocks (3 < mb < 4) of the 2011 Sikkim earthquake using data from the Sikkim Himalayan network. Locations of these events highlight the transverse strike-slip structure within the Indian plate, which was observed from source mechanism studies of the mainshock and larger aftershocks.
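
    A minimal sketch of the three processing steps, assuming a homogeneous half-space with straight rays instead of the 1-D ray tracing used in the study and with the velocities taken as illustrative values, could look like this:

        import numpy as np

        def single_station_location(north, east, vertical, t_p, t_s, vp=6.0, vs=3.5):
            # north/east/vertical: short P-wave windows of the three components
            # t_p, t_s: picked P and S arrival times (s); vp, vs in km/s (assumed)
            # (a) back-azimuth from P particle motion; correlating the horizontals
            #     with the vertical resolves the 180-degree ambiguity
            baz = np.degrees(np.arctan2(np.sum(east * vertical),
                                        np.sum(north * vertical))) % 360.0
            # (b) hypocentral distance from the S-P time
            r = (t_s - t_p) * vp * vs / (vp - vs)
            # (c) apparent incidence angle from the radial/vertical amplitude ratio
            radial = north * np.cos(np.radians(baz)) + east * np.sin(np.radians(baz))
            inc = np.degrees(np.arctan2(np.linalg.norm(radial), np.linalg.norm(vertical)))
            depth = r * np.cos(np.radians(inc))          # straight-ray approximation
            epicentral = r * np.sin(np.radians(inc))
            return baz, r, inc, depth, epicentral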

  19. Discovery and Characterization of a Caustic Crossing Microlensing Event in the Small Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.; King, L. J.; Lehner, M. J.; Marshall, S. L.; Minniti, D.; Peterson, B. A.; Pratt, M. R.; Quinn, P. J.; Rhie, S. H.; Rodgers, A. W.; Stetson, P. B.; Stubbs, C. W.; Sutherland, W.; Tomaney, A.; Vandehei, T.

    1999-06-01

    We present photometric observations and analysis of the second microlensing event detected toward the Small Magellanic Cloud (SMC), MACHO Alert 98-SMC-1. This event was detected early enough to allow intensive observation of the light curve. These observations revealed 98-SMC-1 to be the first caustic crossing binary microlensing event toward the Magellanic Clouds to be discovered in progress. Frequent coverage of the evolving light curve allowed an accurate prediction for the date of the source crossing out of the lens caustic structure. The caustic crossing temporal width, along with the angular size of the source star, measures the proper motion of the lens with respect to the source and thus allows an estimate of the location of the lens. Lenses located in the Galactic halo would have a velocity projected to the SMC of v̂ ~ 1500 km s^-1, while an SMC lens would typically have v̂ ~ 60 km s^-1. The event light curve allows us to obtain a unique fit to the parameters of the binary lens and to estimate the proper motion of the lensing system. We have performed a joint fit to the MACHO/GMAN data presented here, including recent EROS data of this event from Afonso and collaborators. These joint data are sufficient to constrain the time t* for the lens to move an angle equal to the source angular radius: t* = 0.116 ± 0.010 days. We estimate a radius for the lensed source of R* = 1.1 ± 0.1 R_sun from its unblended color and magnitude. This yields a projected velocity of v̂ = 76 ± 10 km s^-1. Only 0.12% of halo lenses would be expected to have a v̂ value at least as small as this, while 38% of SMC lenses would be expected to have v̂ as large as this. This implies that the lensing system is more likely to reside in the SMC than in the Galactic halo. Similar observations of future Magellanic Cloud microlensing events will help to determine the contribution of MACHOs to the Galaxy's dark halo.
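
    For reference, the projected velocity quoted above follows directly from the fitted source-radius crossing time and the estimated stellar radius:

        \hat{v} \;=\; \frac{R_*}{t_*}
                \;=\; \frac{1.1 \times 6.96\times10^{5}\ \mathrm{km}}{0.116 \times 86400\ \mathrm{s}}
                \;\approx\; 76\ \mathrm{km\,s^{-1}},

    consistent with the value expected for an SMC lens rather than a Galactic halo lens.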

  20. Microseismic Monitoring of Stimulating Shale Gas Reservoir in SW China: 1. An Improved Matching and Locating Technique for Downhole Monitoring

    NASA Astrophysics Data System (ADS)

    Meng, Xiaobo; Chen, Haichao; Niu, Fenglin; Tang, Youcai; Yin, Chen; Wu, Furong

    2018-02-01

    We introduce an improved matching and locating technique to detect and locate microseismic events (-4 < ML < 0) associated with hydraulic fracturing treatment. We employ a set of representative master events to act as template waveforms and detect slave events that strongly resemble master events through stacking cross correlograms of both P and S waves between the template waveforms and the continuous records of the monitoring array. Moreover, the residual moveout in the cross correlograms across the array is used to locate slave events relative to the corresponding master event. In addition, P wave polarization constraint is applied to resolve the lateral extent of slave events in the case of unfavorable array configuration. We first demonstrate the detectability and location accuracy of the proposed approach with a pseudo-synthetic data set. Compared to the matched filter analysis, the proposed approach can significantly enhance detectability at low false alarm rate and yield robust location estimates of very low SNR events, particularly along the vertical direction. Then, we apply the method to a real microseismic data set acquired in the Weiyuan shale reservoir of China in November of 2014. The expanded microseismic catalog provides more easily interpretable spatiotemporal evolution of microseismicity, which is investigated in detail in a companion paper.
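
    The detection stage described above amounts to summing normalized cross-correlograms between each master-event template and the continuous records over stations and phases; a minimal single-phase sketch, not the authors' implementation, follows.

        import numpy as np

        def stacked_correlogram(templates, continuous):
            # templates:  (n_sta, n_tpl) master-event waveform windows, one per station
            # continuous: (n_sta, n_cont) continuous records from the downhole array
            stack = None
            for tpl, trace in zip(templates, continuous):
                t = tpl - tpl.mean()
                t /= (np.linalg.norm(t) + 1e-12)
                cc = np.correlate(trace, t, mode="valid")
                # approximate normalisation by the sliding energy of the record
                energy = np.sqrt(np.convolve(trace ** 2, np.ones(len(t)), mode="valid"))
                cc /= (energy + 1e-12)
                stack = cc if stack is None else stack + cc
            return stack / len(templates)

        # Peaks in the stack well above its background level (e.g. several times its
        # median absolute deviation) can be declared slave-event detections.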

  1. Earthquake Relocation in the Middle East with Geodetically-Calibrated Events

    NASA Astrophysics Data System (ADS)

    Brengman, C.; Barnhart, W. D.

    2017-12-01

    Regional and global earthquake catalogs in tectonically active regions commonly contain mislocated earthquakes that impede efforts to address first order characteristics of seismogenic strain release and to monitor anthropogenic seismic events through the Comprehensive Nuclear-Test-Ban Treaty. Earthquake mislocations are particularly limiting in the plate boundary zone between the Arabia and Eurasia plates of Iran, Pakistan, and Turkey where earthquakes are commonly mislocated by 20+ kilometers and hypocentral depths are virtually unconstrained. Here, we present preliminary efforts to incorporate calibrated earthquake locations derived from Interferometric Synthetic Aperture Radar (InSAR) observations into a relocated catalog of seismicity in the Middle East. We use InSAR observations of co-seismic deformation to determine the locations, geometries, and slip distributions of small to moderate magnitude (M4.8+) crustal earthquakes. We incorporate this catalog of calibrated event locations, along with other seismologically-calibrated earthquake locations, as "priors" into a fully Bayesian multi-event relocation algorithm that relocates all teleseismically and regionally recorded earthquakes over the time span 1970-2017, including calibrated and uncalibrated events. Our relocations are conducted using cataloged phase picks and BayesLoc. We present a suite of sensitivity tests for the time span of 2003-2014 to explore the impacts of our input parameters (i.e., how a point source is defined from a finite fault inversion) on the behavior of the event relocations, potential improvements to depth estimates, the ability of the relocation to recover locations outside of the time span in which there are InSAR observations, and the degree to which our relocations can recover "known" calibrated earthquake locations that are not explicitly included as a-priori constraints. Additionally, we present a systematic comparison of earthquake relocations derived from phase picks of two

  2. Astronomy on Tap: Public Outreach Events in Bars

    NASA Astrophysics Data System (ADS)

    Rice, E. L.; Levine, B. W.

    2016-12-01

    Astronomy on Tap public outreach events are as easy to organise or as elaborate as you would like them to be. In addition to communicating cutting-edge research and fundamental concepts to the public, Astronomy on Tap events showcase the passion, creativity and diversity of scientists, facilitate personal and meaningful interactions between scientists and the general public, and offer networking and professional development opportunities for scientists. Astronomy on Tap organisers provide a growing cadre of resources for starting similar events, which have so far taken place in twenty locations around the world, mainly in the United States but also in Canada, Chile, and Taiwan, reaching a total of almost 15 000 people. Through this reflection on the Astronomy on Tap project we invite you to consider whether you could adopt aspects of the Astronomy on Tap model for existing outreach programmes, or even organise a new satellite event in your location.

  3. Event Memory: A Theory of Memory for Laboratory, Autobiographical, and Fictional Events

    PubMed Central

    Rubin, David C.; Umanath, Sharda

    2015-01-01

    An event memory is a mental construction of a scene recalled as a single occurrence. It therefore requires the hippocampus and ventral visual stream needed for all scene construction. The construction need not come with a sense of reliving or be made by a participant in the event, and it can be a summary of occurrences from more than one encoding. The mental construction, or physical rendering, of any scene must be done from a specific location and time; this introduces a ‘self’ located in space and time, which is a necessary, but need not be a sufficient, condition for a sense of reliving. We base our theory on scene construction rather than reliving because this allows the integration of many literatures and because there is more accumulated knowledge about scene construction’s phenomenology, behavior, and neural basis. Event memory differs from episodic memory in that it does not conflate the independent dimensions of whether or not a memory is relived, is about the self, is recalled voluntarily, or is based on a single encoding with whether it is recalled as a single occurrence of a scene. Thus, we argue that event memory provides a clearer contrast to semantic memory, which also can be about the self, be recalled voluntarily, and be from a unique encoding; allows for a more comprehensive dimensional account of the structure of explicit memory; and better accounts for laboratory and real world behavioral and neural results, including those from neuropsychology and neuroimaging, than does episodic memory. PMID:25330330

  4. Cardiovascular Events Following Smoke-Free Legislations: An Updated Systematic Review and Meta-Analysis

    PubMed Central

    Jones, Miranda R.; Barnoya, Joaquin; Stranges, Saverio; Losonczy, Lia; Navas-Acien, Ana

    2014-01-01

    Background Legislations banning smoking in indoor public places and workplaces are being implemented worldwide to protect the population from secondhand smoke exposure. Several studies have reported reductions in hospitalizations for acute coronary events following the enactment of smoke-free laws. Objective We set out to conduct a systematic review and meta-analysis of epidemiologic studies examining how legislations that ban smoking in indoor public places impact the risk of acute coronary events. Methods We searched MEDLINE, EMBASE, and relevant bibliographies including previous systematic reviews for studies that evaluated changes in acute coronary events, following implementation of smoke-free legislations. Studies were identified through December 2013. We pooled relative risk (RR) estimates for acute coronary events comparing post- vs. pre-legislation using inverse-variance weighted random-effects models. Results Thirty-one studies providing estimates for 47 locations were included. The legislations were implemented between 1991 and 2010. Following the enactment of smoke-free legislations, there was a 12 % reduction in hospitalizations for acute coronary events (pooled RR: 0.88, 95 % CI: 0.85–0.90). Reductions were 14 % in locations that implemented comprehensive legislations compared to an 8 % reduction in locations that only had partial restrictions. In locations with reductions in smoking prevalence post-legislation above the mean (2.1 % reduction) there was a 14 % reduction in events compared to 10 % in locations below the mean. The RRs for acute coronary events associated with enacting smoke-free legislation were 0.87 vs. 0.89 in locations with smoking prevalence pre-legislation above and below the mean (23.1 %), and 0.87 vs. 0.89 in studies from the Americas vs. other regions. Conclusion The implementation of smoke-free legislations was related to reductions in acute coronary event hospitalizations in most populations evaluated. Benefits are greater
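
    The pooling step named above, inverse-variance weighted random-effects (DerSimonian-Laird), can be sketched generically as follows; the inputs are the per-location relative risks and their 95% confidence limits, and this is a generic illustration rather than the authors' exact code.

        import numpy as np

        def pooled_rr_random_effects(rr, ci_low, ci_high):
            # rr, ci_low, ci_high: per-study relative risks and 95% confidence limits
            y = np.log(np.asarray(rr, dtype=float))
            se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
            w = 1.0 / se ** 2                                  # fixed-effect weights
            ybar = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - ybar) ** 2)                    # Cochran's Q
            k = len(y)
            tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
            w_re = 1.0 / (se ** 2 + tau2)                      # random-effects weights
            mu = np.sum(w_re * y) / np.sum(w_re)
            se_mu = np.sqrt(1.0 / np.sum(w_re))
            return (np.exp(mu),
                    np.exp(mu - 1.96 * se_mu),
                    np.exp(mu + 1.96 * se_mu))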

  5. Encoding negative events under stress: high subjective arousal is related to accurate emotional memory despite misinformation exposure.

    PubMed

    Hoscheidt, Siobhan M; LaBar, Kevin S; Ryan, Lee; Jacobs, W Jake; Nadel, Lynn

    2014-07-01

    Stress at encoding affects memory processes, typically enhancing, or preserving, memory for emotional information. These effects have interesting implications for eyewitness accounts, which in real-world contexts typically involve encoding an aversive event under stressful conditions followed by potential exposure to misinformation. The present study investigated memory for a negative event encoded under stress and subsequent misinformation endorsement. Healthy young adults participated in a between-groups design with three experimental sessions conducted 48 h apart. Session one consisted of a psychosocial stress induction (or control task) followed by incidental encoding of a negative slideshow. During session two, participants were asked questions about the slideshow, during which a random subgroup was exposed to misinformation. Memory for the slideshow was tested during the third session. Assessment of memory accuracy across stress and no-stress groups revealed that stress induced just prior to encoding led to significantly better memory for the slideshow overall. The classic misinformation effect was also observed - participants exposed to misinformation were significantly more likely to endorse false information during memory testing. In the stress group, however, memory accuracy and misinformation effects were moderated by arousal experienced during encoding of the negative event. Misinformed-stress group participants who reported that the negative slideshow elicited high arousal during encoding were less likely to endorse misinformation for the most aversive phase of the story. Furthermore, these individuals showed better memory for components of the aversive slideshow phase that had been directly misinformed. Results from the current study provide evidence that stress and high subjective arousal elicited by a negative event act concomitantly during encoding to enhance emotional memory such that the most aversive aspects of the event are well remembered and

  6. Estimating the Probability of Negative Events

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  7. Locating the source of spreading in temporal networks

    NASA Astrophysics Data System (ADS)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Yi, Dongyun

    2017-02-01

    The topological structure of many real networks changes with time. Thus, locating the sources of a temporal network is a creative and challenging problem, as the enormous size of many real networks makes it unfeasible to observe the state of all nodes. In this paper, we propose an algorithm to solve this problem, named the backward temporal diffusion process. The proposed algorithm calculates the shortest temporal distance to locate the transmission source. We assume that the spreading process can be modeled as a simple diffusion process and by consensus dynamics. To improve the location accuracy, we also adopt four strategies to select which nodes should be observed by ranking their importance in the temporal network. Our paper proposes a highly accurate method for locating the source in temporal networks and is, to the best of our knowledge, a frontier work in this field. Moreover, our framework has important significance for controlling the transmission of diseases or rumors and formulating immediate immunization strategies.
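
    A compact way to realize the "shortest temporal distance" idea is to compute foremost (earliest-arrival) times from each candidate source over the time-stamped contact list and score candidates by how well those distances explain the observation times at the monitored nodes. The sketch below assumes directed contacts and a simple correlation score, both illustrative choices rather than the paper's exact algorithm.

        import numpy as np

        def earliest_arrival(edges, source, n_nodes, t_start=0.0):
            # edges: iterable of (t, u, v) contacts meaning u can pass to v at time t
            arrival = np.full(n_nodes, np.inf)
            arrival[source] = t_start
            for t, u, v in sorted(edges):            # process contacts in time order
                if arrival[u] <= t < arrival[v]:
                    arrival[v] = t
            return arrival

        def locate_source(edges, observers, observed_times, n_nodes):
            best, best_score = None, -np.inf
            for s in range(n_nodes):
                d = earliest_arrival(edges, s, n_nodes)[observers]
                if not np.all(np.isfinite(d)):
                    continue                          # source must reach every observer
                score = np.corrcoef(d, observed_times)[0, 1]
                if score > best_score:
                    best, best_score = s, score
            return best, best_score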

  8. How accurate is accident data in road safety research? An application of vehicle black box data regarding pedestrian-to-taxi accidents in Korea.

    PubMed

    Chung, Younshik; Chang, IlJoon

    2015-11-01

    Recently, the introduction of vehicle black box systems, or in-vehicle video event data recorders, enables the driver to collect more accurate crash information such as location, time, and situation at the pre-crash and crash moment, which can be analyzed to find the crash causal factors more accurately. This study presents the vehicle black box system in brief and its application status in Korea. Based on the crash data obtained from the vehicle black box system, this study analyzes the accuracy of the crash data collected by the existing road crash data recording method, in which data have been recorded by police officers based on the accident parties' statements or eyewitness accounts. The analysis results show that the crash data observed by the existing method have an average spatial difference of 84.48 m with a standard deviation of 157.75 m, as well as an average temporal error of 29.05 min with a standard deviation of 19.24 min. Additionally, the average and standard deviation of crash speed errors were found to be 9.03 km/h and 7.21 km/h, respectively. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Event generators for address event representation transmitters

    NASA Astrophysics Data System (ADS)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (let's call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. There have been two main approaches published in the literature for implementing such "AER Generator" circuits. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event out of the chip, the rest of the neurons in the array were
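
    To make the activity-dependent bandwidth sharing concrete, here is a toy software model (not a circuit description, and the handshake duration is an assumed figure) of an arbitered AER transmitter: each neuron emits address events at a rate proportional to its activity, and colliding events are queued and serialized on the shared bus rather than discarded.

        import heapq
        import numpy as np

        def simulate_aer_stream(rates_hz, t_end, bus_busy=1e-7, seed=0):
            # rates_hz: per-neuron mean event rates; bus_busy: handshake duration, s
            rng = np.random.default_rng(seed)
            pending = []
            for addr, rate in enumerate(rates_hz):
                t = rng.exponential(1.0 / rate)
                while t < t_end:
                    heapq.heappush(pending, (t, addr))   # event generated by neuron
                    t += rng.exponential(1.0 / rate)
            bus_free, stream = 0.0, []
            while pending:
                t, addr = heapq.heappop(pending)
                send = max(t, bus_free)                  # arbitration: wait for the bus
                stream.append((send, addr))              # (timestamp, address) on the bus
                bus_free = send + bus_busy
            return stream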

  10. The Challenges from Extreme Climate Events for Sustainable Development in Amazonia: the Acre State Experience

    NASA Astrophysics Data System (ADS)

    Araújo, M. D. N. M.

    2015-12-01

    In the past ten years Acre State, located in Brazil's southwestern Amazonia, has confronted sequential and severe extreme events in the form of droughts and floods. In particular, the droughts and forest fires of 2005 and 2010, the 2012 flood within Acre, the 2014 flood of the Madeira River, which isolated Acre for two months from southern Brazil, and the most severe flooding throughout the state in 2015 shook the resilience of Acrean society. The accumulated costs of these events since 2005 have exceeded 300 million dollars. For the last 17 years, successive state administrations have been implementing a socio-environmental model of development that strives to link sustainable economic production with environmental conservation, particularly for small communities. In this context, extreme climate events have interfered significantly with this model, increasing the risks of failure. The impacts caused by these events on development in the state have been exacerbated by: a) limitations in monitoring; b) extreme events outside of Acre territory (Madeira River Flood) affecting transportation systems; c) absence of reliable information for decision-making; and d) bureaucratic and judicial impediments. Our experience in these events has led to the following needs for scientific input to reduce the risk of disasters: 1) better monitoring and forecasting of deforestation, fires, and hydro-meteorological variables; 2) ways to increase risk perception in communities; 3) approaches to involve local and regional populations more effectively in the response to disasters; 4) more accurate measurements of the economic and social damages caused by these disasters. We must improve adaptation to and mitigation of current and future extreme climate events and implement a robust civil defense, adequate to these new challenges.

  11. Microseismic Events Detection on Xishancun Landslide, Sichuan Province, China

    NASA Astrophysics Data System (ADS)

    Sheng, M.; Chu, R.; Wei, Z.

    2016-12-01

    On landslides, slope movement and fracturing of the rock mass often lead to microearthquakes, which are recorded as weak signals on seismographs. The temporal and spatial characteristics of unstable regions, as well as the impact of external factors on them, can be understood and analyzed by monitoring those microseismic events. The microseismic method can provide information from inside the landslide, which can supplement geodetic methods for monitoring the movement of the landslide surface. Compared to drilling on a landslide, the microseismic method is more economical and safer. Xishancun Landslide is located about 60 km northwest of the Wenchuan earthquake centroid; it has kept deforming after the earthquake, which greatly increases the probability of disasters. In the autumn of 2015, 30 seismometers were deployed on the landslide for 3 months with intervals of 200-500 meters. First, we used regional earthquakes for time correction of the seismometers to eliminate the influence of inaccurate GPS clocks and the subsurface structure beneath the stations. Due to the low velocity of the loose medium, the travel-time difference of microseismic events across the landslide reaches up to 5 s. Based on travel times and waveform characteristics, we found many microseismic events and converted them into envelopes as templates; we then used a sliding-window cross-correlation technique based on the waveform envelope to detect the other microseismic events. Consequently, 100 microseismic events were detected with waveforms recorded on all seismometers. Based on the locations, most of them were located at the front of the landslide while the others were located at the back end. The bottom and top of the landslide accumulated considerable energy and deformed largely, and the radiated waves could be recorded by all stations. Moreover, the bottom, with more events, appeared very active. In addition, there were many smaller events in the middle part of the landslide where released
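
    A minimal sketch of the envelope-based, sliding-window cross-correlation detection described above, assuming single-channel traces and hypothetical variable names (the authors' implementation details are not given in the record):

```python
import numpy as np
from scipy.signal import hilbert

def envelope(trace):
    """Amplitude envelope of a seismic trace via the Hilbert transform."""
    return np.abs(hilbert(trace))

def detect_by_envelope_correlation(continuous, template, threshold=0.7):
    """Slide the envelope of a template event along the envelope of the
    continuous record; return sample offsets whose normalised
    cross-correlation exceeds `threshold`."""
    env_c, env_t = envelope(continuous), envelope(template)
    env_t = (env_t - env_t.mean()) / (env_t.std() * len(env_t) + 1e-12)
    n = len(env_t)
    detections = []
    for i in range(len(env_c) - n):
        win = env_c[i:i + n]
        cc = np.sum(env_t * (win - win.mean()) / (win.std() + 1e-12))
        if cc >= threshold:
            detections.append((i, cc))
    return detections
```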

  12. Picking vs Waveform based detection and location methods for induced seismicity monitoring

    NASA Astrophysics Data System (ADS)

    Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.

    2017-04-01

    Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas operations, mining, or geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by many events with short inter-event times that can even produce overlapping seismic signatures. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison in terms of performance using the waveform-based method LOKI and the pick-based detection and location methods (SCAUTOLOC and SCANLOC) implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at the local scale. Although it has seen recent applications, an extensive test with induced seismicity datasets has not yet been performed. This method is based on a cluster search algorithm to associate detections to one or many potential earthquake sources. On the other hand, SCAUTOLOC is a more "conventional" method and is the basic tool for seismic event detection and location in SeisComP3. This approach was specifically designed for

  13. OGLE-2014-BLG-0289: Precise Characterization of a Quintuple-peak Gravitational Microlensing Event

    NASA Astrophysics Data System (ADS)

    Udalski, A.; Han, C.; Bozza, V.; Gould, A.; Bond, I. A.; and; Mróz, P.; Skowron, J.; Wyrzykowski, Ł.; Szymański, M. K.; Soszyński, I.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Kozłowski, S.; The OGLE Collaboration; Abe, F.; Barry, R.; Bennett, D. P.; Bhattacharya, A.; Donachie, M.; Evans, P.; Fukui, A.; Hirao, Y.; Itow, Y.; Kawasaki, K.; Koshimoto, N.; Li, M. C. A.; Ling, C. H.; Masuda, K.; Matsubara, Y.; Miyazaki, S.; Munakata, H.; Muraki, Y.; Nagakane, M.; Ohnishi, K.; Ranc, C.; Rattenbury, N.; Saito, T.; Sharan, A.; Sullivan, D. J.; Sumi, T.; Suzuki, D.; Tristram, P. J.; Yamada, T.; Yonehara, A.; The MOA Collaboration; Street, R. A.; Tsapras, Y.; Bachelet, E.; Bramich, D. M.; DÁgo, G.; Dominik, M.; Figuera Jaimes, R.; Horne, K.; Hundertmark, M.; Kains, N.; Menzies, J.; Schmidt, R.; Snodgrass, C.; Steele, I. A.; Wambsganss, J.; Robonet Collaboration; Pogge, R. W.; Jung, Y. K.; Shin, I.-G.; Yee, J. C.; Kim, W.-T.; The μFun Collaboration; Beichman, C.; Carey, S.; Calchi Novati, S.; Zhu, W.; The Spitzer Team

    2018-01-01

    We present the analysis of the binary-microlensing event OGLE-2014-BLG-0289. The event light curve exhibits five very unusual peaks, four of which were produced by caustic crossings and the other by a cusp approach. It is found that the quintuple-peak features of the light curve provide tight constraints on the source trajectory, enabling us to precisely and accurately measure the microlensing parallax π_E. Furthermore, the three resolved caustics allow us to measure the angular Einstein radius θ_E. From the combination of π_E and θ_E, the physical lens parameters are uniquely determined. It is found that the lens is a binary composed of two M dwarfs with masses M_1 = 0.52 ± 0.04 M_⊙ and M_2 = 0.42 ± 0.03 M_⊙ separated in projection by a_⊥ = 6.4 ± 0.5 au. The lens is located in the disk with a distance of D_L = 3.3 ± 0.3 kpc. The reason for the absence of a lensing signal in the Spitzer data is that the time of observation corresponds to the flat region of the light curve.
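
    For reference, the standard point-lens relations (not spelled out in the record) that convert the measured parallax π_E and angular Einstein radius θ_E into the total lens mass and distance are, with π_S = au/D_S the source parallax:

```latex
M_{\mathrm{tot}} = \frac{\theta_{\mathrm{E}}}{\kappa\,\pi_{\mathrm{E}}}, \qquad
\kappa \equiv \frac{4G}{c^{2}\,\mathrm{au}} \simeq 8.14~\mathrm{mas}\,M_{\odot}^{-1}, \qquad
\pi_{\mathrm{rel}} = \pi_{\mathrm{E}}\,\theta_{\mathrm{E}}, \qquad
D_{\mathrm{L}} = \frac{\mathrm{au}}{\pi_{\mathrm{rel}} + \pi_{\mathrm{S}}}.
```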

  14. Acoustic Location of Lightning Using Interferometric Techniques

    NASA Astrophysics Data System (ADS)

    Erives, H.; Arechiga, R. O.; Stock, M.; Lapierre, J. L.; Edens, H. E.; Stringer, A.; Rison, W.; Thomas, R. J.

    2013-12-01

    Acoustic arrays have been used to accurately locate thunder sources in lightning flashes. The acoustic arrays located around the Magdalena Mountains of central New Mexico produce locations which compare quite well with source locations provided by the New Mexico Tech Lightning Mapping Array. These arrays utilize 3 outer microphones surrounding a 4th microphone located at the center. The location is computed by band-passing the signal to remove noise and then cross-correlating the outer 3 microphones with respect to the center reference microphone. While this method works very well, it works best on signals with high signal-to-noise ratios; weaker signals are not as well located. Therefore, methods are being explored to improve the location accuracy and detection efficiency of the acoustic location systems. The signal received by acoustic arrays is strikingly similar to the signal received by radio frequency interferometers. Both acoustic location systems and radio frequency interferometers make coherent measurements of a signal arriving at a number of closely spaced antennas. Both acoustic and interferometric systems then correlate these signals between pairs of receivers to determine the direction to the source of the received signal. The primary difference between the two systems is the velocity of propagation of the emission, which is much slower for sound. Therefore, the same frequency-based techniques that have been used quite successfully with radio interferometers should be applicable to acoustic-based measurements as well. The results presented here are comparisons between the location results obtained with the current cross-correlation method and techniques developed for radio frequency interferometers applied to acoustic signals. The data were obtained during the summer 2013 storm season using multiple arrays sensitive to both infrasonic-frequency and audio-frequency acoustic emissions from lightning. Preliminary results show that
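
    As an illustration of the existing cross-correlation approach described above (delay estimation against the center reference microphone followed by a plane-wave slowness fit), a minimal sketch with hypothetical names; the interferometric frequency-domain processing proposed in the abstract is not shown:

```python
import numpy as np

def delay_by_xcorr(ref, sig, fs):
    """Time delay (s) of `sig` relative to `ref` from the cross-correlation peak."""
    cc = np.correlate(sig, ref, mode="full")
    lag = np.argmax(cc) - (len(ref) - 1)
    return lag / fs

def thunder_direction(mic_xy, delays, c=343.0):
    """Fit a horizontal plane-wave slowness vector s to delays tau_i = s . r_i
    (reference microphone at the origin) and return the back-azimuth
    (degrees, clockwise from north) toward the acoustic source.
    mic_xy: (N, 2) outer-mic coordinates in metres as (east, north)."""
    R = np.asarray(mic_xy, dtype=float)
    tau = np.asarray(delays, dtype=float)          # delays w.r.t. centre mic, seconds
    s, *_ = np.linalg.lstsq(R, tau, rcond=None)    # slowness vector (s/m)
    direction = -s / np.linalg.norm(s)             # unit vector pointing at the source
    return np.degrees(np.arctan2(direction[0], direction[1])) % 360.0
```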

  15. Infrasound Monitoring of Local, Regional and Global Events

    DTIC Science & Technology

    2007-09-01

    detect and associate signals from the March 9th 2005 eruption at Mount Saint Helens, and locate the event to within 5 km of the caldera. The automatically associated grid nodes (Figure 4 of the report) are likewise located within 5 km of the center of the caldera at Mount Saint Helens. 29th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring.

  16. Hierarchical Event Descriptors (HED): Semi-Structured Tagging for Real-World Events in Large-Scale EEG

    PubMed Central

    Bigdely-Shamlo, Nima; Cockfield, Jeremy; Makeig, Scott; Rognon, Thomas; La Valle, Chris; Miyakoshi, Makoto; Robbins, Kay A.

    2016-01-01

    Real-world brain imaging by EEG requires accurate annotation of complex subject-environment interactions in event-rich tasks and paradigms. This paper describes the evolution of the Hierarchical Event Descriptor (HED) system for systematically describing both laboratory and real-world events. HED version 2, first described here, provides the semantic capability of describing a variety of subject and environmental states. HED descriptions can include stimulus presentation events on screen or in virtual worlds, experimental or spontaneous events occurring in the real world environment, and events experienced via one or multiple sensory modalities. Furthermore, HED 2 can distinguish between the mere presence of an object and its actual (or putative) perception by a subject. Although the HED framework has implicit ontological and linked data representations, the user interface for HED annotation is more intuitive than traditional ontological annotation. We believe that hiding the formal representations allows for a more user-friendly interface, making consistent, detailed tagging of experimental and real-world events possible for research users. HED is extensible while retaining the advantages of having an enforced common core vocabulary. We have developed a collection of tools to support HED tag assignment and validation; these are available at hedtags.org. A plug-in for EEGLAB (sccn.ucsd.edu/eeglab), CTAGGER, is also available to speed the process of tagging existing studies. PMID:27799907

  17. eqMAXEL: A new automatic earthquake location algorithm implementation for Earthworm

    NASA Astrophysics Data System (ADS)

    Lisowski, S.; Friberg, P. A.; Sheen, D. H.

    2017-12-01

    A common problem with automated earthquake location systems for local- to regional-scale seismic networks is false triggering and false locations inside the network caused by larger earthquakes at regional to teleseismic distances. This false-location issue also presents a problem for earthquake early warning systems, where the societal impacts of false alarms can be very expensive. Towards solving this issue, Sheen et al. (2016) implemented a robust maximum-likelihood earthquake location algorithm known as MAXEL. It was shown, with both synthetic and real data for a small number of arrivals, that large regional events were easily identifiable through metrics in the MAXEL algorithm. In the summer of 2017, we collaboratively implemented the MAXEL algorithm as a fully functional Earthworm module and tested it in regions of the USA where false detections and alarming are observed. We show robust improvement in the ability of the Earthworm system to filter out regional and teleseismic events that would have falsely located inside the network using the traditional Earthworm hypoinverse solution. We also explore using different grid sizes in the implementation of the MAXEL algorithm, which was originally designed with South Korea as the target network size.
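
    The MAXEL algorithm itself is described in Sheen et al. (2016); purely as an illustration of the general idea of a robust grid-search location, a minimal sketch under a homogeneous-velocity assumption, with an L1 misfit standing in for a Laplacian-error likelihood (all names and parameters are hypothetical):

```python
import numpy as np

def grid_search_locate(stations, picks, v=6.0, grid_step=5.0, extent=200.0):
    """Coarse maximum-likelihood-style grid search: for every trial epicentre,
    the origin time is the residual-minimising shift and the misfit is the
    L1 norm of arrival-time residuals. `stations` is (N, 2) in km, `picks`
    is (N,) arrival times in s, `v` is an apparent velocity in km/s."""
    xs = np.arange(-extent, extent + grid_step, grid_step)
    best = (np.inf, None)
    for x in xs:
        for y in xs:
            tt = np.hypot(stations[:, 0] - x, stations[:, 1] - y) / v
            res = picks - tt
            t0 = np.median(res)                  # L1-optimal origin time
            misfit = np.sum(np.abs(res - t0))
            if misfit < best[0]:
                best = (misfit, (x, y, t0))
    return best[1], best[0]                      # (x, y, t0), misfit
```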

  18. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sezen, Halil; Aldemir, Tunc; Denning, R.

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  19. 25 CFR 214.28 - Location of sites for mines and buildings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Location of sites for mines and buildings. 214.28 Section... and buildings. In event of disagreement between two or more mineral lessees regarding sites for the location of wells, mines, buildings, plants, etc., the same shall be determined by the superintendent after...

  20. Acoustic emission source location in complex structures using full automatic delta T mapping technique

    NASA Astrophysics Data System (ADS)

    Al-Jumaili, Safaa Kh.; Pearson, Matthew R.; Holford, Karen M.; Eaton, Mark J.; Pullin, Rhys

    2016-05-01

    An easy-to-use, fast-to-apply, cost-effective, and very accurate non-destructive testing (NDT) technique for damage localisation in complex structures is key for the uptake of structural health monitoring (SHM) systems. Acoustic emission (AE) is a viable technique that can be used for SHM, and one of its most attractive features is the ability to locate AE sources. The time of arrival (TOA) technique is traditionally used to locate AE sources and relies on the assumption of a constant wave speed within the material and an uninterrupted propagation path between the source and the sensor. In complex structural geometries and complex materials such as composites, this assumption is no longer valid. Delta T mapping was developed in Cardiff in order to overcome these limitations; this technique uses artificial sources on an area of interest to create training maps, which are used to locate subsequent AE sources. However, operator expertise is required to select the best data from the training maps and to choose the correct parameters to locate the sources, which can be a time-consuming process. This paper presents a new and improved fully automatic delta T mapping technique where a clustering algorithm is used to automatically identify and select the highly correlated events at each grid point, whilst the "Minimum Difference" approach is used to determine the source location. This removes the requirement for operator expertise, saving time and preventing human errors. A thorough assessment is conducted to evaluate the performance and robustness of the new technique. In the initial test, the results showed an excellent reduction in running time as well as improved accuracy in locating AE sources, as a result of the automatic selection of the training data. Furthermore, because the process is performed automatically, this is now a very simple and reliable technique, as potential sources of error related to manual manipulation are prevented.
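
    A minimal sketch of the "Minimum Difference" location step as described, assuming training delta-T maps are already available on a grid (the data structures and names are hypothetical, not the authors' code):

```python
import numpy as np

def locate_min_difference(training_dt, measured_dt):
    """Delta-T-mapping style location: each training grid point stores the
    arrival-time differences for every sensor pair; the event is assigned to
    the grid point whose stored differences are closest (smallest summed
    absolute difference) to the measured ones.

    training_dt: dict {(x, y): array of delta-T per sensor pair, in seconds}
    measured_dt: array of delta-T per sensor pair for the unknown event
    """
    measured_dt = np.asarray(measured_dt, dtype=float)
    best_point, best_err = None, np.inf
    for point, dt in training_dt.items():
        err = np.nansum(np.abs(np.asarray(dt, dtype=float) - measured_dt))
        if err < best_err:
            best_point, best_err = point, err
    return best_point, best_err
```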

  1. Development and assessment of Memorial Sloan Kettering Cancer Center's Surgical Secondary Events grading system.

    PubMed

    Strong, Vivian E; Selby, Luke V; Sovel, Mindy; Disa, Joseph J; Hoskins, William; Dematteo, Ronald; Scardino, Peter; Jaques, David P

    2015-04-01

    Studying surgical secondary events is an evolving effort with no currently established system for database design, standard reporting, or definitions. Using the Clavien-Dindo classification as a guide, in 2001 we developed a Surgical Secondary Events database based on grade of event and required intervention in order to begin prospectively recording and analyzing all surgical secondary events (SSE). Events are prospectively entered into the database by attending surgeons, house staff, and research staff. In 2008 we performed a blinded external audit of 1,498 randomly selected operations to examine the quality and reliability of the data. Of 4,284 operations, 1,498 were audited during the third quarter of 2008. Of these operations, 79% (N = 1,180) did not have a secondary event while 21% (N = 318) had an identified event; 91% of operations (1,365) were correctly entered into the SSE database. Furthermore, 97% (129 of 133) of missed secondary events were grades I and II. Three grade III (2%) and one grade IV (1%) secondary events were missed; no grade V secondary events were missed. Grade III-IV events are more accurately collected than grade I-II events. Robust and accurate secondary events data can be collected by clinicians and research staff, and these data can safely be used for quality improvement projects and research.

  2. Improving Kinematic Accuracy of Soft Wearable Data Gloves by Optimizing Sensor Locations

    PubMed Central

    Kim, Dong Hyun; Lee, Sang Wook; Park, Hyung-Soon

    2016-01-01

    Bending sensors enable compact, wearable designs when used for measuring hand configurations in data gloves. While existing data gloves can accurately measure angular displacement of the finger and distal thumb joints, accurate measurement of thumb carpometacarpal (CMC) joint movements remains challenging due to crosstalk between the multi-sensor outputs required to measure the degrees of freedom (DOF). To properly measure CMC-joint configurations, sensor locations that minimize sensor crosstalk must be identified. This paper presents a novel approach to identifying optimal sensor locations. Three-dimensional hand surface data from ten subjects was collected in multiple thumb postures with varied CMC-joint flexion and abduction angles. For each posture, scanned CMC-joint contours were used to estimate CMC-joint flexion and abduction angles by varying the positions and orientations of two bending sensors. Optimal sensor locations were estimated by the least squares method, which minimized the difference between the true CMC-joint angles and the joint angle estimates. Finally, the resultant optimal sensor locations were experimentally validated. Placing sensors at the optimal locations, CMC-joint angle measurement accuracies improved (flexion, 2.8° ± 1.9°; abduction, 1.9° ± 1.2°). The proposed method for improving the accuracy of the sensing system can be extended to other types of soft wearable measurement devices. PMID:27240364

  3. Examining ERP correlates of recognition memory: Evidence of accurate source recognition without recollection

    PubMed Central

    Addante, Richard J.; Ranganath, Charan; Yonelinas, Andrew P.

    2012-01-01

    Recollection is typically associated with high recognition confidence and accurate source memory. However, subjects sometimes make accurate source memory judgments even for items that are not confidently recognized, and it is not known whether these responses are based on recollection or some other memory process. In the current study, we measured event related potentials (ERPs) while subjects made item and source memory confidence judgments in order to determine whether recollection supported accurate source recognition responses for items that were not confidently recognized. In line with previous studies, we found that recognition memory was associated with two ERP effects: an early on-setting FN400 effect, and a later parietal old-new effect [Late Positive Component (LPC)], which have been associated with familiarity and recollection, respectively. The FN400 increased gradually with item recognition confidence, whereas the LPC was only observed for highly confident recognition responses. The LPC was also related to source accuracy, but only for items that had received a high confidence item recognition response; accurate source judgments to items that were less confidently recognized did not exhibit the typical ERP correlate of recollection or familiarity, but rather showed a late, broadly distributed negative ERP difference. The results indicate that accurate source judgments of episodic context can occur even when recollection fails. PMID:22548808

  4. Low latency counter event indication

    DOEpatents

    Gara, Alan G [Mount Kisco, NY; Salapura, Valentina [Chappaqua, NY

    2008-09-16

    A hybrid counter array device for counting events with interrupt indication includes a first counter portion comprising N counter devices, each for counting signals representing event occurrences and providing a first count value representing lower order bits. An overflow bit device associated with each respective counter device is additionally set in response to an overflow condition. The hybrid counter array includes a second counter portion comprising a memory array device having N addressable memory locations in correspondence with the N counter devices, each addressable memory location for storing a second count value representing higher order bits. An operatively coupled control device monitors each associated overflow bit device and initiates incrementing a second count value stored at a corresponding memory location in response to a respective overflow bit being set. The incremented second count value is compared to an interrupt threshold value stored in a threshold register, and, when the second counter value is equal to the interrupt threshold value, a corresponding "interrupt arm" bit is set to enable a fast interrupt indication. On a subsequent roll-over of the lower bits of that counter, the interrupt will be fired.
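
    A toy software model may help make the counting and interrupt sequence concrete; the widths and threshold below are illustrative, not taken from the patent claims:

```python
class HybridCounter:
    """Toy model of one element of the hybrid counter array: a narrow counter
    holds the low-order bits; on overflow, the high-order bits stored in a
    memory word are incremented, and when they reach the interrupt threshold
    the 'interrupt arm' bit is set so that the next low-order roll-over
    fires the interrupt."""

    def __init__(self, low_bits=12, threshold=4):
        self.low_max = 1 << low_bits
        self.low = 0            # hardware counter (lower-order bits)
        self.high = 0           # memory word (higher-order bits)
        self.threshold = threshold
        self.armed = False      # "interrupt arm" bit
        self.interrupt = False

    def count_event(self):
        self.low += 1
        if self.low == self.low_max:          # overflow condition
            self.low = 0
            if self.armed:                    # roll-over after arming -> fire
                self.interrupt = True
            self.high += 1                    # control logic bumps the high bits
            if self.high == self.threshold:   # compare with threshold register
                self.armed = True
```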

  5. Low latency counter event indication

    DOEpatents

    Gara, Alan G.; Salapura, Valentina

    2010-08-24

    A hybrid counter array device for counting events with interrupt indication includes a first counter portion comprising N counter devices, each for counting signals representing event occurrences and providing a first count value representing lower order bits. An overflow bit device associated with each respective counter device is additionally set in response to an overflow condition. The hybrid counter array includes a second counter portion comprising a memory array device having N addressable memory locations in correspondence with the N counter devices, each addressable memory location for storing a second count value representing higher order bits. An operatively coupled control device monitors each associated overflow bit device and initiates incrementing a second count value stored at a corresponding memory location in response to a respective overflow bit being set. The incremented second count value is compared to an interrupt threshold value stored in a threshold register, and, when the second counter value is equal to the interrupt threshold value, a corresponding "interrupt arm" bit is set to enable a fast interrupt indication. On a subsequent roll-over of the lower bits of that counter, the interrupt will be fired.

  6. 16 CFR 500.6 - Net quantity of contents declaration, location.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., location. 500.6 Section 500.6 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT... contents separately and accurately stated on the principal display panel. (b) The declaration of net quantity shall appear as a distinct item on the principal display panel, shall be separated (by at least a...

  7. Does Drinking Location Matter? Profiles of Risky Single-Occasion Drinking by Location and Alcohol-Related Harm among Young Men.

    PubMed

    Bähler, Caroline; Dey, Michelle; Dermota, Petra; Foster, Simon; Gmel, Gerhard; Mohler-Kuo, Meichun

    2014-01-01

    In adolescents and young adults, acute consequences like injuries account for a substantial proportion of alcohol-related harm, especially in risky single-occasion (RSO) drinkers. The primary aim of the study was to characterize different drinking profiles in RSO drinkers according to drinking locations and their relationship to negative alcohol-related consequences. The sample consisted of 2746 young men from the Cohort Study on Substance Use Risk Factors who had reported drinking six or more drinks on a single occasion at least monthly over the preceding 12 months. Principal component analysis on the frequency and amount of drinking at 11 different locations was conducted, and two distinguishable components emerged: a non-party dimension (loading high on theater/cinema, sport clubs, other clubs/societies, restaurants, and sport events) and a party dimension (loading high on someone else's home, pubs/bars, discos/nightclubs, outdoor public places, special events, and home). Differential impacts of drinking location profiles were observed on severe negative alcohol-related consequences (SAC). Relative to those classified as low or intermediate in both dimensions, no significant difference in experiencing SAC was found among those who were classified as high in the non-party dimension only. However, those who were classified as high in the party dimension alone or in both dimensions were more likely to experience SAC. These differential effects remained after adjusting for alcohol consumption (volume and risky single-occasion drinking), personality traits, and peer influence [adjusted OR = 0.83 (0.68-1.02), 1.57 (1.27-1.96), and 1.72 (1.23-2.41), respectively], indicating independent effects of drinking location on SAC. The inclusion of sociodemographic factors did not alter this association. The fact that this cluster of party-dimension locations seems to predispose young men to experiencing SAC has important implications for alcohol control policies.

  8. How accurate is unenhanced multidetector-row CT (MDCT) for localization of renal calculi?

    PubMed

    Goetschi, Stefan; Umbehr, Martin; Ullrich, Stephan; Glenck, Michael; Suter, Stefan; Weishaupt, Dominik

    2012-11-01

    To investigate the correlation between unenhanced MDCT and intraoperative findings with regard to the exact anatomical location of renal calculi. Fifty-nine patients who underwent unenhanced MDCT for suspected urinary stone disease and who underwent subsequent flexible ureterorenoscopy (URS) as treatment of nephrolithiasis were included in this retrospective study. All MDCT data sets were independently reviewed by three observers with different degrees of experience in reading CT. Each observer was asked to indicate the presence and exact anatomical location of any calcification within the pyelocaliceal system, renal papilla or renal cortex. Results were compared to intraoperative findings, which were defined as the standard of reference. Calculi not described at surgery but present on MDCT data were counted as renal cortex calcifications. Overall, 166 calculi in 59 kidneys were detected on MDCT; 100 (60.2%) were located in the pyelocaliceal system and 66 (39.8%) in the renal parenchyma. Of the 100 pyelocaliceal calculi, 84 (84%) were correctly located on CT data sets by observer 1, 62 (62%) by observer 2, and 71 (71%) by observer 3. Sensitivity/specificity was 90-94% and 50-100% if only pyelocaliceal calculi measuring >4 mm in size were considered. For pyelocaliceal calculi ≤4 mm in size, the diagnostic performance of MDCT was inferior. Compared to flexible URS, unenhanced MDCT is accurate for distinguishing between pyelocaliceal calculi and renal parenchyma calcifications if renal calculi are >4 mm in size. For smaller renal calculi, unenhanced MDCT is less accurate and distinction between a pyelocaliceal calculus and a renal parenchyma calcification is difficult. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. The measurement and monitoring of surgical adverse events.

    PubMed

    Bruce, J; Russell, E M; Mollison, J; Krukowski, Z H

    2001-01-01

    Surgical adverse events contribute significantly to postoperative morbidity, yet the measurement and monitoring of events is often imprecise and of uncertain validity. Given the trend of decreasing length of hospital stay and the increase in use of innovative surgical techniques--particularly minimally invasive and endoscopic procedures--accurate measurement and monitoring of adverse events is crucial. The aim of this methodological review was to identify a selection of common and potentially avoidable surgical adverse events and to assess whether they could be reliably and validly measured, to review methods for monitoring their occurrence and to identify examples of effective monitoring systems for selected events. This review is a comprehensive attempt to examine the quality of the definition, measurement, reporting and monitoring of selected events that are known to cause significant postoperative morbidity and mortality. METHODS - SELECTION OF SURGICAL ADVERSE EVENTS: Four adverse events were selected on the basis of their frequency of occurrence and likelihood of evidence of measurement and monitoring: (1) surgical wound infection; (2) anastomotic leak; (3) deep vein thrombosis (DVT); (4) surgical mortality. Surgical wound infection and DVT are common events that cause significant postoperative morbidity. Anastomotic leak is a less common event, but risk of fatality is associated with delay in recognition, detection and investigation. Surgical mortality was selected because of the effort known to have been invested in developing systems for monitoring surgical death, both in the UK and internationally. Systems for monitoring surgical wound infection were also included in the review. METHODS - LITERATURE SEARCH: Thirty separate, systematic literature searches of core health and biomedical bibliographic databases (MEDLINE, EMBASE, CINAHL, HealthSTAR and the Cochrane Library) were conducted. The reference lists of retrieved articles were reviewed to locate

  10. Drinking Location and Pregaming as Predictors of Alcohol Intoxication Among Mandated College Students

    PubMed Central

    Miller, Mary Beth; Borsari, Brian; Fernandez, Anne C.; Yurasek, Ali M.; Hustad, John T. P.

    2016-01-01

    Background Both drinking location and pregaming have been associated with heavy alcohol use among college students, yet the manner by which they uniquely contribute to alcohol intoxication remains unclear. Objective The current study examined the unique utility of drinking location and pregaming in predicting alcohol intoxication among college students who violated campus alcohol policy. Method Between 2011 and 2012, mandated college students who reported drinking prior to their referral events (N = 212, 41% female, 80% White, Mage = 19.4 y) completed a computerized assessment of drinking location and related behaviors as part of a larger research trial. Chi-squared statistics, t-tests, one-way analyses of covariance, and regression were used to examine study aims. Results Participants were most likely (44%) to report drinking in off-campus housing prior to the referral event, and approximately half (47%) reported pregaming. Alcohol intoxication on the night of the referral event differed significantly as a function of both drinking location and pregaming, but pregaming did not moderate the association between drinking location and alcohol intoxication among mandated students. Female birth sex, pregaming, and drinking at either fraternities or off-campus housing predicted greater levels of alcohol intoxication on the night of the referral incident, while drinking in a residence hall/dorm predicted lower intoxication. Conclusions/Importance Drinking location and pregaming are distinct predictors of alcohol intoxication among mandated college students. Future interventions may benefit from targeting both where and how college students consume alcohol. PMID:27070480

  11. Drinking Location and Pregaming as Predictors of Alcohol Intoxication Among Mandated College Students.

    PubMed

    Miller, Mary Beth; Borsari, Brian; Fernandez, Anne C; Yurasek, Ali M; Hustad, John T P

    2016-07-02

    Both drinking location and pregaming have been associated with heavy alcohol use among college students, yet the manner by which they uniquely contribute to alcohol intoxication remains unclear. The current study examined the unique utility of drinking location and pregaming in predicting alcohol intoxication among college students who violated campus alcohol policy. Between 2011 and 2012, mandated college students who reported drinking prior to their referral events (N = 212, 41% female, 80% White, Mage = 19.4 y) completed a computerized assessment of drinking location and related behaviors as part of a larger research trial. Chi-squared statistics, t-tests, one-way analyses of covariance, and regression were used to examine study aims. Participants were most likely (44%) to report drinking in off-campus housing prior to the referral event, and approximately half (47%) reported pregaming. Alcohol intoxication on the night of the referral event differed significantly as a function of both drinking location and pregaming, but pregaming did not moderate the association between drinking location and alcohol intoxication among mandated students. Female birth sex, pregaming, and drinking at either fraternities or off-campus housing predicted greater levels of alcohol intoxication on the night of the referral incident, while drinking in a residence hall/dorm predicted lower intoxication. Drinking location and pregaming are distinct predictors of alcohol intoxication among mandated college students. Future interventions may benefit from targeting both where and how college students consume alcohol.

  12. Place field assembly distribution encodes preferred locations

    PubMed Central

    Mamad, Omar; Stumpp, Lars; McNamara, Harold M.; Ramakrishnan, Charu; Deisseroth, Karl; Reilly, Richard B.

    2017-01-01

    The hippocampus is the main locus of episodic memory formation and the neurons there encode the spatial map of the environment. Hippocampal place cells represent location, but their role in the learning of preferential location remains unclear. The hippocampus may encode locations independently from the stimuli and events that are associated with these locations. We have discovered a unique population code for the experience-dependent value of the context. The degree of reward-driven navigation preference highly correlates with the spatial distribution of the place fields recorded in the CA1 region of the hippocampus. We show place field clustering towards rewarded locations. Optogenetic manipulation of the ventral tegmental area demonstrates that the experience-dependent place field assembly distribution is directed by tegmental dopaminergic activity. The ability of the place cells to remap parallels the acquisition of reward context. Our findings present key evidence that the hippocampal neurons are not merely mapping the static environment but also store the concurrent context reward value, enabling episodic memory for past experience to support future adaptive behavior. PMID:28898248

  13. Automatic Seismic-Event Classification with Convolutional Neural Networks.

    NASA Astrophysics Data System (ADS)

    Bueno Rodriguez, A.; Titos Luzón, M.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Active volcanoes exhibit a wide range of seismic signals, providing vast amounts of unlabelled volcano-seismic data that can be analyzed through the lens of artificial intelligence. However, obtaining high-quality labelled data is time-consuming and expensive. Deep neural networks can process data in their raw form, compute high-level features and provide a better representation of the input data distribution. These systems can be deployed to classify seismic data at scale, enhance current early-warning systems and build extensive seismic catalogs. In this research, we aim to classify spectrograms from seven different types of seismic events registered at "Volcán de Fuego" (Colima, Mexico) during four eruptive periods. Our approach is based on convolutional neural networks (CNNs), a sub-type of deep neural networks that can exploit the grid structure of the data. Volcano-seismic signals can be mapped into a grid-like structure using the spectrogram: a representation of the temporal evolution of the signal in terms of time and frequency. Spectrograms were computed from the data using Hamming windows of 4 seconds length, 2.5 seconds overlap and 128-point FFT resolution. Results are compared to deep neural networks, random forests and SVMs. Experiments show that CNNs can exploit temporal and frequency information, attaining a classification accuracy of 93%, similar to deep neural networks (91%) but outperforming SVMs and random forests. These results empirically show that CNNs are powerful models for classifying a wide range of volcano-seismic signals and achieve good generalization. Furthermore, volcano-seismic spectrograms contain useful discriminative information for the CNN, as higher layers of the network combine high-level features computed for each frequency band, helping to detect simultaneous events in time. Being at the intersection of deep learning and geophysics, this research enables future studies of how CNNs can be used in volcano monitoring to accurately determine the detection and
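
    A minimal sketch of the spectrogram computation with the quoted window settings, assuming SciPy is available and treating the sampling rate as an input; since the record does not state the sampling rate, the FFT length is padded to the window length whenever the 4-second window is longer than 128 samples:

```python
import numpy as np
from scipy import signal

def volcano_spectrogram(trace, fs):
    """Spectrogram with the settings quoted in the abstract: 4 s Hamming
    windows, 2.5 s overlap, 128-point FFT resolution (padded to the window
    length if needed, because scipy requires nfft >= nperseg)."""
    nperseg = int(4.0 * fs)
    noverlap = int(2.5 * fs)
    nfft = max(128, nperseg)
    f, t, Sxx = signal.spectrogram(trace, fs=fs, window="hamming",
                                   nperseg=nperseg, noverlap=noverlap,
                                   nfft=nfft)
    return f, t, 10.0 * np.log10(Sxx + 1e-12)   # dB scale, e.g. as CNN input
```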

  14. Discovering anomalous events from urban informatics data

    NASA Astrophysics Data System (ADS)

    Jayarajah, Kasthuri; Subbaraju, Vigneshwaran; Weerakoon, Dulanga; Misra, Archan; Tam, La Thanh; Athaide, Noel

    2017-05-01

    Singapore's "smart city" agenda is driving the government to provide public access to a broader variety of urban informatics sources, such as images from traffic cameras and information about buses servicing different bus stops. Such informatics data serves as probes of evolving conditions at different spatiotemporal scales. This paper explores how such multi-modal informatics data can be used to establish the normal operating conditions at different city locations, and then apply appropriate outlier-based analysis techniques to identify anomalous events at these selected locations. We will introduce the overall architecture of sociophysical analytics, where such infrastructural data sources can be combined with social media analytics to not only detect such anomalous events, but also localize and explain them. Using the annual Formula-1 race as our candidate event, we demonstrate a key difference between the discriminative capabilities of different sensing modes: while social media streams provide discriminative signals during or prior to the occurrence of such an event, urban informatics data can often reveal patterns that have higher persistence, including before and after the event. In particular, we shall demonstrate how combining data from (i) publicly available Tweets, (ii) crowd levels aboard buses, and (iii) traffic cameras can help identify the Formula-1 driven anomalies, across different spatiotemporal boundaries.

  15. Improvements of the offshore earthquake locations in the Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Chen, Ta-Yi; Hsu, Hsin-Chih

    2017-04-01

    Since 2014 the Earthworm Based Earthquake Alarm Reporting (eBEAR) system has been in operation and has been used to issue warnings to schools. In 2015 the system started to provide warnings to the public in Taiwan via television and cell phones. Online performance of the eBEAR system indicates that the average reporting times afforded by the system are approximately 15 and 28 s for inland and offshore earthquakes, respectively. The eBEAR system can on average provide more warning time than the current EEW system (3.2 s and 5.5 s for inland and offshore earthquakes, respectively). However, offshore earthquakes are usually located poorly because only P-wave arrivals are used in the eBEAR system. Additionally, in the early stage of an earthquake only a few stations are available to the early warning system. The poor station coverage may explain why offshore earthquakes are difficult to locate accurately. In Geiger's inversion procedure for earthquake location, an initial hypocenter and origin time must be supplied to the location program. For the initial hypocenter, we defined a set of test locations in the offshore area instead of using the average of the locations of the triggered stations. We ran 20 instances of the location program concurrently, each running Geiger's method with a different pre-defined initial position. We assume that if a program's pre-defined initial position is close to the true earthquake location, its iterations will require less processing time than the others. The results show that using pre-defined trial hypocenters in the inversion procedure improves the location accuracy of offshore earthquakes. This is especially relevant for EEW systems: in the initial stage of an event, using only 3 or 5 stations to locate earthquakes may lead to poor results because of poor station coverage. In this study, the pre-defined trial locations provide a feasible way to improve the estimations of
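
    As background for the discussion of trial hypocenters, a minimal sketch of one flavour of Geiger's linearised inversion, assuming a homogeneous velocity model and hypothetical variable names (the eBEAR implementation will differ):

```python
import numpy as np

def geiger_locate(stations, t_obs, x0, v=5.5, n_iter=10):
    """Geiger-style iteration: starting from a trial hypocentre and origin
    time x0 = (x, y, z, t0), repeatedly solve the linearised least-squares
    system relating arrival-time residuals to hypocentre/origin-time updates.
    stations: (N, 3) km; t_obs: (N,) s; v: km/s (homogeneous half-space)."""
    x, y, z, t0 = map(float, x0)
    for _ in range(n_iter):
        dx = np.column_stack([stations[:, 0] - x,
                              stations[:, 1] - y,
                              stations[:, 2] - z])
        dist = np.linalg.norm(dx, axis=1)
        r = t_obs - (t0 + dist / v)                      # travel-time residuals
        # Partial derivatives of predicted arrival time w.r.t. (t0, x, y, z)
        G = np.column_stack([np.ones_like(dist), -dx / (v * dist[:, None])])
        m, *_ = np.linalg.lstsq(G, r, rcond=None)        # model update
        t0, x, y, z = t0 + m[0], x + m[1], y + m[2], z + m[3]
    return np.array([x, y, z, t0]), float(np.sqrt(np.mean(r ** 2)))
```

    Running several such inversions in parallel from different pre-defined starting points, and preferring the one that converges fastest, is the strategy the abstract describes.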

  16. Single Event Effects (SEE) Testing of Embedded DSP Cores within Microsemi RTAX4000D Field Programmable Gate Array (FPGA) Devices

    NASA Technical Reports Server (NTRS)

    Perez, Christopher E.; Berg, Melanie D.; Friendlich, Mark R.

    2011-01-01

    Motivation for this work is: (1) Accurately characterize digital signal processor (DSP) core single-event effect (SEE) behavior (2) Test DSP cores across a large frequency range and across various input conditions (3) Isolate SEE analysis to DSP cores alone (4) Interpret SEE analysis in terms of single-event upsets (SEUs) and single-event transients (SETs) (5) Provide flight missions with accurate estimate of DSP core error rates and error signatures.

  17. Envisioning the times of future events: The role of personal goals.

    PubMed

    Ben Malek, Hédi; Berna, Fabrice; D'Argembeau, Arnaud

    2018-05-25

    Episodic future thinking refers to the human capacity to imagine or simulate events that might occur in one's personal future. Previous studies have shown that personal goals guide the construction and organization of episodic future thoughts, and here we sought to investigate the role of personal goals in the process of locating imagined events in time. Using a think-aloud protocol, we found that dates were directly accessed more frequently for goal-related than goal-unrelated future events, and the goal-relevance of events was a significant predictor of direct access to temporal information on a trial-by-trial basis. Furthermore, when an event was not directly dated, references to anticipated lifetime periods were more frequently used as a strategy to determine when a goal-related event might occur. Together, these findings shed new light on the mechanisms by which personal goals contribute to the location of imagined events in future times. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. A cyber-event correlation framework and metrics

    NASA Astrophysics Data System (ADS)

    Kang, Myong H.; Mayfield, Terry

    2003-08-01

    In this paper, we propose a cyber-event fusion, correlation, and situation assessment framework that, when instantiated, will allow cyber defenders to better understand the local, regional, and global cyber-situation. This framework, with associated metrics, can be used to guide assessment of our existing cyber-defense capabilities, and to help evaluate the state of cyber-event correlation research and where we must focus our future cyber-event correlation research. The framework, based on the cyber-event gathering activities and analysis functions, consists of five operational steps, each of which provides a richer set of contextual information to support greater situational understanding. The first three steps are categorically depicted as increasingly richer and broader-scoped contexts achieved through correlation activity, while in the final two steps, these richer contexts are achieved through analytical activities (situation assessment, and threat analysis & prediction). Category 1 Correlation focuses on the detection of suspicious activities and the correlation of events from a single cyber-event source. Category 2 Correlation clusters the same or similar events from multiple detectors that are located at close proximity and prioritizes them. Finally, the events from different time periods and event sources at different location/regions are correlated at Category 3 to recognize the relationship among different events. This is the category that focuses on the detection of large-scale and coordinated attacks. The situation assessment step (Category 4) focuses on the assessment of cyber asset damage and the analysis of the impact on missions. The threat analysis and prediction step (Category 5) analyzes attacks based on attack traces and predicts the next steps. Metrics that can distinguish correlation and cyber-situation assessment tools for each category are also proposed.

  19. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of the stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for a significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
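
    As an illustration of the stacked, multichannel cross-correlation step described above, a brute-force sketch with hypothetical array shapes (production matched-filter codes typically use FFT-based correlation for speed):

```python
import numpy as np

def stacked_correlation_detect(continuous, template, threshold=0.6):
    """Matched-filter style detection: normalised cross-correlation of a
    multichannel parent (template) event against continuous data, stacked
    over channels; offsets whose stacked correlation exceeds `threshold`
    are candidate child events.
    continuous: (n_ch, n_samp) array; template: (n_ch, n_t) array."""
    n_ch, n_t = template.shape
    n_lag = continuous.shape[1] - n_t + 1
    stack = np.zeros(n_lag)
    for ch in range(n_ch):
        t = template[ch] - template[ch].mean()
        t /= (np.linalg.norm(t) + 1e-12)
        for i in range(n_lag):
            w = continuous[ch, i:i + n_t] - continuous[ch, i:i + n_t].mean()
            stack[i] += np.dot(t, w) / (np.linalg.norm(w) + 1e-12)
    stack /= n_ch                       # mean correlation across channels
    return np.flatnonzero(stack >= threshold), stack
```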

  20. Identification of source locations for atmospheric dry deposition of heavy metals during yellow-sand events in Seoul, Korea in 1998 using hybrid receptor models

    NASA Astrophysics Data System (ADS)

    Han, Young-Ji; Holsen, Thomas M.; Hopke, Philip K.; Cheong, Jang-Pyo; Kim, Ho; Yi, Seung-Muk

    2004-10-01

    Elemental dry deposition fluxes were measured using dry deposition plates from March to June 1998 in Seoul, Korea. During this spring sampling period, several yellow-sand events characterized by long-range transport from China and Mongolia impacted the area. Understanding the impact of yellow-sand events on atmospheric dry deposition is critical to managing heavy metal levels in the environment in Korea. In this study, the measured fluxes of a primarily crustal metal (Al) and an anthropogenic metal (Pb) were used with two hybrid receptor models, the potential source contribution function (PSCF) and residence time weighted concentration (RTWC), to locate sources of heavy metals associated with atmospheric dry deposition fluxes during the yellow-sand events in Seoul, Korea. The PSCF results, using a criterion value of the 75th percentile of the measured dry deposition fluxes, and the RTWC results, using the measured elemental dry deposition fluxes, agreed well and consistently showed that there were large potential source areas in the Gobi Desert in China and Mongolia and industrial areas near Tianjin, Tangshan, and Shenyang in China. The major industrial areas of Shenyang, Fushun, and Anshan, the Central China loess plateau, the Gobi Desert, and the Alashan semi-desert in China were identified as major source areas for the measured Pb flux in Seoul, Korea. For Al, the main industrial areas of Tangshan, Tianjin and Beijing, the Gobi Desert, the Alashan semi-desert, and the Central China loess plateau were found to be the major source areas. These results indicate that both anthropogenic sources such as industrial areas and natural sources such as deserts contribute to the high dry deposition fluxes of both Pb and Al in Seoul, Korea during yellow-sand events. RTWC resolved several high potential source areas. Modeling results indicated that the long-range transport of Al and Pb from China during yellow-sand events as well as non-yellow-sand spring daytimes increased atmospheric dry
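
    A minimal sketch of the PSCF computation as described, using the 75th-percentile criterion; the input structures are hypothetical, and the RTWC weighting and the back-trajectory model themselves are not reproduced here:

```python
import numpy as np

def pscf(trajectory_cells, sample_flux, criterion=None, min_endpoints=10):
    """Potential Source Contribution Function: PSCF_ij = m_ij / n_ij, where
    n_ij counts back-trajectory endpoints falling in grid cell (i, j) and
    m_ij counts only endpoints belonging to samples whose deposition flux
    exceeds the criterion (defaulting to the 75th percentile).

    trajectory_cells: one list of (i, j) endpoint cells per sample
    sample_flux:      measured flux for each sample
    """
    flux = np.asarray(sample_flux, dtype=float)
    crit = np.percentile(flux, 75) if criterion is None else criterion
    n, m = {}, {}
    for cells, f in zip(trajectory_cells, flux):
        for ij in cells:
            n[ij] = n.get(ij, 0) + 1
            if f > crit:
                m[ij] = m.get(ij, 0) + 1
    # Cells visited by too few endpoints are statistically unstable; drop them.
    return {ij: m.get(ij, 0) / n_ij for ij, n_ij in n.items() if n_ij >= min_endpoints}
```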

  1. 10 CFR 950.21 - Notification of covered event.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... event; (2) The duration of the delay in the schedule for construction, testing and full power operation... and full power operation, including the dates of system level construction or testing that had been... information is accurate and complete to the sponsor's knowledge and belief. ...

  2. The Collaborative Heliophysics Events Knowledgebase

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Schuler, D.; Cheung, C.

    2010-12-01

    The Collaborative Heliophysics Events Knowledgebase (CHEK) leverages and integrates the existing resources developed by HEK for SDO (Hurlburt et al. 2010) to provide a collaborative framework for heliophysics researchers. This framework will enable an environment where researchers can not only identify and locate relevant data, but can also deploy a social network for sharing and expanding knowledge about heliophysical events. CHEK will expand the HEK and key HEK clients into the heliosphere and geospace, and create a heliophysics social network. We describe the design and goals of the CHEK project and discuss its relation to Citizen Science in the heliosphere. Hurlburt, N., et al. 2010, "A Heliophysics Event Knowledgebase for Solar Dynamics Observatory," Sol. Phys., in press.

  3. Locating and decoding barcodes in fuzzy images captured by smart phones

    NASA Astrophysics Data System (ADS)

    Deng, Wupeng; Hu, Jiwei; Liu, Quan; Lou, Ping

    2017-07-01

    With the development of barcodes for commercial use, people's requirements for detecting barcodes with smart phones have become increasingly pressing. The low quality of barcode images captured by mobile phones often degrades decoding and recognition rates. This paper focuses on locating and decoding EAN-13 barcodes in fuzzy images. We present a more accurate locating algorithm based on segment length and a highly fault-tolerant algorithm for decoding barcodes. Unlike existing approaches, the location algorithm is based on the edge segment length of EAN-13 barcodes, while our decoding algorithm allows the appearance of fuzzy regions in the barcode image. Experiments were performed on damaged, contaminated and scratched digital images and provide quite promising results for EAN-13 barcode location and decoding.
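
    Independent of the locating and decoding algorithms proposed in the paper, decoded EAN-13 digit strings are normally validated with the standard check-digit rule; a minimal sketch:

```python
def ean13_check_digit_ok(digits):
    """Verify the EAN-13 check digit: weight the first 12 digits alternately
    by 1 and 3 (left to right); the 13th digit must bring the weighted sum
    to a multiple of 10. A useful sanity check after decoding."""
    if len(digits) != 13 or any(d < 0 or d > 9 for d in digits):
        return False
    weighted = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits[:12]))
    return (weighted + digits[12]) % 10 == 0

# Example with a valid EAN-13 number
print(ean13_check_digit_ok([4, 0, 0, 6, 3, 8, 1, 3, 3, 3, 9, 3, 1]))  # True
```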

  4. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It is located in the Intertropical Convergence Zone (ITCZ) and has a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation is observed on about 70% of the days in a year. This rain, favored by the formation of large masses of clouds and by macroclimatic phenomena such as the El Niño Southern Oscillation, has historically caused great impacts in the region (Vélez et al, 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data for up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in an urban area of Manizales and to investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. Specifically, it helped to look for the influence of different meteorological variables triggering rainfall events in hazardous areas such as the city of Manizales.

  5. Testing the global capabilities of the Antelope software suite: fast location and Mb determination of teleseismic events using the ASAIN and GSN seismic networks

    NASA Astrophysics Data System (ADS)

    Pesaresi, D.; Russi, M.; Plasencia, M.; Cravos, C.

    2009-04-01

    The Italian National Institute for Oceanography and Experimental Geophysics (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, OGS) is running the Antarctic Seismographic Argentinean Italian Network (ASAIN), made of 5 seismic stations located in the Scotia Sea region in Antarctica and in Argentina: data from these stations are transferred in real time to the OGS headquarters in Trieste (Italy) via satellite links. OGS is also running, in close cooperation with the Friuli-Venezia Giulia Civil Defense, the North East (NI) Italy seismic network, making use of the Antelope commercial software suite from BRTT as the main acquisition system. As a test to check the global capabilities of Antelope, we set up an instance of Antelope acquiring data in real time from both the regional ASAIN seismic network in Antarctica and a subset of the Global Seismic Network (GSN) funded by the Incorporated Research Institutions for Seismology (IRIS). The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for real-time access to the waveforms required in this study. Preliminary results over a 1-month period indicated that about 82% of the earthquakes with magnitude M>5.0 listed in the PDE catalogue of the National Earthquake Information Center (NEIC) of the United States Geological Survey (USGS) were also correctly detected by Antelope, with an average location error of 0.05 degrees and an average body-wave magnitude Mb estimation error below 0.1. The average time difference between event origin time and the actual time of event determination by Antelope was about 45 minutes: the comparison with 20 minutes, the IASPEI91 P-wave travel time for 180 degrees distance, and 25 minutes, the estimate of our test system's data latency, indicates that Antelope is a serious candidate for regional and global early warning systems. Updated figures calculated over a longer period of time will be presented and discussed.
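
    The abstract reports a detection rate and an average location error relative to the PDE catalogue. As a rough illustration of that bookkeeping (not the OGS workflow; the event structures, the matching threshold and the planar distance approximation are assumptions), automatic solutions can be matched to reference events by origin time and the epicentral offsets averaged:

    ```python
    import math

    def compare_catalogs(test_events, reference_events, max_dt=60.0):
        """Pair each reference event with the closest test event in origin time
        (within max_dt seconds); return detection rate and mean epicentral
        difference in degrees. Events are dicts: {'time': s, 'lat': deg, 'lon': deg}."""
        errors = []
        for ref in reference_events:
            near = [e for e in test_events if abs(e['time'] - ref['time']) <= max_dt]
            if not near:
                continue
            best = min(near, key=lambda e: abs(e['time'] - ref['time']))
            # small-offset planar approximation of the epicentral distance
            errors.append(math.hypot(best['lat'] - ref['lat'],
                                     (best['lon'] - ref['lon']) * math.cos(math.radians(ref['lat']))))
        rate = len(errors) / len(reference_events) if reference_events else 0.0
        mean_err = sum(errors) / len(errors) if errors else float('nan')
        return rate, mean_err
    ```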

  6. Corrigendum: Earthquakes triggered by silent slip events on Kīlauea volcano, Hawaii

    USGS Publications Warehouse

    Segall, Paul; Desmarais, Emily K.; Shelly, David; Miklius, Asta; Cervelli, Peter

    2006-01-01

    There was a plotting error in Fig. 1 that inadvertently displays earthquakes for the incorrect time interval. The locations of earthquakes during the two-day-long slow-slip event of January 2005 are shown here in the corrected Fig. 1. Because the incorrect locations were also used in the Coulomb stress-change (CSC) calculation, the error could potentially have biased our interpretation of the depth of the slow-slip event, although in fact it did not. Because nearly all of the earthquakes, both background and triggered, are landward of the slow-slip event and at similar depths (6.5–8.5 km), the impact on the CSC calculations is negligible (Fig. 2; compare with Fig. 4 in original paper). The error does not alter our conclusion that the triggered events during the January 2005 slow-slip event were located on a subhorizontal plane at a depth of 7.5 ± 1 km. This is therefore the most likely depth of the slow-slip events. We thank Cecily J. Wolfe for pointing out the error in the original Fig. 1.

  7. Swift Gamma-Ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2004-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow-field instruments capable of multi-wavelength (UV, Optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating and brief-duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements.

  8. Swift Gamma-ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2005-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow-field instruments capable of multi-wavelength (UV, Optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating and brief-duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements.

  9. The Earth Observatory Natural Event Tracker (EONET): An API for Matching Natural Events to GIBS Imagery

    NASA Astrophysics Data System (ADS)

    Ward, K.

    2015-12-01

    Hidden within the terabytes of imagery in NASA's Global Imagery Browse Services (GIBS) collection are hundreds of daily natural events. Some events are newsworthy, devastating, and visibly obvious at a global scale; others are merely regional curiosities. Regardless of the scope and significance of any one event, it is likely that multiple GIBS layers can be viewed to provide a multispectral, dataset-based view of the event. To facilitate linking between the discrete event and the representative dataset imagery, NASA's Earth Observatory Group has developed a prototype application programming interface (API): the Earth Observatory Natural Event Tracker (EONET). EONET supports an API model that allows users to retrieve event-specific metadata--date/time, location, and type (wildfire, storm, etc.)--and web service layer-specific metadata which can be used to link to event-relevant dataset imagery in GIBS. GIBS' ability to ingest many near-real-time datasets, combined with its growing archive of past imagery, means that API users will be able to develop client applications that not only show ongoing events but can also look at imagery from before and after. In our poster, we will present the API and show examples of its use.
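
    A hedged sketch of how a client might query such an API. The URL below is the current public EONET v3 endpoint, which may differ from the 2015 prototype described in the abstract; the field names ("events", "title", "geometry", "date", "coordinates") reflect that public service and are assumptions with respect to the prototype:

    ```python
    import json
    import urllib.request

    # Assumed endpoint for illustration; the prototype API may have differed.
    EONET_URL = "https://eonet.gsfc.nasa.gov/api/v3/events?category=wildfires&limit=5"

    with urllib.request.urlopen(EONET_URL) as resp:
        events = json.load(resp)["events"]

    for ev in events:
        # Each event carries a title and one or more dated geometries (point or
        # polygon) that can be used to request matching GIBS imagery layers.
        geom = ev["geometry"][0]
        print(ev["title"], geom["date"], geom["coordinates"])
    ```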

  10. "What" and "where" was when? Memory for the temporal order of episodic events in children.

    PubMed

    Scarf, Damian; Boden, Hannah; Labuschagne, Lisa G; Gross, Julien; Hayne, Harlene

    2017-12-01

    In the past, researchers have shown that the individual components of episodic memory (i.e., "what," "where," and "when") may emerge at different points in development. Specifically, while children as young as three can accurately report the "what" and "where" of an event, they struggle to accurately report when the event occurred. One explanation for children's difficulty in reporting when an event took place is a rudimentary understanding of, and ability to use, temporal terms. In the current experiment, we employed a physical timeline to aid children's reporting of the order in which a series of episodic events occurred. Overall, while 4-, 5-, and 6-year-olds performed above chance, 3-year-olds did not. Our findings suggest that 3-year-olds' limited ability to produce temporal terms may not be the rate-limiting step preventing them from identifying when events occurred in their recent past. © 2017 Wiley Periodicals, Inc.

  11. Automated location detection of injection site for preclinical stereotactic neurosurgery procedure

    NASA Astrophysics Data System (ADS)

    Abbaszadeh, Shiva; Wu, Hemmings C. H.

    2017-03-01

    Currently, during stereotactic neurosurgery procedures, the manual task of locating the proper area for needle insertion or implantation of an electrode/cannula/optic fiber can be time consuming. The requirement of the task is to quickly and accurately find the location for insertion. In this study we investigate an automated method to locate the entry point of the region of interest. This method leverages a digital image capture system, pattern recognition, and motorized stages. Template matching of known, anatomically identifiable regions is used to find regions of interest (e.g. Bregma) in rodents. For our initial study, we tackle the problem of automatically detecting the entry point.
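
    The abstract names template matching as the pattern-recognition step. A minimal sketch using OpenCV's normalized cross-correlation (file names, the acceptance threshold, and the camera setup are illustrative assumptions, not the authors' implementation):

    ```python
    import cv2

    # Hypothetical file names; in practice the frame comes from the digital
    # image-capture system mounted over the stereotactic stage.
    frame = cv2.imread("skull_view.png", cv2.IMREAD_GRAYSCALE)
    template = cv2.imread("bregma_template.png", cv2.IMREAD_GRAYSCALE)

    # Normalized cross-correlation is fairly robust to uniform lighting changes.
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)

    if max_val > 0.7:  # assumed acceptance threshold
        h, w = template.shape
        center = (max_loc[0] + w // 2, max_loc[1] + h // 2)
        print("Candidate Bregma location (pixels):", center, "score:", round(max_val, 3))
    else:
        print("No confident match; adjust stage position or template.")
    ```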

  12. A convenient and accurate parallel Input/Output USB device for E-Prime.

    PubMed

    Canto, Rosario; Bufalari, Ilaria; D'Ausilio, Alessandro

    2011-03-01

    Psychological and neurophysiological experiments require the accurate control of timing and synchrony for Input/Output signals. For instance, a typical Event-Related Potential (ERP) study requires an extremely accurate synchronization of stimulus delivery with recordings. This is typically done via computer software such as E-Prime, and fast communications are typically assured by the Parallel Port (PP). However, the PP is an old and disappearing technology that, for example, is no longer available on portable computers. Here we propose a convenient USB device enabling parallel I/O capabilities. We tested this device against the PP on both a desktop and a laptop machine in different stress tests. Our data demonstrate the accuracy of our system, which suggests that it may be a good substitute for the PP with E-Prime.

  13. Walking through doorways causes forgetting: Event structure or updating disruption?

    PubMed

    Pettijohn, Kyle A; Radvansky, Gabriel A

    2016-11-01

    According to event cognition theory, people segment experience into separate event models. One consequence of this segmentation is that when people transport objects from one location to another, memory is worse than if people move across a large location. In two experiments participants navigated through a virtual environment, and recognition memory was tested in either the presence or the absence of a location shift for objects that were recently interacted with (i.e., just picked up or set down). Of particular concern here is whether this location updating effect is due to (a) differences in retention intervals as a result of the navigation process, (b) a temporary disruption in cognitive processing that may occur as a result of the updating processes, or (c) a need to manage multiple event models, as has been suggested in prior research. Experiment 1 explored whether retention interval is driving this effect by recording travel times from the acquisition of an object and the probe time. The results revealed that travel times were similar, thereby rejecting a retention interval explanation. Experiment 2 explored whether a temporary disruption in processing is producing the effect by introducing a 3-second delay prior to the presentation of a memory probe. The pattern of results was not affected by adding a delay, thereby rejecting a temporary disruption account. These results are interpreted in the context of the event horizon model, which suggests that when there are multiple event models that contain common elements there is interference at retrieval, which compromises performance.

  14. Event-driven charge-coupled device design and applications therefor

    NASA Technical Reports Server (NTRS)

    Doty, John P. (Inventor); Ricker, Jr., George R. (Inventor); Burke, Barry E. (Inventor); Prigozhin, Gregory Y. (Inventor)

    2005-01-01

    An event-driven X-ray CCD imager device uses a floating-gate amplifier or other non-destructive readout device to non-destructively sense a charge level in a charge packet associated with a pixel. The output of the floating-gate amplifier is used to identify each pixel that has a charge level above a predetermined threshold. If the charge level is above that threshold, the charge in the triggering charge packet and in the charge packets from neighboring pixels need to be measured accurately. A charge delay register is included in the event-driven X-ray CCD imager device to enable recovery of the charge packets from neighboring pixels for accurate measurement. When a charge packet reaches the end of the charge delay register, control logic either dumps the charge packet, or steers the charge packet to a charge FIFO to preserve it if the charge packet is determined to be a packet that needs accurate measurement. A floating-diffusion amplifier or other low-noise output stage device, which converts charge level to a voltage level with high precision, provides final measurement of the charge packets. The voltage level is eventually digitized by a high-linearity ADC.

  15. Selection of monitoring locations for storm water quality assessment.

    PubMed

    Langeveld, J G; Boogaard, F; Liefting, H J; Schilperoort, R P S; Hof, A; Nijhof, H; de Ridder, A C; Kuiper, M W

    2014-01-01

    Storm water runoff is a major contributor to the pollution of receiving waters. Storm water characteristics may vary significantly between locations and events. Hence, for each given location, this necessitates a well-designed monitoring campaign prior to selection of an appropriate storm water management strategy. The challenge for the design of a monitoring campaign with a given budget is to balance detailed monitoring at a limited number of locations versus less detailed monitoring at a large number of locations. This paper proposes a methodology for the selection of monitoring locations for storm water quality monitoring, based on (pre-)screening, a quick scan monitoring campaign, and final selection of locations and design of the monitoring setup. The main advantage of the method is the ability to prevent the selection of monitoring locations that turn out to be inappropriate. In addition, in this study, the quick scan resulted in a first useful dataset on storm water quality and a strong indication of illicit connections at one of the monitoring locations.

  16. An Evaluation of Three Interdisciplinary Social Science Events outside of the College Classroom

    ERIC Educational Resources Information Center

    Knapp, Sarah; Merges, Renee

    2017-01-01

    This article describes three interdisciplinary events held outside of the classroom to examine social psychological concepts in the criminal justice system, with undergraduate students enrolled in criminal justice and psychology courses. These events can most accurately be described as using a synthetic interdisciplinary approach, in which the…

  17. An Impact-Location Estimation Algorithm for Subsonic Uninhabited Aircraft

    NASA Technical Reports Server (NTRS)

    Bauer, Jeffrey E.; Teets, Edward

    1997-01-01

    An impact-location estimation algorithm is being used at the NASA Dryden Flight Research Center to support range safety for uninhabited aerial vehicle flight tests. The algorithm computes an impact location based on the descent rate, mass, and altitude of the vehicle and current wind information. The predicted impact location is continuously displayed on the range safety officer's moving map display so that the flightpath of the vehicle can be routed to avoid ground assets if the flight must be terminated. The algorithm easily adapts to different vehicle termination techniques and has been shown to be accurate to the extent required to support range safety for subsonic uninhabited aerial vehicles. This paper describes how the algorithm functions, how the algorithm is used at NASA Dryden, and how various termination techniques are handled by the algorithm. Other approaches to predicting the impact location and the reasons why they were not selected for real-time implementation are also discussed.
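
    The paper describes, rather than lists, the computation; below is a minimal sketch of the idea as stated (fall time from altitude and descent rate, horizontal drift from the current wind). All parameter names are assumptions, and the constant-wind, constant-descent-rate model is far simpler than the operational algorithm:

    ```python
    import math

    def predict_impact(lat, lon, altitude_m, descent_rate_ms, wind_east_ms, wind_north_ms):
        """Rough impact-point estimate: time to ground from altitude and descent
        rate, horizontal drift from the current wind vector. Assumes constant
        wind and descent rate over the whole descent."""
        if descent_rate_ms <= 0:
            raise ValueError("descent rate must be positive")
        t_fall = altitude_m / descent_rate_ms        # seconds to ground
        drift_east = wind_east_ms * t_fall           # metres
        drift_north = wind_north_ms * t_fall
        # convert metres to degrees (small-offset approximation)
        dlat = drift_north / 111_320.0
        dlon = drift_east / (111_320.0 * math.cos(math.radians(lat)))
        return lat + dlat, lon + dlon

    print(predict_impact(34.9, -117.9, altitude_m=3000, descent_rate_ms=8,
                         wind_east_ms=5, wind_north_ms=-2))
    ```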

  18. Locating articular cartilage in MR images

    NASA Astrophysics Data System (ADS)

    Folkesson, Jenny; Dam, Erik; Pettersen, Paola; Olsen, Ole F.; Nielsen, Mads; Christiansen, Claus

    2005-04-01

    Accurate computation of the thickness of the articular cartilage is of great importance when diagnosing and monitoring the progress of joint diseases such as osteoarthritis. A fully automated cartilage assessment method is preferable to methods using manual interaction in order to avoid inter- and intra-observer variability. As a first step in the cartilage assessment, we present an automatic method for locating articular cartilage in knee MRI using supervised learning. The next step will be to fit a variable shape model to the cartilage, initiated at the location found using the method presented in this paper. From the model, disease markers will be extracted for the quantitative evaluation of the cartilage. The cartilage is located using an ANN classifier, where every voxel is classified as cartilage or non-cartilage based on prior knowledge of the cartilage structure. The classifier is tested using leave-one-out evaluation, and we found the average sensitivity and specificity to be 91.0% and 99.4%, respectively. The centers of mass calculated from voxels classified as cartilage are similar to the corresponding values calculated from manual segmentations, which confirms that this method can find a good initial position for a shape model.

  19. Event detection in an assisted living environment.

    PubMed

    Stroiescu, Florin; Daly, Kieran; Kuris, Benjamin

    2011-01-01

    This paper presents the design of a wireless event detection and in-building location awareness system. The system architecture is based on using a body-worn sensor to detect events such as falls where they occur in an assisted living environment. This process involves developing event detection algorithms and transmitting such events wirelessly to an in-house network based on the 802.15.4 protocol. The network would then generate alerts both in the assisted living facility and remotely to an offsite monitoring facility. The focus of this paper is on the design of the system architecture and the compliance challenges in applying this technology.
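
    The event-detection algorithms are not specified in the abstract. A deliberately simple sketch of the kind of threshold heuristic a body-worn accelerometer node might run for fall detection (thresholds, window length and units are illustrative assumptions, not the authors' design):

    ```python
    import math

    def detect_fall(samples, g=9.81, impact_g=2.5, still_g=0.3, still_window=25):
        """Flag a fall when a high-magnitude impact is followed by a
        near-stationary period (magnitude close to 1 g). samples is a stream of
        (ax, ay, az) accelerations in m/s^2; returns the impact sample index or None."""
        magnitudes = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
        for i, m in enumerate(magnitudes):
            if m > impact_g * g:
                after = magnitudes[i + 1:i + 1 + still_window]
                if after and all(abs(a - g) < still_g * g for a in after):
                    return i
        return None
    ```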

  20. Intentional forgetting diminishes memory for continuous events.

    PubMed

    Fawcett, Jonathan M; Taylor, Tracy L; Nadel, Lynn

    2013-01-01

    In a novel event-method directed forgetting task, instructions to Remember (R) or Forget (F) were integrated throughout the presentation of four videos depicting common events (e.g., baking cookies). Participants responded more accurately to cued recall questions (E1) and true/false statements (E2-4) regarding R segments than F segments. This was true even when they were forced to attend to F segments by virtue of having to perform concurrent discrimination (E2) or conceptual segmentation (E3) tasks. The final experiment (E5) demonstrated a larger R > F difference for specific true/false statements (the woman added three cups of flour) than for general true/false statements (the woman added flour), suggesting that participants likely encoded and retained at least a general representation of the events they had intended to forget, even though this representation was not as specific as the representation of events they had intended to remember.

  1. Development and Assessment of Memorial Sloan Kettering Cancer Center’s Surgical Secondary Events Grading System

    PubMed Central

    Strong, Vivian E.; Selby, Luke V.; Sovel, Mindy; Disa, Joseph J.; Hoskins, William; DeMatteo, Ronald; Scardino, Peter; Jaques, David P.

    2015-01-01

    Background: Studying surgical secondary events is an evolving effort with no current established system for database design, standard reporting, or definitions. Using the Clavien-Dindo classification as a guide, in 2001 we developed a Surgical Secondary Events database based on grade of event and required intervention to begin prospectively recording and analyzing all surgical secondary events (SSE). Study Design: Events are prospectively entered into the database by attending surgeons, house staff, and research staff. In 2008 we performed a blinded external audit of 1,498 operations that were randomly selected to examine the quality and reliability of the data. Results: 1,498 of 4,284 operations during the 3rd quarter of 2008 were audited. 79% (N=1,180) of the operations did not have a secondary event while 21% (N=318) of operations had an identified event. 91% (1,365) of operations were correctly entered into the SSE database. 97% (129/133) of missed secondary events were Grades I and II. Three Grade III (2%) and one Grade IV (1%) secondary events were missed. There were no missed Grade V secondary events. Conclusion: Grade III-IV events are more accurately collected than Grade I-II events. Robust and accurate secondary events data can be collected by clinicians and research staff, and these data can safely be used for quality improvement projects and research. PMID:25319579

  2. Accurate Land Company, Inc., Acadia Subdivision, Plat 1 and Plat 2 - Clean Water Act Public Notice

    EPA Pesticide Factsheets

    The EPA is providing notice of an Administrative Penalty Assessment in the form of an Expedited Storm Water Settlement Agreement against Accurate Land Company, Inc., a business located at 12035 University Ave., Suite 100, Clive, IA 50235, for alleged viola

  3. Absolute locations of the North Korean nuclear tests based on differential seismic travel times and InSAR

    NASA Astrophysics Data System (ADS)

    Myers, S. C.; Ford, S. R.; Mellors, R. J.; Ichinose, G.

    2017-12-01

    We use constraints on the location of the January 6, 2016 DPRK announced nuclear test (2016_01) and differential travel times for Pn, Pg, and teleseismic P-waves to estimate the absolute locations of the 6 announced DPRK nuclear tests, as well as other nearby events. Absolute location constraints are based on the fit of commercial InSAR-derived ground displacement and predictions of elastic displacement from an isotropic source including topographic effects. Results show that the announced tests in January and September of 2016 are under the crest of highest local topography (Mt. Mantap), while the 2009 and 2013 events are south of the topographic crest at a similar contour in local topography. The first announced test in 2006 was located near the crest of a separate topographic high approximately 2.75 km east of the 2016_01 test. The September 3, 2017 event is approximately between the two 2016 tests, under the crest of the mountain ridge. Constraints from seismic data put the events within 1 km of the surface and depths may be inferred, with caution, by differencing the elevation of tunnel entrances and the topographic surface and accounting for the rise in a tunnel elevation from the entrance to facilitate drainage. Depths for the 2006_10, 2009_05, 2013_02, 2016_01, 2016_09, and 2017_09 tests are estimated to be 500 m, 530 m, 530 m, 740 m, 750 m, and 750 m, respectively. Other nearby events are considerably lower in magnitude, resulting in location estimates that are not as well constrained as the announced nuclear tests. Analysis of all events provides a bulletin of events that may occur in the future. Prepared by LLNL under Contract DE-AC52-07NA27344.

  4. AIC-based diffraction stacking for local earthquake locations at the Sumatran Fault (Indonesia)

    NASA Astrophysics Data System (ADS)

    Hendriyana, Andri; Bauer, Klaus; Muksin, Umar; Weber, Michael

    2018-05-01

    We present a new workflow for the localization of seismic events which is based on a diffraction stacking approach. In order to address the effects of complex source radiation patterns, we suggest computing the diffraction stack from a characteristic function (CF) instead of stacking the original waveform data. A new CF, hereafter called mAIC (modified from the Akaike Information Criterion), is proposed. We demonstrate that both P- and S-wave onsets can be detected accurately. To avoid cross-talk between P and S waves due to inaccurate velocity models, we separate the P and S waves from the mAIC function by making use of polarization attributes. Then, the final image function is represented by the largest eigenvalue resulting from the covariance analysis between the P- and S-image functions. Results from synthetic experiments show that the proposed diffraction stacking provides reliable locations. The workflow of the diffraction stacking method was finally applied to local earthquake data from Sumatra, Indonesia. Recordings from a temporary network of 42 stations deployed for nine months around the Tarutung pull-apart basin were analysed. The seismic event locations resulting from the diffraction stacking method align along a segment of the Sumatran Fault. A more complex distribution of seismicity is imaged within and around the Tarutung basin. Two lineaments striking N-S were found in the centre of the Tarutung basin which support independent results from structural geology.
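
    The mAIC characteristic function is a modification that the abstract does not spell out; the sketch below implements only the standard Maeda-type AIC onset picker it builds on, where the minimum of AIC(k) separates low-variance noise from the higher-variance signal that follows the arrival:

    ```python
    import numpy as np

    def aic_onset(trace):
        """Standard AIC onset picker: for each sample k compare the variance of
        the trace before and after k; the minimum of AIC(k) marks the most
        likely onset. (The mAIC of the abstract modifies this CF; that
        modification is not reproduced here.)"""
        x = np.asarray(trace, dtype=float)
        n = len(x)
        aic = np.full(n, np.inf)
        for k in range(2, n - 2):
            v1, v2 = np.var(x[:k]), np.var(x[k:])
            if v1 > 0 and v2 > 0:
                aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
        return int(np.argmin(aic))

    # toy example: low-amplitude noise followed by a stronger arrival
    rng = np.random.default_rng(0)
    trace = np.concatenate([0.1 * rng.standard_normal(200), rng.standard_normal(100)])
    print("picked onset sample:", aic_onset(trace))  # close to 200
    ```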

  5. Achieving perceptually-accurate aural telepresence

    NASA Astrophysics Data System (ADS)

    Henderson, Paul D.

    Immersive multimedia requires not only realistic visual imagery but also a perceptually-accurate aural experience. A sound field may be presented simultaneously to a listener via a loudspeaker rendering system using the direct sound from acoustic sources as well as a simulation or "auralization" of room acoustics. Beginning with classical Wave-Field Synthesis (WFS), improvements are made to correct for asymmetries in loudspeaker array geometry. Presented is a new Spatially-Equalized WFS (SE-WFS) technique to maintain the energy-time balance of a simulated room by equalizing the reproduced spectrum at the listener for a distribution of possible source angles. Each reproduced source or reflection is filtered according to its incidence angle to the listener. An SE-WFS loudspeaker array of arbitrary geometry reproduces the sound field of a room with correct spectral and temporal balance, compared with classically-processed WFS systems. Localization accuracy of human listeners in SE-WFS sound fields is quantified by psychoacoustical testing. At a loudspeaker spacing of 0.17 m (equivalent to an aliasing cutoff frequency of 1 kHz), SE-WFS exhibits a localization blur of 3 degrees, nearly equal to real point sources. Increasing the loudspeaker spacing to 0.68 m (for a cutoff frequency of 170 Hz) results in a blur of less than 5 degrees. In contrast, stereophonic reproduction is less accurate with a blur of 7 degrees. The ventriloquist effect is psychometrically investigated to determine the effect of an intentional directional incongruence between audio and video stimuli. Subjects were presented with prerecorded full-spectrum speech and motion video of a talker's head as well as broadband noise bursts with a static image. The video image was displaced from the audio stimulus in azimuth by varying amounts, and the perceived auditory location measured. A strong bias was detectable for small angular discrepancies between audio and video stimuli for separations of less than 8

  6. Calibrated Multiple Event Relocations of the Central and Eastern United States

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Benz, H.; McNamara, D. E.; Bergman, E.; Herrmann, R. B.; Myers, S. C.

    2015-12-01

    Earthquake locations are a first-order observable which form the basis of a wide range of seismic analyses. Currently, the ANSS catalog primarily contains published single-event earthquake locations that rely on assumed 1D velocity models. Increasing the accuracy of cataloged earthquake hypocenter locations and origin times and constraining their associated errors can improve our understanding of Earth structure and have a fundamental impact on subsequent seismic studies. Multiple-event relocation algorithms often increase the precision of relative earthquake hypocenters but are hindered by their limited ability to provide realistic location uncertainties for individual earthquakes. Recently, a Bayesian approach to the multiple-event relocation problem has proven to have many benefits including the ability to: (1) handle large data sets; (2) easily incorporate a priori hypocenter information; (3) model phase assignment errors; and, (4) correct for errors in the assumed travel time model. In this study we employ bayesloc [Myers et al., 2007, 2009] to relocate earthquakes in the Central and Eastern United States from 1964-present. We relocate ~11,000 earthquakes with a dataset of ~439,000 arrival time observations. Our dataset includes arrival-time observations from the ANSS catalog supplemented with arrival-time data from the Reviewed ISC Bulletin (prior to 1981), targeted local studies, and arrival-time data from the TA Array. One significant benefit of the bayesloc algorithm is its ability to incorporate a priori constraints on the probability distributions of specific earthquake location parameters. To constrain the inversion, we use high-quality calibrated earthquake locations from local studies, including studies from: Raton Basin, Colorado; Mineral, Virginia; Guy, Arkansas; Cheneville, Quebec; Oklahoma; and Mt. Carmel, Illinois. We also add depth constraints to 232 earthquakes from regional moment tensors. Finally, we add constraints from four historic (1964

  7. Hydromagnetic vortices. II - Further dawnside events

    NASA Technical Reports Server (NTRS)

    Saunders, M. A.; Southwood, D. J.; Hones, E. W., Jr.

    1983-01-01

    It is shown that the 11 December 1977 plasma vortex event - the subject of a multi-instrument investigation (Saunders et al., 1983) - was neither atypical nor uncommon, by describing the magnetic and plasma characteristics of three further vortices recorded within 3 weeks of, and at similar locations to, the 11 December study. One of the new events has added interest since magnetic pulsations were seen simultaneously on the ground in the vicinity of the satellite magnetic 'footprint'.

  8. Using a modified time-reverse imaging technique to locate low-frequency earthquakes on the San Andreas Fault near Cholame, California

    USGS Publications Warehouse

    Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.

    2015-01-01

    We present a new method to locate low-frequency earthquakes (LFEs) within tectonic tremor episodes based on time-reverse imaging techniques. The modified time-reverse imaging technique presented here is the first method that locates individual LFEs within tremor episodes within 5 km uncertainty without relying on high-amplitude P-wave arrivals and that produces similar hypocentral locations to methods that locate events by stacking hundreds of LFEs without having to assume event co-location. In contrast to classic time-reverse imaging algorithms, we implement a modification to the method that searches for phase coherence over a short time period rather than identifying the maximum amplitude of a superpositioned wavefield. The method is independent of amplitude and can help constrain event origin time. The method uses individual LFE origin times, but does not rely on a priori information on LFE templates and families. We apply the method to locate 34 individual LFEs within tremor episodes that occur between 2010 and 2011 on the San Andreas Fault, near Cholame, California. Individual LFE location accuracies range from 2.6 to 5 km horizontally and 4.8 km vertically. Other methods that have been able to locate individual LFEs with accuracy of less than 5 km have mainly used large-amplitude events where a P-phase arrival can be identified. The method described here has the potential to locate a larger number of individual low-amplitude events with only the S-phase arrival. Location accuracy is controlled by the velocity model resolution and the wavelength of the dominant energy of the signal. Location results are also dependent on the number of stations used and are negligibly correlated with other factors such as the maximum gap in azimuthal coverage, source–station distance and signal-to-noise ratio.

  9. Accurate determination of segmented X-ray detector geometry

    PubMed Central

    Yefanov, Oleksandr; Mariani, Valerio; Gati, Cornelius; White, Thomas A.; Chapman, Henry N.; Barty, Anton

    2015-01-01

    Recent advances in X-ray detector technology have resulted in the introduction of segmented detectors composed of many small detector modules tiled together to cover a large detection area. Due to mechanical tolerances and the desire to be able to change the module layout to suit the needs of different experiments, the pixels on each module might not align perfectly on a regular grid. Several detectors are designed to permit detector sub-regions (or modules) to be moved relative to each other for different experiments. Accurate determination of the location of detector elements relative to the beam-sample interaction point is critical for many types of experiment, including X-ray crystallography, coherent diffractive imaging (CDI), small angle X-ray scattering (SAXS) and spectroscopy. For detectors with moveable modules, the relative positions of pixels are no longer fixed, necessitating the development of a simple procedure to calibrate detector geometry after reconfiguration. We describe a simple and robust method for determining the geometry of segmented X-ray detectors using measurements obtained by serial crystallography. By comparing the location of observed Bragg peaks to the spot locations predicted from the crystal indexing procedure, the position, rotation and distance of each module relative to the interaction region can be refined. We show that the refined detector geometry greatly improves the results of experiments. PMID:26561117

  10. Accurate determination of segmented X-ray detector geometry

    DOE PAGES

    Yefanov, Oleksandr; Mariani, Valerio; Gati, Cornelius; ...

    2015-10-22

    Recent advances in X-ray detector technology have resulted in the introduction of segmented detectors composed of many small detector modules tiled together to cover a large detection area. Due to mechanical tolerances and the desire to be able to change the module layout to suit the needs of different experiments, the pixels on each module might not align perfectly on a regular grid. Several detectors are designed to permit detector sub-regions (or modules) to be moved relative to each other for different experiments. Accurate determination of the location of detector elements relative to the beam-sample interaction point is critical for many types of experiment, including X-ray crystallography, coherent diffractive imaging (CDI), small angle X-ray scattering (SAXS) and spectroscopy. For detectors with moveable modules, the relative positions of pixels are no longer fixed, necessitating the development of a simple procedure to calibrate detector geometry after reconfiguration. We describe a simple and robust method for determining the geometry of segmented X-ray detectors using measurements obtained by serial crystallography. By comparing the location of observed Bragg peaks to the spot locations predicted from the crystal indexing procedure, the position, rotation and distance of each module relative to the interaction region can be refined. Furthermore, we show that the refined detector geometry greatly improves the results of experiments.

  11. High Resolution Hypocenter Relocation for Events in Central Java, Indonesia using Double-Difference Technique

    NASA Astrophysics Data System (ADS)

    Sahara, D. P.; Widiyantoro, S.; Nugraha, A. D.; Sule, R.; Luehr, B. G.

    2010-12-01

    Seismic and volcanic activities in Central Java are highly related to the subduction of the Indo-Australian plate. In the MERapi AMphibious Experiments (MERAMEX), a network consisting of 169 seismographic stations was installed onshore and offshore in central Java and recorded 282 events during the operation. In this study, we present the results of relative hypocenter relocation using the double-difference (DD) method to image the subduction beneath the volcanic chain in central Java. The DD method is an iterative procedure using least-squares optimization to determine high-resolution hypocenter locations over large distances. This relocation method uses absolute travel-time measurements and/or cross-correlation P- and S-wave differential travel-time measurements. The preliminary results of our study showed that the algorithm could collapse the diffuse event locations obtained from a previous study into a sharp image of the seismicity structure and reduce the residual travel-time errors significantly (7-60%). As a result, narrow regions of a double seismic zone which correlate with the subducting slab can be determined more accurately. The dip angle of the slab increases gradually from almost horizontal beneath the offshore region to very steep (65-80 degrees) beneath the northern part of central Java. The aseismic gap at depths of 140 km - 185 km is also depicted clearly. The next step of the ongoing research is to provide detailed quantitative constraints on the structures of the mantle wedge and crust beneath central Java and to show the ascending paths of fluids and partially molten materials below the volcanic arc by applying the double-difference tomography method (TomoDD).
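
    For reference, the quantity a double-difference relocation minimizes is the residual between observed and predicted differential travel times for an event pair at a common station; a one-line sketch (the numbers are illustrative, not MERAMEX data):

    ```python
    def double_difference_residual(t_obs_i, t_obs_j, t_cal_i, t_cal_j):
        """Double-difference residual at one station for an event pair (i, j):
        observed differential travel time minus the model-predicted one.
        hypoDD-style inversions minimize these residuals by perturbing the
        relative hypocenters of each pair."""
        return (t_obs_i - t_obs_j) - (t_cal_i - t_cal_j)

    # illustrative arrival times in seconds: the pair appears 0.07 s closer in
    # time than the velocity model predicts at this station
    print(round(double_difference_residual(12.34, 11.90, 12.41, 11.90), 3))  # -0.07
    ```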

  12. Location estimation in wireless sensor networks using spring-relaxation technique.

    PubMed

    Zhang, Qing; Foh, Chuan Heng; Seet, Boon-Chong; Fong, A C M

    2010-01-01

    Accurate and low-cost autonomous self-localization is a critical requirement for various applications of a large-scale distributed wireless sensor network (WSN). Due to the massive deployment of sensors, explicit measurement based on specialized localization hardware such as the Global Positioning System (GPS) is not practical. In this paper, we propose a low-cost WSN localization solution. Our design uses received signal strength indicators for ranging, lightweight distributed algorithms based on the spring-relaxation technique for location computation, and a cooperative approach to achieve a certain location estimation accuracy with a low number of nodes with known locations. We provide analysis to show the suitability of the spring-relaxation technique for WSN localization with the cooperative approach, and perform simulation experiments to illustrate its accuracy in localization.
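
    A minimal sketch of the spring-relaxation idea: each unknown node is iteratively displaced along virtual springs whose rest lengths are the measured ranges to anchor nodes, until the layout settles. The anchors, ranges, step size and iteration count below are illustrative assumptions, not the paper's parameters:

    ```python
    import math

    def relax(positions, anchors, ranges, iterations=200, step=0.1):
        """Spring-relaxation localization sketch. positions/anchors: {id: (x, y)};
        ranges: {(node_id, anchor_id): measured_distance}. Each unknown node is
        pulled toward (or pushed from) each anchor in proportion to the mismatch
        between the current and measured distances."""
        pos = dict(positions)
        for _ in range(iterations):
            for nid in pos:
                fx = fy = 0.0
                for (a, b), r in ranges.items():
                    if a != nid:
                        continue
                    ax, ay = anchors[b]
                    dx, dy = ax - pos[nid][0], ay - pos[nid][1]
                    d = math.hypot(dx, dy) or 1e-9
                    force = (d - r) / d      # positive pulls toward the anchor
                    fx += force * dx
                    fy += force * dy
                pos[nid] = (pos[nid][0] + step * fx, pos[nid][1] + step * fy)
        return pos

    # one unknown node ranged against three anchors (true position near (1, 1))
    anchors = {"A": (0, 0), "B": (2, 0), "C": (0, 2)}
    ranges = {("N", "A"): math.sqrt(2), ("N", "B"): math.sqrt(2), ("N", "C"): math.sqrt(2)}
    print(relax({"N": (0.2, 0.3)}, anchors, ranges)["N"])
    ```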

  13. Identification of the "minimal triangle" and other common event-to-event transitions in conflict and containment incidents.

    PubMed

    Bowers, Len; James, Karen; Quirk, Alan; Wright, Steve; Williams, Hilary; Stewart, Duncan

    2013-07-01

    Although individual conflict and containment events among acute psychiatric inpatients have been studied in some detail, the relationship of these events to each other has not. In particular, little is known about the temporal order of events for individual patients. This study aimed to identify the most common pathways from event to event. A sample of 522 patients was recruited from 84 acute psychiatric wards in 31 hospital locations in London and the surrounding areas during 2009-2010. Data on the order of conflict and containment events were collected for the first two weeks of admission from patients' case notes. Event-to-event transitions were tabulated and depicted diagrammatically. Event types were tested for their most common temporal placing in sequences of events. Most conflict and containment occurs within and between events of the minimal triangle (verbal aggression, de-escalation, and PRN medication), and the majority of these event sequences conclude in no further events; a minority transition to other, more severe, events. Verbal abuse and medication refusal were more likely to start sequences of disturbed behaviour. Training in the prevention and management of violence needs to acknowledge that a gradual escalation of patient behaviour does not always occur. Verbal aggression is a critical initiator of conflict events, and requires more detailed and sustained research on optimal management and prevention strategies. Similar research is required into medication refusal by inpatients.

  14. Recording Adverse Events Following Joint Arthroplasty: Financial Implications and Validation of an Adverse Event Assessment Form.

    PubMed

    Lee, Matthew J; Mohamed, Khalid M S; Kelly, John C; Galbraith, John G; Street, John; Lenehan, Brian J

    2017-09-01

    In Ireland, funding of joint arthroplasty procedures has moved to a pay-by-results national tariff system. Typically, adverse clinical events are recorded via retrospective chart-abstraction methods by administrative staff. Missed or undocumented events not only affect the quality of patient care but also may unrealistically skew budgetary decisions that impact fiscal viability of the service. Accurate recording confers clinical benefits and financial transparency. The aim of this study was to compare a prospectively implemented adverse events form with the current national retrospective chart-abstraction method in terms of pay-by-results financial implications. An adverse events form adapted from a similar validated model was used to prospectively record complications in 51 patients undergoing total hip or knee arthroplasties. Results were compared with the same cohort using an existing data abstraction method. Both data sets were coded in accordance with current standards for case funding. Overall, 114 events were recorded during the study through prospective charting of adverse events, compared with 15 events documented by customary method (a significant discrepancy). Wound drainage (15.8%) was the most common complication, followed by anemia (7.9%), lower respiratory tract infections (7.9%), and cardiac events (7%). A total of €61,956 ($67,778) in missed funding was calculated as a result. This pilot study demonstrates the ability to improve capture of adverse events through use of a well-designed assessment form. Proper perioperative data handling is a critical aspect of financial subsidies, enabling optimal allocation of funds. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Method and apparatus for accurately manipulating an object during microelectrophoresis

    DOEpatents

    Parvin, Bahram A.; Maestre, Marcos F.; Fish, Richard H.; Johnston, William E.

    1997-01-01

    An apparatus using electrophoresis provides accurate manipulation of an object on a microscope stage for further manipulations and reactions. The present invention also provides an inexpensive and easily accessible means to move an object without damage to the object. A plurality of electrodes are coupled to the stage in an array whereby the electrode array allows for distinct manipulations of the electric field for accurate manipulations of the object. There is an electrode array control coupled to the plurality of electrodes for manipulating the electric field. In an alternative embodiment, a chamber is provided on the stage to hold the object. The plurality of electrodes are positioned in the chamber, and the chamber is filled with fluid. The system can be automated using visual servoing, which manipulates the control parameters, i.e., x, y stage, applying the field, etc., after extracting the significant features directly from image data. Visual servoing includes an imaging device and computer system to determine the location of the object. A second stage, having a plurality of tubes positioned on top of the second stage, can be accurately positioned by visual servoing so that one end of one of the plurality of tubes surrounds at least part of the object on the first stage.

  16. Method and apparatus for accurately manipulating an object during microelectrophoresis

    DOEpatents

    Parvin, B.A.; Maestre, M.F.; Fish, R.H.; Johnston, W.E.

    1997-09-23

    An apparatus using electrophoresis provides accurate manipulation of an object on a microscope stage for further manipulations and reactions. The present invention also provides an inexpensive and easily accessible means to move an object without damage to the object. A plurality of electrodes are coupled to the stage in an array whereby the electrode array allows for distinct manipulations of the electric field for accurate manipulations of the object. There is an electrode array control coupled to the plurality of electrodes for manipulating the electric field. In an alternative embodiment, a chamber is provided on the stage to hold the object. The plurality of electrodes are positioned in the chamber, and the chamber is filled with fluid. The system can be automated using visual servoing, which manipulates the control parameters, i.e., x, y stage, applying the field, etc., after extracting the significant features directly from image data. Visual servoing includes an imaging device and computer system to determine the location of the object. A second stage having a plurality of tubes positioned on top of the second stage, can be accurately positioned by visual servoing so that one end of one of the plurality of tubes surrounds at least part of the object on the first stage. 11 figs.

  17. Improved Infrasound Event Location

    DTIC Science & Technology

    2007-09-01

    [Extraction residue from report figure and references. Recoverable figure legend: ground-truth event types and counts: Bolide (20), Mine Explosion (112), Volcano (20), Rocket Motor Test (1), Landslide (1); symbol key by number of recorded infrasound signals: 1, 2, 3-4, 5-8, >8. Text fragment: the approach requires significant bookkeeping, since the ray-tracing programs must be executed separately for each source-receiver-model scenario, each producing multiple... Reference fragment: Infrasound monitoring of volcanoes to probe high-altitude winds, J. Geophys. Res. 110, D13106, doi:10.1029/2004JD005587; Le Pichon, A., K]

  18. Location-assured, multifactor authentication on smartphones via LTE communication

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham

    2013-05-01

    With the added security provided by LTE, geographical location has become an important factor for authentication to enhance the security of remote client authentication during mCommerce applications using Smartphones. A tight combination of geographical location with classic authentication factors like PINs/biometrics in a real-time, remote verification scheme over the LTE layer connection assures the authenticator about the client itself (via PIN/biometric) as well as the client's current location, thus defining the important aspects of "who", "when", and "where" of the authentication attempt without eavesdropping or man-in-the-middle attacks. To securely integrate location as an authentication factor into the remote authentication scheme, the client's location must be verified independently, i.e. the authenticator should not solely rely on the location determined on and reported by the client's Smartphone. The latest wireless data communication technology for mobile phones (4G LTE, Long-Term Evolution), recently being rolled out in various networks, can be employed to enhance this location-factor requirement of independent location verification. LTE's Control Plane LBS provisions, when integrated with user-based authentication and an independent source of localisation factors, ensure secure, efficient, continuous location tracking of the Smartphone. This feature can be performed during normal operation of the LTE-based communication between client and network operator, resulting in the authenticator being able to verify the client's claimed location more securely and accurately. Trials and experiments show that such an algorithm implementation is viable for today's Smartphone-based banking via LTE communication.

  19. 32 CFR 705.34 - Other special events.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Affairs. Based on the importance of the event (nationally, regionally, or locally) location, and... promotion of esprit de corps, and are conducted primarily for active duty personnel and their guests. (4...

  20. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence

    PubMed Central

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of the spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable condition, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images. PMID:26869966

  1. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence.

    PubMed

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of the spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable condition, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images.

  2. Rare event simulation in radiation transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high-dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
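
    A minimal sketch of the likelihood-ratio weighting described, for a toy rare event (a biased random walk ending far above its mean); the tilted step probability q and the threshold are illustrative choices, not the dissertation's transport model:

    ```python
    import random

    def rare_prob_importance(n_steps=50, p=0.3, q=0.7, trials=100_000, threshold=10):
        """Estimate P(walk with step +1 w.p. p, -1 w.p. 1-p ends above 'threshold')
        by sampling from a tilted walk (step +1 w.p. q) and reweighting each path
        by its likelihood ratio. The estimator is unbiased for any q in (0, 1)."""
        total = 0.0
        for _ in range(trials):
            position, weight = 0, 1.0
            for _ in range(n_steps):
                if random.random() < q:
                    position += 1
                    weight *= p / q
                else:
                    position -= 1
                    weight *= (1 - p) / (1 - q)
            if position > threshold:
                total += weight
        return total / trials

    print(rare_prob_importance())
    ```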

  3. First LOCSMITH locations of deep moonquakes

    NASA Astrophysics Data System (ADS)

    Hempel, S.; Knapmeyer, M.; Sens-Schönfelder, C.; Oberst, J.

    2008-09-01

    Introduction: Several thousand seismic events were recorded by the Apollo seismic network from 1969 to 1977. Different types of events can be distinguished: meteoroid impacts, thermal quakes and internally caused moonquakes. The latter subdivide into shallow (100 to 300 km) and deep moonquakes (700 to 1100 km), which are by far the most common events. The deep quakes would pose no immediate danger to inhabited stations on the Earth's Moon because of their relatively low magnitude and great depth. However, they bear important information on lunar structure and evolution, and their distribution probably reflects their source mechanism. In this study, we reinvestigate location patterns of deep lunar quakes. LOCSMITH: The core of this study is a new location method (LOCSMITH, [1]). This algorithm uses time intervals rather than time instants as input, which contain the dedicated arrival with probability 1. LOCSMITH models and compares theoretical and actual travel times on a global scale and uses an adaptive grid to search for source locations compatible with all observations. The output is a set of all possible hypocenters for the considered region of repeating, tidally triggered moonquake activity, called clusters. The shape and size of these sets gives a better estimate of the location uncertainty than the formal standard deviations returned by classical methods. This is used for grading of deep moonquake clusters according to the currently available data quality. Classification of deep moonquakes: As a first step, we establish a reciprocal dependence of the size and shape of LOCSMITH location clouds on the number of arrivals. Four different shapes are recognized, listed here in an order corresponding to decreasing spatial resolution: 1. "Balls", which are well-defined and relatively small sets resembling the commonly assumed error ellipsoid. These are found in the best cases with many observations. Locations of this shape are obtained for clusters 1, 18 or 33; these were already

  4. Improving automatic earthquake locations in subduction zones: a case study for GEOFON catalog of Tonga-Fiji region

    NASA Astrophysics Data System (ADS)

    Nooshiri, Nima; Heimann, Sebastian; Saul, Joachim; Tilmann, Frederik; Dahm, Torsten

    2015-04-01

    Automatic earthquake locations are sometimes associated with very large residuals, up to 10 s even for clear arrivals, especially for regional stations in subduction zones because of the strongly heterogeneous velocity structure associated with them. Although these residuals are most likely not related to measurement errors but to unmodelled velocity heterogeneity, these stations are usually removed from or down-weighted in the location procedure. While this is possible for large events, it may not be useful if the earthquake is weak. In this case, implementation of travel-time station corrections may significantly improve the automatic locations. Here, the shrinking-box source-specific station term method (SSST) [Lin and Shearer, 2005] has been applied to improve the relative location accuracy of 1678 events that occurred in the Tonga subduction zone between 2010 and mid-2014. Picks were obtained from the GEOFON earthquake bulletin for all available station networks. We calculated a set of timing corrections for each station which vary as a function of source position. A separate time correction was computed for each source-receiver path at the given station by smoothing the residual field over nearby events. We begin with a very large smoothing radius essentially encompassing the whole event set and iterate by progressively shrinking the smoothing radius. In this way, we attempted to correct for the systematic errors that are introduced into the locations by the inaccuracies in the assumed velocity structure, without solving for a new velocity model itself. One of the advantages of the SSST technique is that the event location part of the calculation is separate from the station term calculation and can be performed using any single-event location method. In this study, we applied a non-linear, probabilistic, global-search earthquake location method using the software package NonLinLoc [Lomax et al., 2000]. The non-linear location algorithm implemented in NonLinLoc is less
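
    A stripped-down sketch of the shrinking-radius station-term idea for a single station and phase: each event's correction is the mean residual of its neighbours within the current radius, and the radius shrinks between iterations. The array shapes, the radii, and the omission of the intermediate relocation step are simplifying assumptions, not the authors' implementation:

    ```python
    import numpy as np

    def ssst_corrections(event_xyz, residuals, radii=(200.0, 100.0, 50.0, 25.0)):
        """Source-specific station terms for one station/phase.
        event_xyz: (n, 3) hypocenters in km; residuals: length-n array of
        observed-minus-predicted travel times (s). Returns one term per event."""
        xyz = np.asarray(event_xyz, float)
        res = np.asarray(residuals, float)
        corr = np.zeros_like(res)
        for radius in radii:
            new_corr = np.empty_like(res)
            for i in range(len(res)):
                d = np.linalg.norm(xyz - xyz[i], axis=1)
                new_corr[i] = res[d <= radius].mean()  # the event itself is always included
            corr = new_corr
            # In the full method the events are relocated with these corrections
            # and the residuals recomputed before the radius shrinks; residuals
            # are kept fixed here for brevity.
        return corr
    ```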

  5. MECH: Algorithms and Tools for Automated Assessment of Potential Attack Locations

    DTIC Science & Technology

    2015-10-06

    conscious and subconscious processing of the geometric structure of the local terrain, sight lines to prominent or useful terrain features, proximity...This intuition or instinct is the outcome of an unconscious or subconscious integration of available facts and impressions. Thus, in the search...adjacency. Even so, we inevitably introduce a bias between events and non-event road locations when calculating the route visibility features.

  6. Z-rich solar particle event characteristics 1972-1976

    NASA Technical Reports Server (NTRS)

    Zwickl, R. D.; Roelof, E. C.; Gold, R. E.; Krimigis, S. M.; Armstrong, T. P.

    1978-01-01

    It is found in the reported investigation that Z-rich solar particle events usually have large and prolonged anisotropies in addition to an extremely variable charge composition that varies not only from event to event but also throughout the event. These observations suggest that one can no longer regard the event-averaged composition of solar particle events at low energies as providing an unbiased global sample of the solar atmospheric composition. The variability from event to event and among classes of events is just too great. However, the tendency for the Z-rich events to be associated with both the low-speed solar wind at or just before the onset of solar wind streams and with active regions located in the western hemisphere, indicates that charge composition studies of solar particle events can yield a better knowledge of the flare acceleration process as well as the inhomogeneous nature of magnetic field structure and particle composition in the solar atmosphere.

  7. Support Vector Machine Model for Automatic Detection and Classification of Seismic Events

    NASA Astrophysics Data System (ADS)

    Barros, Vesna; Barros, Lucas

    2016-04-01

    The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al., 2015, the advantages of using SVM are its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and to classify them as earthquakes or quarry blasts. The aim is to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical signal pattern of these events. Moreover, comparing the performance of the support
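
    As an illustration of the classification stage only, the following sketch trains an RBF-kernel SVM on a synthetic feature matrix with scikit-learn; the feature definitions and labels are placeholders, not the features or data used in the study.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Hypothetical feature matrix: each row describes one detection (e.g. spectral
        # ratios, time of day, amplitude ratios); labels: 0 = earthquake, 1 = quarry blast.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 6))
        y = rng.integers(0, 2, size=200)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
        scores = cross_val_score(clf, X, y, cv=5)
        print("cross-validated accuracy:", scores.mean())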

  8. Near-real time 3D probabilistic earthquakes locations at Mt. Etna volcano

    NASA Astrophysics Data System (ADS)

    Barberi, G.; D'Agostino, M.; Mostaccio, A.; Patane', D.; Tuve', T.

    2012-04-01

    An automatic procedure for locating earthquakes in quasi-real time must provide a good estimate of the earthquake location within a few seconds after the event is first detected and is strongly needed for seismic warning systems. The reliability of an automatic location algorithm is influenced by several factors such as errors in picking seismic phases, network geometry, and velocity model uncertainties. On Mt. Etna, the seismic network is managed by INGV and the quasi-real time earthquake locations are performed using an automatic picking algorithm based on short-term-average to long-term-average ratios (STA/LTA), calculated from an approximate squared envelope function of the seismogram, which furnishes a list of P-wave arrival times, together with the location algorithm Hypoellipse and a 1D velocity model. The main purpose of this work is to investigate the performance of a different automatic procedure to improve the quasi-real time earthquake locations. In fact, as the automatic data processing may be affected by outliers (wrong picks), the use of traditional earthquake location techniques based on a least-squares misfit function (L2 norm) often yields unstable and unreliable solutions. Moreover, on Mt. Etna, the 1D model is often unable to represent the complex structure of the volcano (in particular the strong lateral heterogeneities), whereas the increasing accuracy of the 3D velocity models at Mt. Etna during recent years allows their use today in routine earthquake locations. Therefore, we selected as reference locations all the events that occurred on Mt. Etna in the last year (2011) and were automatically detected and located by means of the Hypoellipse code. Using this dataset (more than 300 events), we applied a nonlinear probabilistic earthquake location algorithm using the Equal Differential Time (EDT) likelihood function (Font et al., 2004; Lomax, 2005), which is much more robust in the presence of outliers in the data. Successively, by using a probabilistic
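
    A minimal sketch of the EDT misfit evaluated at one trial hypocenter is given below; the Gaussian width and the grid-search comment are assumptions, and the actual NonLinLoc implementation includes pick weighting and an oct-tree importance sampler.

        import numpy as np
        from itertools import combinations

        def edt_likelihood(obs_times, pred_times, sigma=0.5):
            """
            Equal Differential Time (EDT) likelihood for one trial hypocenter.
            obs_times, pred_times : observed arrival times and predicted travel times (s)
            The origin time cancels in the differences, and a single bad pick only
            affects the pairs it participates in, which is what makes EDT robust to outliers.
            """
            l = 0.0
            for i, j in combinations(range(len(obs_times)), 2):
                d = (obs_times[i] - obs_times[j]) - (pred_times[i] - pred_times[j])
                l += np.exp(-d * d / (2.0 * sigma * sigma))
            return l

        # Evaluate edt_likelihood at every node of a 3D grid and keep the maximum
        # (a probabilistic sampler is used in practice instead of an exhaustive grid).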

  9. Special event discrimination analysis: The TEXAR blind test and identification of the August 16, 1997 Kara Sea event. Final report, 13 September 1995--31 January 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgardt, D.

    1998-03-31

    The International Monitoring System (IMS) for the Comprehensive Test Ban Treaty (CTBT) faces the serious challenge of being able to accurately and reliably identify seismic events in any region of the world. Extensive research has been performed in recent years on developing discrimination techniques which appear to classify seismic events into broad categories of source types, such as nuclear explosion, earthquake, and mine blast. This report examines in detail the problem of effectiveness of regional discrimination procedures in the application of waveform discriminants to Special Event identification and the issue of discriminant transportability.

  10. A 3D Tomographic Model of Asia Based on Pn and P Travel Times from GT Events

    NASA Astrophysics Data System (ADS)

    Young, C. J.; Begnaud, M. L.; Ballard, S.; Phillips, W. S.; Hipp, J. R.; Steck, L. K.; Rowe, C. A.; Chang, M. C.

    2008-12-01

    Increasingly, nuclear explosion monitoring is focusing on detection, location, and identification of small events recorded at regional distances. Because Earth structure is highly variable on regional scales, locating events accurately at these distances requires the use of region-specific models to provide accurate travel times. Improved results have been achieved with composites of 1D models and with approximate 3D models with simplified upper mantle structures, but both approaches introduce non-physical boundaries that are problematic for operational monitoring use. Ultimately, what is needed is a true, seamless 3D model of the Earth. Towards that goal, we have developed a 3D tomographic model of the P velocity of the crust and mantle for the Asian continent. Our model is derived by an iterative least squares travel time inversion of more than one million Pn and teleseismic P picks from some 35,000 events recorded at 4,000+ stations. We invert for P velocities from the top of the crust to the core mantle boundary, along with source and receiver static time terms to account for the effects of event mislocation and unaccounted for fine-scale structure near the receiver. Because large portions of the model are under-constrained, we apply spatially varying damping, which constrains the inversion to update the starting model only where good data coverage is available. Our starting crustal model is taken from the a priori crust and upper mantle model of Asia developed through National Nuclear Security Administration laboratory collaboration, which is based on various global and regional studies, and we substantially increase the damping in the crust to discourage changes from this model. Our starting mantle model is AK135. To simplify the inversion, we fix the depths of the major mantle discontinuities (Moho, 410 km, 660 km). 3D rays are calculated using an implementation of the Um and Thurber ray pseudo-bending approach, with full enforcement of Snell's Law in 3D at
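
    The damped least-squares update described above can be imitated with SciPy's LSQR solver; the sensitivity matrix and residual vector below are synthetic stand-ins, and a single scalar damping replaces the spatially varying damping used in the real inversion.

        import numpy as np
        from scipy.sparse import random as sparse_random
        from scipy.sparse.linalg import lsqr

        # Synthetic stand-in for the tomographic system G * dm = dt, where dm holds slowness
        # perturbations plus source/receiver static terms and dt holds travel-time residuals.
        rng = np.random.default_rng(1)
        G = sparse_random(5000, 800, density=0.01, random_state=1)
        dt = rng.normal(scale=0.5, size=5000)

        # `damp` plays the role of the damping constraint: it penalizes model updates
        # where the data constrain the model poorly (here a single scalar for simplicity).
        dm = lsqr(G, dt, damp=2.0)[0]
        print("model update norm:", np.linalg.norm(dm))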

  11. Improvements to Earthquake Location with a Fuzzy Logic Approach

    NASA Astrophysics Data System (ADS)

    Gökalp, Hüseyin

    2018-01-01

    In this study, improvements to the earthquake location method were investigated using a fuzzy logic approach proposed by Lin and Sanford (Bull Seismol Soc Am 91:82-93, 2001). The method has certain advantages compared to inverse methods in terms of eliminating the uncertainties of arrival times and reading errors. Adopting this approach, epicentral locations were determined from the results of a fuzzy logic space that accounts for the uncertainties in the velocity models. To map the uncertainties in arrival times into the fuzzy logic space, a trapezoidal membership function was constructed directly from the travel-time difference between two stations for the P- and S-arrival times, instead of from P- and S-wave velocity models, eliminating the need for information about the velocity structure of the study area. The results showed that this method worked most effectively when earthquakes occurred away from the network or when the arrival time data contained phase reading errors. To resolve the problems related to determining the epicentral locations of the events, a forward modeling method like the grid search technique was used by applying different logical operations (i.e., intersection, union, and their combination) within the fuzzy logic approach. The locations of the events were determined from the fuzzy logic outputs in fuzzy logic space by searching over a gridded region. Determining the location by defuzzifying only the grid points with a membership value of 1, obtained by normalizing the maximum fuzzy output values, resulted in more reliable epicentral locations for the earthquakes than the other approaches. In addition, throughout the process, the center-of-gravity method was used as the defuzzification operation.
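
    A toy version of the trapezoidal-membership grid search might look like the sketch below; the station geometry, arrival times, constant velocity, and membership breakpoints are invented for illustration, and only the intersection (fuzzy AND) operation is shown.

        import numpy as np

        def trapezoid(x, a, b, c, d):
            """Trapezoidal membership: 0 outside [a, d], 1 on [b, c], linear in between."""
            return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

        # Hypothetical station coordinates (km) and observed P arrival times (s).
        stations = np.array([[0.0, 0.0], [40.0, 5.0], [10.0, 35.0]])
        t_obs = np.array([3.9, 7.9, 6.3])
        v = 6.0  # assumed P velocity, km/s

        xs = ys = np.linspace(-20.0, 60.0, 161)
        X, Y = np.meshgrid(xs, ys)
        membership = np.ones_like(X)
        for i in range(len(stations)):
            for j in range(i + 1, len(stations)):
                di = np.hypot(X - stations[i, 0], Y - stations[i, 1]) / v
                dj = np.hypot(X - stations[j, 0], Y - stations[j, 1]) / v
                resid = (t_obs[i] - t_obs[j]) - (di - dj)
                # Intersection (fuzzy AND) of the station-pair constraints.
                membership = np.minimum(membership, trapezoid(resid, -1.0, -0.3, 0.3, 1.0))

        # Centre-of-gravity defuzzification over the membership field.
        w = membership / membership.sum()
        print("epicentre estimate (km):", (X * w).sum(), (Y * w).sum())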

  12. Optimal Design for Placements of Tsunami Observing Systems to Accurately Characterize the Inducing Earthquake

    NASA Astrophysics Data System (ADS)

    Mulia, Iyan E.; Gusman, Aditya Riadi; Satake, Kenji

    2017-12-01

    Recently, numerous tsunami observation networks have been deployed in several major tsunamigenic regions. However, guidance on where to optimally place the measurement devices is limited. This study presents a methodological approach to select strategic observation locations for the purpose of tsunami source characterization, particularly in terms of the fault slip distribution. Initially, we identify favorable locations and determine the initial number of observations. These locations are selected based on extrema of empirical orthogonal function (EOF) spatial modes. To further improve the accuracy, we apply an optimization algorithm called mesh adaptive direct search to remove redundant measurement locations from the EOF-generated points. We test the proposed approach using multiple hypothetical tsunami sources around the Nankai Trough, Japan. The results suggest that the optimized observation points can produce more accurate fault slip estimates with considerably fewer observations than the existing tsunami observation networks.
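
    A rough sketch of the EOF-based selection of initial observation points is shown below using a synthetic scenario matrix; the subsequent mesh adaptive direct search pruning is not reproduced.

        import numpy as np

        # Synthetic ensemble: rows are hypothetical tsunami scenarios, columns are candidate
        # offshore grid points (e.g. maximum simulated amplitude at each point).
        rng = np.random.default_rng(2)
        scenarios = rng.normal(size=(60, 500))

        # EOF analysis via SVD of the anomaly matrix.
        anomalies = scenarios - scenarios.mean(axis=0)
        _, _, modes = np.linalg.svd(anomalies, full_matrices=False)  # rows of `modes` are EOFs

        # Pick the grid point where each of the leading EOFs has its largest absolute loading;
        # these extrema serve as the initial observation locations before any optimization.
        n_modes = 8
        candidates = sorted({int(np.argmax(np.abs(modes[k]))) for k in range(n_modes)})
        print("initial observation points (grid indices):", candidates)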

  13. Candidate Binary Microlensing Events from the MACHO Project

    NASA Astrophysics Data System (ADS)

    Becker, A. C.; Alcock, C.; Allsman, R. A.; Alves, D. R.; Axelrod, T. S.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.; King, L. J.; Lehner, M. J.; Marshall, S. L.; Minniti, D.; Peterson, B. A.; Popowski, P.; Pratt, M. R.; Quinn, P. J.; Rodgers, A. W.; Stubbs, C. W.; Sutherland, W.; Tomaney, A.; Vandehei, T.; Welch, D. L.; Baines, D.; Brakel, A.; Crook, B.; Howard, J.; Leach, T.; McDowell, D.; McKeown, S.; Mitchell, J.; Moreland, J.; Pozza, E.; Purcell, P.; Ring, S.; Salmon, A.; Ward, K.; Wyper, G.; Heller, A.; Kaspi, S.; Kovo, O.; Maoz, D.; Retter, A.; Rhie, S. H.; Stetson, P.; Walker, A.; MACHO Collaboration

    1998-12-01

    We present the lightcurves of 22 gravitational microlensing events from the first six years of the MACHO Project gravitational microlensing survey, which are likely examples of lensing by binary systems. These events were selected from a total sample of ~ 300 events which were either detected by the MACHO Alert System or discovered through retrospective analyses of the MACHO database. Many of these events appear to have undergone a caustic or cusp crossing, and 2 of the events are well fit with lensing by binary systems with large mass ratios, indicating secondary companions of approximately planetary mass. The event rate is roughly consistent with predictions based upon our knowledge of the properties of binary stars. The utility of binary lensing in helping to solve the Galactic dark matter problem is demonstrated with analyses of 3 binary microlensing events seen towards the Magellanic Clouds. Source star resolution during caustic crossings in 2 of these events allows us to estimate the location of the lensing systems, assuming each source is a single star and not a short period binary. * MACHO LMC-9 appears to be a binary lensing event with a caustic crossing partially resolved in 2 observations. The resulting lens proper motion appears too small for a single source and LMC disk lens. However, it is considerably less likely to be a single source star and Galactic halo lens. We estimate the a priori probability of a short period binary source with a detectable binary character to be ~ 10 %. If the source is also a binary, then we currently have no constraints on the lens location. * The most recent of these events, MACHO 98-SMC-1, was detected in real-time. Follow-up observations by the MACHO/GMAN, PLANET, MPS, EROS and OGLE microlensing collaborations led to the robust conclusion that the lens likely resides in the SMC.

  14. Microseismic source locations with deconvolution migration

    NASA Astrophysics Data System (ADS)

    Wu, Shaojiang; Wang, Yibo; Zheng, Yikang; Chang, Xu

    2018-03-01

    Identifying and locating microseismic events are critical problems in hydraulic fracturing monitoring for unconventional resources exploration. In contrast to active seismic data, microseismic data are usually recorded with unknown source excitation time and source location. In this study, we introduce deconvolution migration by combining deconvolution interferometry with interferometric cross-correlation migration (CCM). This method avoids the need for the source excitation time and enhances both the spatial resolution and robustness by eliminating the square term of the source wavelets from CCM. The proposed algorithm is divided into the following three steps: (1) generate the virtual gathers by deconvolving the master trace with all other traces in the microseismic gather to remove the unknown excitation time; (2) migrate the virtual gather to obtain a single image of the source location and (3) stack all of these images together to get the final estimation image of the source location. We test the proposed method on complex synthetic and field data sets from surface hydraulic fracturing monitoring, and compare the results with those obtained by interferometric CCM. The results demonstrate that the proposed method can obtain a 50 per cent higher spatial resolution image of the source location, and a more robust estimate with smaller localization errors, especially in the presence of velocity model errors. This method is also beneficial for source mechanism inversion and global seismology applications.
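
    Step (1) of the algorithm, removing the unknown excitation time by deconvolving a master trace from the gather, could be sketched as follows; the water-level stabilization is an assumed regularization choice, and the migration and stacking steps are omitted.

        import numpy as np

        def deconvolve_gather(gather, master_index, water_level=1e-3):
            """
            Deconvolve a chosen master trace from every trace of a microseismic gather.
            gather : (n_traces, n_samples) array
            Returns the virtual gather in which the unknown excitation time (and the
            squared source-wavelet term) has been removed, ready to be migrated.
            """
            spec = np.fft.rfft(gather, axis=1)
            master = spec[master_index]
            power = (master * np.conj(master)).real
            power = np.maximum(power, water_level * power.max())   # water-level stabilization
            virtual = spec * np.conj(master) / power
            return np.fft.irfft(virtual, n=gather.shape[1], axis=1)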

  15. Methods for monitoring hydroacoustic events using direct and reflected T waves in the Indian Ocean

    NASA Astrophysics Data System (ADS)

    Hanson, Jeffrey A.; Bowman, J. Roger

    2006-02-01

    The recent installation of permanent, three-element hydrophone arrays in the Indian Ocean offshore Diego Garcia and Cape Leeuwin, Australia, provides an opportunity to study hydroacoustic sources in more detail than previously possible. We developed and applied methods for coherent processing of the array data, for automated association of signals detected at more than one array, and for source location using only direct arrivals and using signals reflected from coastlines and other bathymetric features. During the 286-day study, 4725 hydroacoustic events were defined and located in the Indian and Southern oceans. Events fall into two classes: tectonic earthquakes and ice-related noise. The tectonic earthquakes consist of mid-ocean ridge, trench, and intraplate earthquakes. Mid-ocean ridge earthquakes are the most common tectonic events and often occur in clusters along transform offsets. Hydroacoustic signal levels for earthquakes in a standard catalog suggest that the hydroacoustic processing threshold for ridge events is one magnitude below the seismic network. Fewer earthquakes are observed along the Java Trench than expected because the large bathymetric relief of the source region complicates coupling between seismic and hydroacoustic signals, leading to divergent signal characteristics at different stations. We located 1843 events along the Antarctic coast resulting from various ice noises, most likely thermal fracturing and ice ridge forming events. Reflectors of signals from earthquakes are observed along coastlines, the mid-Indian Ocean and Ninety East ridges, and other bathymetric features. Reflected signals are used as synthetic stations to reduce location uncertainty and to enable event location with a single station.

  16. Passive (Micro-) Seismic Event Detection by Identifying Embedded "Event" Anomalies Within Statistically Describable Background Noise

    NASA Astrophysics Data System (ADS)

    Baziw, Erick; Verbeek, Gerald

    2012-12-01

    Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal to noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here an array of seismic receivers is used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and therefore a PSM system with poor event detection can easily acquire terabytes of useless data as it does not identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.
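
    The SEED™ algorithm itself is proprietary; the sketch below only illustrates the general idea of recursively tracking background-noise statistics and flagging samples that depart from them, and it is not the published filter.

        import numpy as np

        def detect_events(trace, alpha=0.999, k=6.0):
            """
            Flag samples whose amplitude departs from a recursively updated noise model.
            This is a generic recursive detector, not the SEED(TM) filter: the background
            mean and variance are tracked with exponential forgetting (a crude stand-in
            for Bayesian recursive estimation) and a sample is flagged when it exceeds
            the running mean by k standard deviations.
            """
            mean, var = 0.0, 1.0
            flags = np.zeros(trace.shape, dtype=bool)
            for i, x in enumerate(np.abs(trace)):
                if x > mean + k * np.sqrt(var):
                    flags[i] = True      # candidate event sample: keep it out of the noise model
                else:
                    mean = alpha * mean + (1.0 - alpha) * x
                    var = alpha * var + (1.0 - alpha) * (x - mean) ** 2
            # A real system would require several consecutive flagged samples before
            # declaring an event.
            return flags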

  17. Achieve Location Privacy-Preserving Range Query in Vehicular Sensing

    PubMed Central

    Lu, Rongxing; Ma, Maode; Bao, Haiyong

    2017-01-01

    Modern vehicles are equipped with a plethora of on-board sensors and large on-board storage, which enables them to gather and store various local-relevant data. However, the wide application of vehicular sensing has its own challenges, among which location-privacy preservation and data query accuracy are two critical problems. In this paper, we propose a novel range query scheme, which helps the data requester to accurately retrieve the sensed data from the distributive on-board storage in vehicular ad hoc networks (VANETs) with location privacy preservation. The proposed scheme exploits structured scalars to denote the locations of data requesters and vehicles, and achieves the privacy-preserving location matching with the homomorphic Paillier cryptosystem technique. Detailed security analysis shows that the proposed range query scheme can successfully preserve the location privacy of the involved data requesters and vehicles, and protect the confidentiality of the sensed data. In addition, performance evaluations are conducted to show the efficiency of the proposed scheme, in terms of computation delay and communication overhead. Specifically, the computation delay and communication overhead are not dependent on the length of the scalar, and they are only proportional to the number of vehicles. PMID:28786943

  18. Achieve Location Privacy-Preserving Range Query in Vehicular Sensing.

    PubMed

    Kong, Qinglei; Lu, Rongxing; Ma, Maode; Bao, Haiyong

    2017-08-08

    Modern vehicles are equipped with a plethora of on-board sensors and large on-board storage, which enables them to gather and store various local-relevant data. However, the wide application of vehicular sensing has its own challenges, among which location-privacy preservation and data query accuracy are two critical problems. In this paper, we propose a novel range query scheme, which helps the data requester to accurately retrieve the sensed data from the distributive on-board storage in vehicular ad hoc networks (VANETs) with location privacy preservation. The proposed scheme exploits structured scalars to denote the locations of data requesters and vehicles, and achieves the privacy-preserving location matching with the homomorphic Paillier cryptosystem technique. Detailed security analysis shows that the proposed range query scheme can successfully preserve the location privacy of the involved data requesters and vehicles, and protect the confidentiality of the sensed data. In addition, performance evaluations are conducted to show the efficiency of the proposed scheme, in terms of computation delay and communication overhead. Specifically, the computation delay and communication overhead are not dependent on the length of the scalar, and they are only proportional to the number of vehicles.
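
    The kind of additively homomorphic operation that underlies such privacy-preserving matching can be illustrated with the python-paillier package (assumed to be installed); the scalar encoding of a location used here is a deliberate simplification of the structured scalars in the paper.

        from phe import paillier  # python-paillier: additively homomorphic Paillier cryptosystem

        # The data requester generates a key pair and encrypts its scalar-encoded query location.
        public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)
        query_location = 7421            # hypothetical scalar encoding of the requested range/cell
        enc_query = public_key.encrypt(query_location)

        # A vehicle, holding only the public key, computes an encrypted difference between the
        # query scalar and its own location scalar without learning the requester's location.
        vehicle_location = 7425
        enc_difference = enc_query - vehicle_location   # homomorphic: Enc(query) minus plaintext

        # Only the requester can decrypt; a small |difference| would mean the vehicle is in range.
        print("decrypted difference:", private_key.decrypt(enc_difference))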

  19. Line-Constrained Camera Location Estimation in Multi-Image Stereomatching.

    PubMed

    Donné, Simon; Goossens, Bart; Philips, Wilfried

    2017-08-23

    Stereomatching is an effective way of acquiring dense depth information from a scene when active measurements are not possible. So-called lightfield methods take a snapshot from many camera locations along a defined trajectory (usually uniformly linear or on a regular grid; we will assume a linear trajectory) and use this information to compute accurate depth estimates. However, they require the locations for each of the snapshots to be known: the disparity of an object between images is related to both the distance of the camera to the object and the distance between the camera positions for both images. Existing solutions use sparse feature matching for camera location estimation. In this paper, we propose a novel method that uses dense correspondences to do the same, leveraging an existing depth estimation framework to also yield the camera locations along the line. We illustrate the effectiveness of the proposed technique for camera location estimation both visually for the rectification of epipolar plane images and quantitatively with its effect on the resulting depth estimation. Our proposed approach yields a valid alternative for sparse techniques, while still being executed in a reasonable time on a graphics card due to its highly parallelizable nature.

  20. Sem-Analysing Events: Towards a Cultural Pedagogy of Hope

    ERIC Educational Resources Information Center

    Semetsky, Inna

    2007-01-01

    This paper locates the concept of learning among real-life human experiences and events. Functioning as a sign, a meaningful event can be understood in terms of a cultural extra-linguistic "text." Reading and interpreting diverse cultural "texts" are equivalent to constructing and learning critical symbolic lessons embedded in a continuous process…

  1. Research on Crowdsourcing Emergency Information Extraction of Based on Events' Frame

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information; general information retrieval tools cannot completely identify emergency geographic information; and these approaches provide no accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, designed to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and complements the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm and allows toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements ahead of time for defense and disaster reduction. The technology decreases the number of casualties and the amount of property damage in the country and the world. This is of great significance to the state and society.

  2. Beyond Verb Meaning: Experimental Evidence for Incremental Processing of Semantic Roles and Event Structure.

    PubMed

    Philipp, Markus; Graf, Tim; Kretzschmar, Franziska; Primus, Beatrice

    2017-01-01

    We present an event-related potentials (ERP) study that addresses the question of how pieces of information pertaining to semantic roles and event structure interact with each other and with the verb's meaning. Specifically, our study investigates German verb-final clauses with verbs of motion such as fliegen 'fly' and schweben 'float, hover,' which are indeterminate with respect to agentivity and event structure. Agentivity was tested by manipulating the animacy of the subject noun phrase and event structure by selecting a goal adverbial, which makes the event telic, or a locative adverbial, which leads to an atelic reading. On the clause-initial subject, inanimates evoked an N400 effect vis-à-vis animates. On the adverbial phrase in the atelic (locative) condition, inanimates showed an N400 in comparison to animates. The telic (goal) condition exhibited an amplitude similar to that of the inanimate-atelic condition. Finally, at the verbal lexeme, the inanimate condition elicited an N400 effect against the animate condition in the telic (goal) contexts. In the atelic (locative) condition, items with animates evoked an N400 effect compared to inanimates. The combined set of findings suggests that clause-initial animacy is not sufficient for agent identification in German, which seems to be completed only at the verbal lexeme in our experiment. Here non-agents (inanimates) changing their location in a goal-directed way and agents (animates) lacking this property are dispreferred, and this challenges the assumption that change of (locational) state is generally a defining characteristic of the patient role. Besides this main finding that sheds new light on role prototypicality, our data seem to indicate effects that, in our view, are related to complexity, i.e., minimality. Inanimate subjects or goal arguments increase processing costs since they have role or event structure restrictions that animate subjects or locative modifiers lack.

  3. Calibration Techniques for Accurate Measurements by Underwater Camera Systems

    PubMed Central

    Shortis, Mark

    2015-01-01

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems. PMID:26690172

  4. Microseismic imaging using Geometric-mean Reverse-Time Migration in Hydraulic Fracturing Monitoring

    NASA Astrophysics Data System (ADS)

    Yin, J.; Ng, R.; Nakata, N.

    2017-12-01

    Unconventional oil and gas exploration techniques such as hydraulic fracturing are associated with microseismic events related to the generation and development of fractures. For example, hydraulic fracturing, which is popular in Southern Oklahoma, produces earthquakes that are greater than magnitude 2.0. Finding the accurate locations, and mechanisms, of these events provides important information on local stress conditions, fracture distribution, hazard assessment, and economic impact. Accurate source location is also important to separate fracking-induced from wastewater-disposal-induced seismicity. Here, we implement a wavefield-based imaging method called Geometric-mean Reverse-Time Migration (GmRTM), which takes advantage of wavefield back projection for accurate microseismic location. We apply GmRTM to microseismic data collected during hydraulic fracturing for imaging microseismic source locations and, potentially, fractures. Assuming an accurate velocity model, GmRTM can improve the spatial resolution of source locations compared to HypoDD or P/S travel-time based methods. We will discuss the results from GmRTM and HypoDD using this field dataset and synthetic data.
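
    A conceptual sketch of the geometric-mean imaging condition is given below; the wavefield back-propagation itself (reverse-time extrapolation in a velocity model) is assumed to have been done already and is not shown, and the exact normalization differs from the published method.

        import numpy as np

        def geometric_mean_image(backprop_fields):
            """
            backprop_fields : (n_receivers, n_timesteps, nz, nx) array of wavefields
            back-propagated (reverse-time extrapolated) from each receiver into the model.
            The imaging condition multiplies the wavefields of all receivers at every time
            step and stacks over time, so the image only focuses where all back-propagated
            fields coincide, i.e. at the microseismic source location.
            """
            product = np.prod(backprop_fields, axis=0)   # multiply across receivers
            return np.sum(np.abs(product), axis=0)       # stack over time -> (nz, nx) image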

  5. Lg-Wave Cross Correlation and Epicentral Double-Difference Location in and near China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaff, David P.; Richards, Paul G.; Slinkard, Megan

    In this paper, we perform epicentral relocations for a broad area using cross-correlation measurements made on Lg waves recorded at regional distances on a sparse station network. Using a two-step procedure (pairwise locations and cluster locations), we obtain final locations for 5623 events—3689 for all of China from 1985 to 2005 and 1934 for the Wenchuan area from May to August 2008. These high-quality locations comprise 20% of a starting catalog for all of China and 25% of a catalog for Wenchuan. Of the 1934 events located for Wenchuan, 1662 (86%) were newly detected. The final locations explain the residuals 89 times better than the catalog locations for all of China (a reduction from 3.7302 to 0.0417 s) and 32 times better than the catalog locations for Wenchuan (a reduction from 0.8413 to 0.0267 s). The average semimajor axes of the 95% confidence ellipses are 420 m for all of China and 370 m for Wenchuan. The average azimuthal gaps are 205° for all of China and 266° for Wenchuan. 98% of the station distances for all of China are over 200 km. The mean and maximum station distances are 898 and 2174 km. The robustness of our location estimates and various trade-offs and sensitivities are explored with different inversion parameters for the location, such as starting locations for iterative solutions and which singular values to include. Finally, our results provide order-of-magnitude improvements in locations for event clusters, using waveforms from a very sparse far-regional network for which data are openly available.

  6. Lg-Wave Cross Correlation and Epicentral Double-Difference Location in and near China

    DOE PAGES

    Schaff, David P.; Richards, Paul G.; Slinkard, Megan; ...

    2018-03-20

    In this paper, we perform epicentral relocations for a broad area using cross-correlation measurements made on Lg waves recorded at regional distances on a sparse station network. Using a two-step procedure (pairwise locations and cluster locations), we obtain final locations for 5623 events—3689 for all of China from 1985 to 2005 and 1934 for the Wenchuan area from May to August 2008. These high-quality locations comprise 20% of a starting catalog for all of China and 25% of a catalog for Wenchuan. Of the 1934 events located for Wenchuan, 1662 (86%) were newly detected. The final locations explain the residuals 89 times better than the catalog locations for all of China (a reduction from 3.7302 to 0.0417 s) and 32 times better than the catalog locations for Wenchuan (a reduction from 0.8413 to 0.0267 s). The average semimajor axes of the 95% confidence ellipses are 420 m for all of China and 370 m for Wenchuan. The average azimuthal gaps are 205° for all of China and 266° for Wenchuan. 98% of the station distances for all of China are over 200 km. The mean and maximum station distances are 898 and 2174 km. The robustness of our location estimates and various trade-offs and sensitivities are explored with different inversion parameters for the location, such as starting locations for iterative solutions and which singular values to include. Finally, our results provide order-of-magnitude improvements in locations for event clusters, using waveforms from a very sparse far-regional network for which data are openly available.
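
    One ingredient of such a catalog, the station-by-station differential time between two events measured by cross-correlating their Lg windows, might be sketched as follows; sub-sample interpolation and the correlation-coefficient quality threshold used in practice are omitted.

        import numpy as np
        from scipy.signal import correlate

        def lg_delay(trace_a, trace_b, dt):
            """
            Relative delay (s) between the Lg windows of two events recorded at the same
            station, measured by cross-correlation. Positive values mean trace_b arrives later.
            """
            cc = correlate(trace_b, trace_a, mode="full")
            lag = np.argmax(cc) - (len(trace_a) - 1)
            return lag * dt

        # These station-by-station differential times feed the pairwise and then cluster-level
        # double-difference inversions for relative epicenters.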

  7. Real-time locating systems (RTLS) in healthcare: a condensed primer

    PubMed Central

    2012-01-01

    Real-time locating systems (RTLS, also known as real-time location systems) have become an important component of many existing ubiquitous location aware systems. While GPS (global positioning system) has been quite successful as an outdoor real-time locating solution, it fails to repeat this success indoors. A number of RTLS technologies have been used to solve indoor tracking problems. The ability to accurately track the location of assets and individuals indoors has many applications in healthcare. This paper provides a condensed primer of RTLS in healthcare, briefly covering the many options and technologies that are involved, as well as the various possible applications of RTLS in healthcare facilities and their potential benefits, including capital expenditure reduction and workflow and patient throughput improvements. The key to a successful RTLS deployment lies in picking the right RTLS option(s) and solution(s) for the application(s) or problem(s) at hand. Where this application-technology match has not been carefully thought of, any technology will be doomed to failure or to achieving less than optimal results. PMID:22741760

  8. Real-time locating systems (RTLS) in healthcare: a condensed primer.

    PubMed

    Kamel Boulos, Maged N; Berry, Geoff

    2012-06-28

    Real-time locating systems (RTLS, also known as real-time location systems) have become an important component of many existing ubiquitous location aware systems. While GPS (global positioning system) has been quite successful as an outdoor real-time locating solution, it fails to repeat this success indoors. A number of RTLS technologies have been used to solve indoor tracking problems. The ability to accurately track the location of assets and individuals indoors has many applications in healthcare. This paper provides a condensed primer of RTLS in healthcare, briefly covering the many options and technologies that are involved, as well as the various possible applications of RTLS in healthcare facilities and their potential benefits, including capital expenditure reduction and workflow and patient throughput improvements. The key to a successful RTLS deployment lies in picking the right RTLS option(s) and solution(s) for the application(s) or problem(s) at hand. Where this application-technology match has not been carefully thought of, any technology will be doomed to failure or to achieving less than optimal results.

  9. 3D-Web-GIS RFID Location Sensing System for Construction Objects

    PubMed Central

    2013-01-01

    Construction site managers could benefit from being able to visualize on-site construction objects. Radio frequency identification (RFID) technology has been shown to improve the efficiency of construction object management. The objective of this study is to develop a 3D-Web-GIS RFID location sensing system for construction objects. An RFID 3D location sensing algorithm combining Simulated Annealing (SA) and a gradient descent method is proposed to determine target object location. In the algorithm, SA is used to stabilize the search process and the gradient descent method is used to reduce errors. The locations of the analyzed objects are visualized using the 3D-Web-GIS system. A real construction site is used to validate the applicability of the proposed method, with results indicating that the proposed approach can provide faster, more accurate, and more stable 3D positioning results than other location sensing algorithms. The proposed system allows construction managers to better understand worksite status, thus enhancing managerial efficiency. PMID:23864821

  10. 3D-Web-GIS RFID location sensing system for construction objects.

    PubMed

    Ko, Chien-Ho

    2013-01-01

    Construction site managers could benefit from being able to visualize on-site construction objects. Radio frequency identification (RFID) technology has been shown to improve the efficiency of construction object management. The objective of this study is to develop a 3D-Web-GIS RFID location sensing system for construction objects. An RFID 3D location sensing algorithm combining Simulated Annealing (SA) and a gradient descent method is proposed to determine target object location. In the algorithm, SA is used to stabilize the search process and the gradient descent method is used to reduce errors. The locations of the analyzed objects are visualized using the 3D-Web-GIS system. A real construction site is used to validate the applicability of the proposed method, with results indicating that the proposed approach can provide faster, more accurate, and more stable 3D positioning results than other location sensing algorithms. The proposed system allows construction managers to better understand worksite status, thus enhancing managerial efficiency.
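
    The combination of a global stochastic search with a local gradient refinement can be sketched for a toy trilateration problem as below; the reader coordinates, RSSI-derived ranges, and cooling schedule are invented for illustration and do not reproduce the paper's algorithm.

        import numpy as np

        readers = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 3.0], [0.0, 10.0, 3.0], [10.0, 10.0, 0.0]])
        ranges = np.array([7.1, 6.2, 5.8, 6.9])   # ranges (m) derived from RSSI, hypothetical values

        def cost(p):
            return np.sum((np.linalg.norm(readers - p, axis=1) - ranges) ** 2)

        def grad(p, h=1e-5):
            return np.array([(cost(p + h * e) - cost(p - h * e)) / (2 * h) for e in np.eye(3)])

        rng = np.random.default_rng(3)
        p, temperature = np.array([5.0, 5.0, 1.5]), 1.0
        for _ in range(2000):          # simulated annealing: global search, stabilizes the estimate
            trial = p + rng.normal(scale=0.5, size=3)
            if cost(trial) < cost(p) or rng.random() < np.exp((cost(p) - cost(trial)) / temperature):
                p = trial
            temperature *= 0.995
        for _ in range(200):           # gradient descent: local refinement, reduces the residual error
            p -= 0.01 * grad(p)
        print("estimated tag position (m):", np.round(p, 2))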

  11. Adaptive learning compressive tracking based on Markov location prediction

    NASA Astrophysics Data System (ADS)

    Zhou, Xingyu; Fu, Dongmei; Yang, Tao; Shi, Yanan

    2017-03-01

    Object tracking is an interdisciplinary research topic in image processing, pattern recognition, and computer vision, with theoretical and practical applications in video surveillance, virtual reality, and automatic navigation. Compressive tracking (CT) has many advantages, such as efficiency and accuracy. However, in the presence of object occlusion, abrupt motion and blur, similar objects, or scale changes, CT suffers from tracking drift. We propose Markov location prediction to obtain the initial position of the object. CT is then used to locate the object accurately, and an adaptive classifier-parameter updating strategy is given based on the confidence map. At the same time, scale features are extracted according to the object location, which makes it possible to handle object scale variations effectively. Experimental results show that the proposed algorithm has better tracking accuracy and robustness than current advanced algorithms and achieves real-time performance.

  12. Emergency Physicians as Good Samaritans: Survey of Frequency, Locations, Supplies and Medications.

    PubMed

    Burkholder, Taylor W; King, Renee A

    2016-01-01

    Little is known about the frequency and locations in which emergency physicians (EPs) are bystanders to an accident or emergency; equally uncertain is which contents of an "emergency kit" may be useful during such events. The aim of this study was to describe the frequency and locations of Good Samaritan acts by EPs and also determine which emergency kit supplies and medications were most commonly used by Good Samaritans. We conducted an electronic survey among a convenience sample of EPs in Colorado. Respondents reported a median frequency of 2.0 Good Samaritan acts per five years of practice, with the most common locations being sports and entertainment events (25%), road traffic accidents (21%), and wilderness settings (19%). Of those who had acted as Good Samaritans, 86% reported that at least one supply would have been useful during the most recent event, and 66% reported at least one medication would have been useful. The most useful supplies were gloves (54%), dressings (34%), and a stethoscope (20%), while the most useful medications were oxygen (19%), intravenous fluids (17%), and epinephrine (14%). The majority of EPs can expect to provide Good Samaritan care during their careers and would be better prepared by carrying a kit with common supplies and medications where they are most likely to use them.

  13. Relation Between Sprite Distribution and Source Locations of VHF Pulses Derived From JEM- GLIMS Measurements

    NASA Astrophysics Data System (ADS)

    Sato, Mitsuteru; Mihara, Masahiro; Ushio, Tomoo; Morimoto, Takeshi; Kikuchi, Hiroshi; Adachi, Toru; Suzuki, Makoto; Yamazaki, Atsushi; Takahashi, Yukihiro

    2015-04-01

    JEM-GLIMS has been carrying out comprehensive nadir observations of lightning and TLEs using optical instruments and electromagnetic wave receivers since November 2012. Between November 20, 2012 and November 30, 2014, JEM-GLIMS succeeded in detecting 5,048 lightning events. Of these, 567 were TLEs, mostly elves. To identify sprite occurrences from the transient optical flash data, it is necessary to perform the following data analysis: (1) subtraction of the appropriately scaled wideband camera data from the narrowband camera data; (2) calculation of the intensity ratio between different spectrophotometer channels; and (3) estimation of the polarization and CMC of the parent CG discharges using ground-based ELF measurement data. From a synthetic comparison of these results, it is confirmed that JEM-GLIMS succeeded in detecting sprite events. The VHF receiver (VITF) onboard JEM-GLIMS uses two patch-type antennas separated by a 1.6-m interval and can detect VHF pulses emitted by lightning discharges in the 70-100 MHz frequency range. Using both an interferometric technique and a group delay technique, we can estimate the source locations of VHF pulses excited by lightning discharges. In the event detected at 06:41:15.68565 UT on June 12, 2014 over central North America, the sprite was displaced horizontally by 20 km from the peak location of the parent lightning emission. In this event, a total of 180 VHF pulses were simultaneously detected by VITF. Detailed analysis of these VHF pulse data shows that the majority of the source locations were placed near the area of the dim lightning emission, which may imply that the VHF pulses were associated with the in-cloud lightning current. At the presentation, we will show a detailed comparison between the spatiotemporal characteristics of the sprite emission and the source locations of VHF pulses excited by the parent lightning

  14. A multi-subject evaluation of uncertainty in anatomical landmark location on shoulder kinematic description.

    PubMed

    Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J

    2009-04-01

    An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1 and 99 percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
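
    The probabilistic treatment can be imitated with a small Monte Carlo sketch in which landmark coordinates are perturbed with a 4 mm standard deviation and a surrogate orientation angle is recomputed; the landmark values and the angle definition are placeholders for the full upper-extremity kinematic model used in the study.

        import numpy as np

        # Simplified placeholder: three landmarks (mm) defining a plane whose orientation
        # angle stands in for one Euler angle of the real kinematic description.
        landmarks = np.array([[0.0, 0.0, 0.0], [120.0, 10.0, 0.0], [30.0, 140.0, 20.0]])

        def elevation_angle(pts):
            normal = np.cross(pts[1] - pts[0], pts[2] - pts[0])
            normal /= np.linalg.norm(normal)
            return np.degrees(np.arccos(abs(normal[2])))   # angle of the plane normal to vertical

        rng = np.random.default_rng(4)
        sigma = 4.0                                        # landmark uncertainty (mm), as in the study
        angles = np.array([elevation_angle(landmarks + rng.normal(scale=sigma, size=landmarks.shape))
                           for _ in range(5000)])
        lo, hi = np.percentile(angles, [1, 99])
        print(f"1-99 percentile envelope: {hi - lo:.1f} degrees")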

  15. Single Event Effect Testing of the Analog Devices ADV212

    NASA Technical Reports Server (NTRS)

    Wilcox, Ted; Campola, Michael; Kadari, Madhu; Nadendla, Seshagiri R.

    2017-01-01

    The Analog Devices ADV212 was initially tested for single event effects (SEE) at the Texas A&M University Cyclotron Facility (TAMU) in July of 2013. Testing revealed a sensitivity to device hang-ups classified as single event functional interrupts (SEFI), soft data errors classified as single event upsets (SEU), and, of particular concern, single event latch-ups (SEL). All error types occurred so frequently as to make accurate measurements of the exposure time, and thus total particle fluence, challenging. To mitigate some of the risk posed by single event latch-ups, circuitry was added to the electrical design to detect a high current event and automatically recycle power and reboot the device. An additional heavy-ion test was scheduled to validate the operation of the recovery circuitry and the continuing functionality of the ADV212 after a substantial number of latch-up events. As a secondary goal, more precise data would be gathered by an improved test method, described in this test report.

  16. 49 CFR 37.91 - Wheelchair locations and food service on intercity rail trains.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 37.91 Wheelchair locations and food service on intercity rail trains. (a) As soon as practicable, but in no event later than July 26, 1995, each person providing intercity rail service shall provide...

  17. Physical Therapists Make Accurate and Appropriate Discharge Recommendations for Patients Who Are Acutely Ill

    PubMed Central

    Fields, Christina J.; Fernandez, Natalia

    2010-01-01

    Background Acute care physical therapists contribute to the complex process of patient discharge planning. As physical therapists are experts at evaluating functional abilities and are able to incorporate various other factors relevant to discharge planning, it was expected that physical therapists’ recommendations of patient discharge location would be both accurate and appropriate. Objective This study determined how often the therapists’ recommendations for patient discharge location and services were implemented, representing the accuracy of the recommendations. The impact of unimplemented recommendations on readmission rate was examined, reflecting the appropriateness of the recommendations. Design This retrospective study included the discharge recommendations of 40 acute care physical therapists for 762 patients in a large academic medical center. The frequency of mismatch between the physical therapist's recommendation and the patient's actual discharge location and services was calculated. The mismatch variable had 3 levels: match, mismatch with services lacking, or mismatch with different services. Regression analysis was used to test whether mismatch status, patient age, length of admission, or discharge location predicted patient readmittance. Results Overall, physical therapists’ discharge recommendations were implemented 83% of the time. Patients were 2.9 times more likely to be readmitted when the therapist's discharge recommendation was not implemented and recommended follow-up services were lacking (mismatch with services lacking) compared with patients with a match. Limitations This study was limited to one facility. Limited information about the patients was collected, and data on patient readmission to other facilities were not collected. Conclusions This study supports the role of physical therapists in discharge planning in the acute care setting. Physical therapists demonstrated the ability to make accurate and appropriate discharge

  18. Swarms of small volcano-tectonic events preceding paroxysmal explosions of Tungurahua volcano (Ecuador)

    NASA Astrophysics Data System (ADS)

    Battaglia, J.; Hidalgo, S.; Douchain, J. M.; Pacheco, D. A.; Cordova, J.; Alvarado, A. P.; Parra, R.

    2017-12-01

    Tungurahua (5023 m a.s.l.) is an andesitic volcano located in central Ecuador. It has been erupting since September 1999. Its activity transitioned in late 2008 towards the occurrence of distinct eruptive phases separated by periods of quiescence. These phases display a great variability of eruptive patterns. In particular, the onsets of these phases are quite variable, ranging from a progressive increase of surface activity to violent paroxysmal explosions eventually generating pyroclastic flows and plumes up to 13,000 m elevation. The volcano is monitored by the Instituto Geofisico in Quito, whose permanent monitoring network includes 6 broadband and 6 short-period stations. These instruments record various signals related to eruptive processes as well as long-period and volcano-tectonic (VT) events. However, most of the VT events are scattered around the volcano at depths up to 5-10 km b.s.l. Their relationship with eruptive activity and their precursory value are unclear. Since October 2013, we have operated a temporary network of 13 broadband stations located up to 4275 m a.s.l., including on the eastern flank, which is remote. We examined data from a reference station located near the summit (3900 m a.s.l.) with a detection and classification procedure, searching for families of similar events. This processing highlights the presence of several families of small VTs that were previously poorly identified. We manually located some of these events and proceeded with similarity-based picking using cross-correlation and waveform similarity for nearly 400 events. Finally, we applied precise relocation techniques. These events are located 2-3 km below the summit and define vertically elongated streaks. Their temporal evolution shows that they occur in swarms during the days or hours preceding the paroxysmal vent-opening explosions of February and April 2014. These short-term precursors could indicate the rupturing of a barrier prior to the large explosions of Tungurahua.
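
    The detection-and-classification step, scanning the reference station's continuous record with a waveform template and grouping matches into a family, can be sketched as follows; the correlation threshold is an assumed value and the network-wide stacking used in practice is omitted.

        import numpy as np

        def scan_with_template(continuous, template, threshold=0.8):
            """
            Normalized cross-correlation of a waveform template against a continuous trace.
            Sample indices where the correlation coefficient exceeds `threshold` are treated
            as new members of the same event family.
            """
            n = len(template)
            t = (template - template.mean()) / template.std()
            detections = []
            for i in range(len(continuous) - n):
                w = continuous[i:i + n]
                s = w.std()
                if s == 0:
                    continue
                cc = np.dot((w - w.mean()) / s, t) / n      # correlation coefficient in [-1, 1]
                if cc > threshold:
                    detections.append((i, cc))
            return detections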

  19. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling

    PubMed Central

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle, and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real time and often completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of

  20. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    PubMed

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle, and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real time and often completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of
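
    The burst-detection step applied to each divided stream can be illustrated with a simple trailing z-score rule over per-minute message counts; the window length and threshold are assumptions and do not correspond to the framework's actual algorithm.

        import numpy as np

        def detect_bursts(counts, window=30, k=3.0):
            """
            Flag minutes whose message count exceeds the trailing mean by k standard deviations.
            `counts` is a per-minute count of posts in one divided (sub-)stream; this simple
            z-score rule stands in for the burst detection stage of the framework.
            """
            counts = np.asarray(counts, dtype=float)
            bursts = []
            for i in range(window, len(counts)):
                past = counts[i - window:i]
                sigma = past.std()
                if sigma == 0:
                    sigma = 1.0
                if (counts[i] - past.mean()) / sigma > k:
                    bursts.append(i)
            return bursts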

  1. Real-time Accurate Surface Reconstruction Pipeline for Vision Guided Planetary Exploration Using Unmanned Ground and Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    Almeida, Eduardo DeBrito

    2012-01-01

    This report discusses work completed over the summer at the Jet Propulsion Laboratory (JPL), California Institute of Technology. A system is presented to guide ground or aerial unmanned robots using computer vision. The system performs accurate camera calibration, camera pose refinement and surface extraction from images collected by a camera mounted on the vehicle. The application motivating the research is planetary exploration and the vehicles are typically rovers or unmanned aerial vehicles. The information extracted from imagery is used primarily for navigation, as robot location is the same as the camera location and the surfaces represent the terrain that rovers traverse. The processed information must be very accurate and acquired very fast in order to be useful in practice. The main challenge being addressed by this project is to achieve high estimation accuracy and high computation speed simultaneously, a difficult task due to many technical reasons.

  2. How accurately can other people infer your thoughts-And does culture matter?

    PubMed

    Valanides, Constantinos; Sheppard, Elizabeth; Mitchell, Peter

    2017-01-01

    This research investigated how accurately people infer what others are thinking after observing a brief sample of their behaviour and whether culture/similarity is a relevant factor. Target participants (14 British and 14 Mediterraneans) were cued to think about either positive or negative events they had experienced. Subsequently, perceiver participants (16 British and 16 Mediterraneans) watched videos of the targets thinking about these things. Perceivers (both groups) were significantly accurate in judging when targets had been cued to think of something positive versus something negative, indicating notable inferential ability. Additionally, Mediterranean perceivers were better than British perceivers in making such inferences, irrespective of nationality of the targets, something that was statistically accounted for by corresponding group differences in levels of independently measured collectivism. The results point to the need for further research to investigate the possibility that being reared in a collectivist culture fosters ability in interpreting others' behaviour.

  3. How accurately can other people infer your thoughts—And does culture matter?

    PubMed Central

    Valanides, Constantinos; Sheppard, Elizabeth; Mitchell, Peter

    2017-01-01

    This research investigated how accurately people infer what others are thinking after observing a brief sample of their behaviour and whether culture/similarity is a relevant factor. Target participants (14 British and 14 Mediterraneans) were cued to think about either positive or negative events they had experienced. Subsequently, perceiver participants (16 British and 16 Mediterraneans) watched videos of the targets thinking about these things. Perceivers (both groups) were significantly accurate in judging when targets had been cued to think of something positive versus something negative, indicating notable inferential ability. Additionally, Mediterranean perceivers were better than British perceivers in making such inferences, irrespective of nationality of the targets, something that was statistically accounted for by corresponding group differences in levels of independently measured collectivism. The results point to the need for further research to investigate the possibility that being reared in a collectivist culture fosters ability in interpreting others’ behaviour. PMID:29112972

  4. Text Content Pushing Technology Research Based on Location and Topic

    NASA Astrophysics Data System (ADS)

    Wei, Dongqi; Wei, Jianxin; Wumuti, Naheman; Jiang, Baode

    2016-11-01

    In the field, geological workers usually want to obtain the relevant geological background information for their working area quickly and accurately. This information exists within massive geological data holdings, a large proportion of which are text data written in natural language. This paper studies a method for extracting location information from these mass text data, proposes a geographic location-geological content correlation algorithm based on Spark and Mapreduce2, classifies the content using KNN, and builds a content pushing system based on location and topic. The system runs in the geological survey cloud, and testing with real geological data has shown good results.

  5. Incidental retrieval-induced forgetting of location information.

    PubMed

    Gómez-Ariza, Carlos J; Fernandez, Angel; Bajo, M Teresa

    2012-06-01

    Retrieval-induced forgetting (RIF) has been studied with different types of tests and materials. However, RIF has always been tested on the items' central features, and there is no information on whether inhibition also extends to peripheral features of the events in which the items are embedded. In two experiments, we specifically tested the presence of RIF in a task in which recall of peripheral information was required. After a standard retrieval practice task oriented to item identity, participants were cued with colors (Exp. 1) or with the items themselves (Exp. 2) and asked to recall the screen locations where the items had been displayed during the study phase. RIF for locations was observed after retrieval practice, an effect that was not present when participants were asked to read, rather than retrieve, the items. Our findings provide evidence that peripheral location information associated with an item during study can also be inhibited when the retrieval conditions promote the inhibition of more central, item identity information.

  6. Bayesian analysis of rare events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.

  7. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
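
    As a loose illustration of the rejection-sampling view of Bayesian updating that BUS builds on, the toy sketch below draws from a prior, accepts samples in proportion to a bounded likelihood, and then estimates a small exceedance probability under the resulting posterior with crude Monte Carlo. The toy Gaussian model, the constant c, the threshold, and the sample sizes are all assumptions for illustration; the paper's point is precisely that FORM, importance sampling, or Subset Simulation replace this brute-force step when the event is genuinely rare.

```python
# Toy sketch: rejection-sampling Bayesian updating followed by a crude Monte
# Carlo rare-event estimate. Everything here (model, data, thresholds) is an
# illustrative assumption, not the BUS implementation from the abstract.
import math
import random

def likelihood(theta, data, sigma=0.5):
    # Gaussian measurement likelihood for a scalar parameter (toy model), <= 1.
    return math.exp(-sum((d - theta) ** 2 for d in data) / (2 * sigma ** 2))

def posterior_samples(data, n=200_000):
    # Prior: theta ~ N(0, 1). Accept a prior draw with probability L(theta)/c,
    # where c = 1 bounds the (unnormalized) likelihood above.
    c = 1.0
    samples = []
    for _ in range(n):
        theta = random.gauss(0.0, 1.0)
        if random.random() * c < likelihood(theta, data):
            samples.append(theta)
    return samples

def rare_event_probability(samples, threshold=2.8, noise=0.2):
    # "Failure" when the noisy system response exceeds a high threshold.
    # Crude Monte Carlo is noisy for small probabilities, which is exactly
    # why FORM / IS / SuS are preferred in the BUS framework.
    hits = sum(1 for th in samples if th + random.gauss(0.0, noise) > threshold)
    return hits / len(samples)

if __name__ == "__main__":
    data = [1.8, 2.1, 2.0]                 # toy observations
    post = posterior_samples(data)
    print(len(post), "posterior samples accepted")
    print("posterior exceedance probability ~", rare_event_probability(post))
```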

  8. Out with the new, in with the old: Exogenous orienting to locations with physically constant stimulation.

    PubMed

    Taylor, J Eric T; Hilchey, Matthew D; Pratt, Jay

    2018-01-24

    Dominant methods of investigating exogenous orienting presume that attention is captured most effectively at locations containing new events. This is evidenced by the ubiquitous use of transient stimuli as cues in the literature on exogenous orienting. In the present study, we showed that attention can be oriented exogenously toward a location containing a completely unchanging stimulus by modifying Posner's landmark exogenous spatial-cueing paradigm. Observers searched a six-element array of placeholder stimuli for an onset target. The target was preceded by a decrement in luminance to five of the six placeholders, such that one location remained physically constant. This "nonset" stimulus (so named to distinguish it from a traditional onsetting transient) acted as an exogenous cue, eliciting patterns of facilitation and inhibition at the nonset location and demonstrating that exogenous orienting is not always evident at the location of a visual transient. This method eliminates the decades-long confounding of orienting to a location with the processing of new events at that location, permitting alternative considerations of the nature of attentional selection.

  9. Accurate high-throughput structure mapping and prediction with transition metal ion FRET

    PubMed Central

    Yu, Xiaozhen; Wu, Xiongwu; Bermejo, Guillermo A.; Brooks, Bernard R.; Taraska, Justin W.

    2013-01-01

    Mapping the landscape of a protein’s conformational space is essential to understanding its functions and regulation. The limitations of many structural methods have made this process challenging for most proteins. Here, we report that transition metal ion FRET (tmFRET) can be used in a rapid, highly parallel screen, to determine distances from multiple locations within a protein at extremely low concentrations. The distances generated through this screen for the protein Maltose Binding Protein (MBP) match distances from the crystal structure to within a few angstroms. Furthermore, energy transfer accurately detects structural changes during ligand binding. Finally, fluorescence-derived distances can be used to guide molecular simulations to find low energy states. Our results open the door to rapid, accurate mapping and prediction of protein structures at low concentrations, in large complex systems, and in living cells. PMID:23273426

  10. Single event burnout sensitivity of embedded field effect transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koga, R.; Crain, S.H.; Crawford, K.B.

    Observations of single event burnout (SEB) in embedded field effect transistors are reported. Both SEB and other single event effects are presented for several pulse width modulation and high frequency devices. The microscope has been employed to locate and to investigate the damaged areas. A model of the damage mechanism based on the results so obtained is described.

  11. Single event burnout sensitivity of embedded field effect transistors

    NASA Astrophysics Data System (ADS)

    Koga, R.; Crain, S. H.; Crawford, K. B.; Yu, P.; Gordon, M. J.

    1999-12-01

    Observations of single event burnout (SEB) in embedded field effect transistors are reported. Both SEB and other single event effects are presented for several pulse width modulation and high frequency devices. The microscope has been employed to locate and to investigate the damaged areas. A model of the damage mechanism based on the results so obtained is described.

  12. Locating dayside magnetopause reconnection with exhaust ion distributions

    NASA Astrophysics Data System (ADS)

    Broll, J. M.; Fuselier, S. A.; Trattner, K. J.

    2017-05-01

    Magnetic reconnection at Earth's dayside magnetopause is essential to magnetospheric dynamics. Determining where reconnection takes place is important to understanding the processes involved, and many questions about reconnection location remain unanswered. We present a method for locating the magnetic reconnection X line at Earth's dayside magnetopause under southward interplanetary magnetic field conditions using only ion velocity distribution measurements. Particle-in-cell simulations based on Cluster magnetopause crossings produce ion velocity distributions that we propagate through a model magnetosphere, allowing us to calculate the field-aligned distance between an exhaust observation and its associated reconnection line. We demonstrate this procedure for two events and compare our results with those of the Maximum Magnetic Shear Model; we find good agreement with its results and show that when our method is applicable, it produces more precise locations than the Maximum Shear Model.

  13. High-speed event detector for embedded nanopore bio-systems.

    PubMed

    Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie

    2015-08-01

    Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods.
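
    The paper's detector itself is not reproduced in the abstract; the sketch below is a generic baseline-and-threshold event picker for a noisy current trace, of the kind such embedded systems approximate. The sampling rate, smoothing length, and blockade depth `delta` are illustrative assumptions.

```python
# Illustrative threshold-based event detector for a noisy current trace
# (not the on-line detector proposed in the paper). An "event" is a run of
# samples whose smoothed value drops below the open-pore baseline by `delta`.
import numpy as np

def detect_events(trace, fs, baseline=None, delta=30.0, smooth_ms=0.05):
    """Return (start, end) sample-index pairs of blockade events.

    trace : 1-D array of current samples (e.g. pA)
    fs    : sampling rate in Hz
    """
    if baseline is None:
        baseline = np.median(trace)             # open-pore current estimate
    n = max(1, int(smooth_ms * 1e-3 * fs))      # boxcar length in samples
    smoothed = np.convolve(trace, np.ones(n) / n, mode="same")
    below = smoothed < baseline - delta         # boolean mask of blockades
    edges = np.flatnonzero(np.diff(below.astype(int)))
    if below[0]:
        edges = np.r_[0, edges]
    if below[-1]:
        edges = np.r_[edges, below.size - 1]
    return list(zip(edges[0::2], edges[1::2]))

if __name__ == "__main__":
    fs = 250_000
    rng = np.random.default_rng(0)
    trace = 200 + 5 * rng.standard_normal(fs // 10)   # 100 ms of open-pore current
    trace[10_000:10_500] -= 80                        # one synthetic blockade
    print(detect_events(trace, fs))                   # roughly one event near sample 10,000
```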

  14. Emergency Locator Transmitter Crash Testing

    NASA Image and Video Library

    2015-07-29

    Drop-testing a series of three Cessna 172 aircraft, NASA simulated severe but survivable plane accidents on July 2, July 29 and August 26, 2015, to test emergency locator transmitters (ELTs). A research team equipped the vintage airplanes with five ELTs, two crash test dummies, cameras and data-collecting sensors. ELTs are installed on general aviation and commercial planes to transmit a location signal in the event of a crash. Current ELT models send that signal to orbiting satellites, which repeat it to the nearest search and rescue ground station. The signal is used to determine and transmit the ELT's identity and location to rescuers. ELTs have to work in the extreme circumstances involved in an airplane crash. Included in those extreme circumstances are the possibilities of excessive vibration, fire and impact damage. NASA research is designed to find practical ways to improve ELT system performance and robustness, giving rescue workers the best chance of saving lives. The research was funded by the Search and Rescue Mission Office at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The testing took place at NASA’s Langley Research Center in Hampton, Virginia. This is a video of the July 29, 2015, test.

  15. GRMHD Simulations of Visibility Amplitude Variability for Event Horizon Telescope Images of Sgr A*

    NASA Astrophysics Data System (ADS)

    Medeiros, Lia; Chan, Chi-kwan; Özel, Feryal; Psaltis, Dimitrios; Kim, Junhan; Marrone, Daniel P.; Sądowski, Aleksander

    2018-04-01

    The Event Horizon Telescope will generate horizon scale images of the black hole in the center of the Milky Way, Sgr A*. Image reconstruction using interferometric visibilities rests on the assumption of a stationary image. We explore the limitations of this assumption using high-cadence disk- and jet-dominated GRMHD simulations of Sgr A*. We also employ analytic models that capture the basic characteristics of the images to understand the origin of the variability in the simulated visibility amplitudes. We find that, in all simulations, the visibility amplitudes for baselines oriented parallel and perpendicular to the spin axis of the black hole follow general trends that do not depend strongly on accretion-flow properties. This suggests that fitting Event Horizon Telescope observations with simple geometric models may lead to a reasonably accurate determination of the orientation of the black hole on the plane of the sky. However, in the disk-dominated models, the locations and depths of the minima in the visibility amplitudes are highly variable and are not related simply to the size of the black hole shadow. This suggests that using time-independent models to infer additional black hole parameters, such as the shadow size or the spin magnitude, will be severely affected by the variability of the accretion flow.

  16. The Importance of "What": Infants Use Featural Information to Index Events

    ERIC Educational Resources Information Center

    Kirkham, Natasha Z.; Richardson, Daniel C.; Wu, Rachel; Johnson, Scott P.

    2012-01-01

    Dynamic spatial indexing is the ability to encode, remember, and track the location of complex events. For example, in a previous study, 6-month-old infants were familiarized to a toy making a particular sound in a particular location, and later they fixated that empty location when they heard the sound presented alone ("Journal of Experimental…

  17. An FBG acoustic emission source locating system based on PHAT and GA

    NASA Astrophysics Data System (ADS)

    Shen, Jing-shi; Zeng, Xiao-dong; Li, Wei; Jiang, Ming-shun

    2017-09-01

    Using acoustic emission locating technology to monitor structural health is important for ensuring the continuous and healthy operation of complex engineering structures and large mechanical equipment. In this paper, four fiber Bragg grating (FBG) sensors are used to establish a sensor array for locating the acoustic emission source. Firstly, the nonlinear locating equations are established based on the principles of acoustic emission, and the solution of these equations is transformed into an optimization problem. Secondly, a time-difference extraction algorithm based on phase transform (PHAT)-weighted generalized cross correlation provides the necessary conditions for accurate localization. Finally, a genetic algorithm (GA) is used to solve the optimization model. Twenty points are tested on a marble plate surface, and the results show that the absolute locating error is within 10 mm, which demonstrates the accuracy of this locating method.
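
    As a generic sketch of the PHAT-weighted generalized cross correlation named above (the FBG sensor array and the GA-based source search are not shown), the snippet below estimates the time difference of arrival between two sensors; the signal parameters are illustrative assumptions.

```python
# Minimal GCC-PHAT time-delay estimate between two sensor signals
# (a generic sketch of PHAT-weighted generalized cross correlation,
# not the paper's FBG/GA localization pipeline).
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None, interp=1):
    n = sig.size + ref.size
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12                     # PHAT weighting: keep phase only
    cc = np.fft.irfft(R, n=interp * n)
    max_shift = interp * n // 2
    if max_tau is not None:
        max_shift = min(int(interp * fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / float(interp * fs)          # delay in seconds

if __name__ == "__main__":
    fs = 1_000_000
    t = np.arange(2048) / fs
    burst = np.exp(-((t - 5e-4) ** 2) / (2 * (2e-5) ** 2)) * np.sin(2 * np.pi * 1e5 * t)
    delayed = np.roll(burst, 37)               # simulate a 37-sample later arrival
    print(gcc_phat(delayed, burst, fs))        # ~ 3.7e-5 s
```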

  18. Event-based user classification in Weibo media.

    PubMed

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as real-time microblogging services, have attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. In this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their content, this paper addresses the task of user classification with a more granular, event-based approach. By analyzing real data collected from Sina Weibo, we investigate Weibo's properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  19. HFT events - Shallow moonquakes. [High-Frequency Teleseismic

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.

    1977-01-01

    A few large distant seismic events of distinctly high signal frequency, designated HFT (high-frequency teleseismic) events, are observed yearly by the Apollo lunar seismic network. Their sources are located on or near the surface of the moon, leaving a large gap in seismic activity between the zones of HFT sources and deep moonquakes. No strong regularities are found in either their spatial or temporal distributions. Several working hypotheses for the identity of these sources have been advanced, but many characteristics of the events seem to favor a hypothesis that they are shallow moonquakes. Simultaneous observations of other lunar phenomena may eventually enable the determination of their true identity.

  20. A probabilistic framework for single-station location of seismicity on Earth and Mars

    NASA Astrophysics Data System (ADS)

    Böse, M.; Clinton, J. F.; Ceylan, S.; Euchner, F.; van Driel, M.; Khan, A.; Giardini, D.; Lognonné, P.; Banerdt, W. B.

    2017-01-01

    Locating the source of seismic energy from a single three-component seismic station is associated with large uncertainties, originating from challenges in identifying seismic phases, as well as inevitable pick and model uncertainties. The challenge is even higher for planets such as Mars, where interior structure is a priori largely unknown. In this study, we address the single-station location problem by developing a probabilistic framework that combines location estimates from multiple algorithms to estimate the probability density function (PDF) for epicentral distance, back azimuth, and origin time. Each algorithm uses independent and complementary information in the seismic signals. Together, the algorithms allow locating seismicity ranging from local to teleseismic quakes. Distances and origin times of large regional and teleseismic events (M > 5.5) are estimated from observed and theoretical body- and multi-orbit surface-wave travel times. The latter are picked from the maxima in the waveform envelopes in various frequency bands. For smaller events at local and regional distances, only first arrival picks of body waves are used, possibly in combination with fundamental Rayleigh R1 waveform maxima where detectable; depth phases, such as pP or PmP, help constrain source depth and improve distance estimates. Back azimuth is determined from the polarization of the Rayleigh- and/or P-wave phases. When seismic signals are good enough for multiple approaches to be used, estimates from the various methods are combined through the product of their PDFs, resulting in an improved event location and reduced uncertainty range estimate compared to the results obtained from each algorithm independently. To verify our approach, we use both earthquake recordings from existing Earth stations and synthetic Martian seismograms. The Mars synthetics are generated with a full-waveform scheme (AxiSEM) using spherically-symmetric seismic velocity, density and attenuation models of
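
    The combination step described above (multiplying the per-algorithm PDFs and renormalizing) can be sketched generically as below; the Gaussian component PDFs stand in for the body-wave, surface-wave, and polarization-based estimates and are purely illustrative.

```python
# Sketch of combining independent single-station distance estimates by taking
# the product of their PDFs over a common grid and renormalizing. The Gaussian
# components below are placeholders, not the paper's algorithm outputs.
import numpy as np

def gaussian_pdf(grid, mean, sigma):
    return np.exp(-0.5 * ((grid - mean) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def combine_pdfs(grid, pdfs):
    combined = np.ones_like(grid)
    for p in pdfs:
        combined *= p
    area = combined.sum() * (grid[1] - grid[0])   # uniform-grid normalization
    return combined / area if area > 0 else combined

if __name__ == "__main__":
    distance = np.linspace(0, 180, 2000)              # epicentral distance, degrees
    est_body    = gaussian_pdf(distance, 62.0, 8.0)   # e.g. from body-wave times
    est_surface = gaussian_pdf(distance, 58.0, 5.0)   # e.g. from surface-wave envelopes
    posterior = combine_pdfs(distance, [est_body, est_surface])
    print("best distance:", distance[np.argmax(posterior)])   # ~ 59 degrees
```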

  1. Greedy Sparse Approaches for Homological Coverage in Location Unaware Sensor Networks

    DTIC Science & Technology

    2017-12-08

    ARL-TR-8235, December 2017. US Army Research Laboratory. Greedy Sparse Approaches for Homological Coverage in Location-Unaware Sensor Networks, by Terrence J Moore. [The record excerpt consists of report front matter and a reference-list fragment, including: Farah C, Schwaner F, Abedi A, Worboys M. Distributed homology algorithm to detect topological events. GlobalSIP; 2013 Dec; Austin, TX. p. 595-598.]

  2. Using PVDF to locate the debris cloud impact position

    NASA Astrophysics Data System (ADS)

    Pang, Baojun; Liu, Zhidong

    2010-03-01

    With the increase in space activities, the space debris environment has deteriorated. When space debris impacts the shields of a spacecraft, it creates a debris cloud, and this debris cloud is a threat to the module wall. In order to assess the damage to a spacecraft module wall impacted by a debris cloud, the damage position must be known. In order to design a lightweight location system, polyvinylidene fluoride (PVDF) has been studied. Hyper-velocity impact experiments were conducted using a two-stage light gas gun. The experimental results indicate that the virtual wave-front location method can be extended to debris cloud impact location, that PVDF can be used to locate the damage position effectively, and that the signals gathered by PVDF from a debris cloud impact contain more high-frequency components than the signals created by a single-projectile impact event. The results provide a reference for the development of sensor systems to detect impacts on spacecraft.

  3. Predicting Tree Mortality Die-off Events Associated with Hotter Drought and Assessing Their Global Consequences via Ecoclimate Teleconnections.

    NASA Astrophysics Data System (ADS)

    Breshears, D. D.; Allen, C. D.; McDowell, N. G.; Adams, H. D.; Barnes, M.; Barron-Gafford, G.; Bradford, J. B.; Cobb, N.; Field, J. P.; Froend, R.; Fontaine, J. B.; Garcia, E.; Hardy, G. E. S. J.; Huxman, T. E.; Kala, J.; Lague, M. M.; Martinez-Yrizar, A.; Matusick, G.; Minor, D. M.; Moore, D. J.; Ng, M.; Ruthrof, K. X.; Saleska, S. R.; Stark, S. C.; Swann, A. L. S.; Villegas, J. C.; Williams, A. P.; Zou, C.

    2017-12-01

    Evidence that tree mortality is increasingly likely to occur in extensive die-off events across the terrestrial biosphere continues to mount. The consequences of such extensive mortality events are potentially profound, not only for the locations where die-off events occur, but also for other locations that could be impacted via ecoclimate teleconnections, whereby the land surface changes associated with die-off in one location could alter atmospheric circulation patterns and affect vegetation elsewhere. Here, we (1) recap the background of tree mortality as an emerging environmental issue, (2) highlight recent advances that could help us improve predictions of the vulnerability to tree mortality, including the underlying importance of hydraulic failure, the potential to develop climatic envelopes specific to tree mortality events, and consideration of the role of heat waves; and (3) present initial bounding simulations that indicate the potential for tree die-off events in different locations to alter ecoclimate teleconnections. As we move toward globally coordinated carbon accounting and management, the high vulnerability to tree die-off events and the potential for such events to affect vegetation elsewhere will both need to be accounted for.

  4. Efficient Processing of Data for Locating Lightning Strikes

    NASA Technical Reports Server (NTRS)

    Medelius, Pedro J.; Starr, Stan

    2003-01-01

    Two algorithms have been devised to increase the efficiency of processing of data in lightning detection and ranging (LDAR) systems so as to enable the accurate location of lightning strikes in real time. In LDAR, the location of a lightning strike is calculated by solving equations for the differences among the times of arrival (DTOAs) of the lightning signals at multiple antennas as functions of the locations of the antennas and the speed of light. The most difficult part of the problem is computing the DTOAs from digitized versions of the signals received by the various antennas. One way (a time-domain approach) to determine the DTOAs is to compute cross-correlations among variously differentially delayed replicas of the digitized signals and to select, as the DTOAs, those differential delays that yield the maximum correlations. Another way (a frequency-domain approach) to determine the DTOAs involves the computation of cross-correlations among Fourier transforms of variously differentially phased replicas of the digitized signals, along with utilization of the relationship among phase difference, time delay, and frequency.
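
    Once the DTOAs are in hand, the location step reduces to solving a small nonlinear system. The sketch below, with an illustrative 2-D antenna layout and noise-free synthetic DTOAs, recovers a source position by nonlinear least squares; it is a generic multilateration example under stated assumptions, not the LDAR implementation.

```python
# Generic DTOA multilateration sketch: solve for a source position given
# arrival-time differences at several antennas. The 2-D geometry and antenna
# coordinates are illustrative; a real LDAR system works in 3-D with more stations.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0                      # speed of light, m/s

def dtoa_residuals(xy, antennas, dtoas):
    # dtoas[i-1] = t_i - t_0, the arrival-time difference between antenna i
    # and the reference antenna 0.
    d = np.linalg.norm(antennas - xy, axis=1)
    predicted = (d[1:] - d[0]) / C
    return predicted - dtoas

def locate(antennas, dtoas, guess=(0.0, 0.0)):
    sol = least_squares(dtoa_residuals, guess, args=(antennas, dtoas))
    return sol.x

if __name__ == "__main__":
    antennas = np.array([[0., 0.], [10_000., 0.], [0., 10_000.], [10_000., 10_000.]])
    true_xy = np.array([3_000., 7_000.])
    d = np.linalg.norm(antennas - true_xy, axis=1)
    dtoas = (d[1:] - d[0]) / C                 # noise-free synthetic DTOAs
    print(locate(antennas, dtoas))             # ~ [3000, 7000]
```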

  5. Developing a disease outbreak event corpus.

    PubMed

    Conway, Mike; Kawazoe, Ai; Chanlekha, Hutchatai; Collier, Nigel

    2010-09-28

    In recent years, there has been a growth in work on the use of information extraction technologies for tracking disease outbreaks from online news texts, yet publicly available evaluation standards (and associated resources) for this new area of research have been noticeably lacking. This study seeks to create a "gold standard" data set against which to test how accurately disease outbreak information extraction systems can identify the semantics of disease outbreak events. Additionally, we hope that the provision of an annotation scheme (and associated corpus) to the community will encourage open evaluation in this new and growing application area. We developed an annotation scheme for identifying infectious disease outbreak events in news texts. An event--in the context of our annotation scheme--consists minimally of geographical (eg, country and province) and disease name information. However, the scheme also allows for the rich encoding of other domain salient concepts (eg, international travel, species, and food contamination). The work resulted in a 200-document corpus of event-annotated disease outbreak reports that can be used to evaluate the accuracy of event detection algorithms (in this case, for the BioCaster biosurveillance online news information extraction system). In the 200 documents, 394 distinct events were identified (mean 1.97 events per document, range 0-25 events per document). We also provide a download script and graphical user interface (GUI)-based event browsing software to facilitate corpus exploration. In summary, we present an annotation scheme and corpus that can be used in the evaluation of disease outbreak event extraction algorithms. The annotation scheme and corpus were designed both with the particular evaluation requirements of the BioCaster system in mind as well as the wider need for further evaluation resources in this growing research area.

  6. The 7.9 Denali Fault Earthquake: Aftershock Locations, Moment Tensors and Focal Mechanisms from the Regional Seismic Network Data

    NASA Astrophysics Data System (ADS)

    Ratchkovski, N. A.; Hansen, R. A.; Christensen, D.; Kore, K.

    2002-12-01

    The largest earthquake ever recorded on the Denali fault system (magnitude 7.9) struck central Alaska on November 3, 2002. It was preceded by a magnitude 6.7 foreshock on October 23. This earlier earthquake and its zone of aftershocks were located slightly to the west of the 7.9 quake. Aftershock locations and surface slip observations from the 7.9 quake indicate that the rupture was predominately unilateral in the eastward direction. Near Mentasta Lake, a village that experienced some of the worst damage in the quake, the surface rupture scar turns from the Denali fault to the adjacent Totschunda fault, which trends toward more southeasterly toward the Canadian border. Overall, the geologists found that measurable scarps indicate that the north side of the Denali fault moved to the east and vertically up relative to the south. Maximum offsets on the Denali fault were 8.8 meters at the Tok Highway cutoff, and were 2.2 meters on the Totschunda fault. The Alaska regional seismic network consists of over 250 station sites, operated by the Alaska Earthquake Information Center (AEIC), the Alaska Volcano Observatory (AVO), and the Pacific Tsunami Warning Center (PTWC). Over 25 sites are equipped with the broad-band sensors, some of which have in addition the strong motion sensors. The rest of the stations are either 1 or 3-component short-period instruments. The data from these stations are collected, processed and archived at the AEIC. The AEIC staff installed a temporary network with over 20 instruments following the 6.7 Nenana Mountain and the 7.9 events. Prior to the M 7.9 Denali Fault event, the automatic earthquake detection system at AEIC was locating between 15 and 30 events per day. After the event, the system had over 200-400 automatic locations per day for at least 10 days following the 7.9 event. The processing of the data is ongoing with the priority given to the larger events. The cumulative length of the 6.7 and 7.9 aftershock locations along the Denali

  7. Forecasting rain events - Meteorological models or collective intelligence?

    NASA Astrophysics Data System (ADS)

    Arazy, Ofer; Halfon, Noam; Malkinson, Dan

    2015-04-01

    Collective intelligence is shared (or group) intelligence that emerges from the collective efforts of many individuals. Collective intelligence is the aggregate of individual contributions: from simple collective decision making to more sophisticated aggregations such as in crowdsourcing and peer-production systems. In particular, collective intelligence could be used in making predictions about future events, for example by using prediction markets to forecast election results, stock prices, or the outcomes of sport events. To date, there is little research regarding the use of collective intelligence for weather forecasting. The objective of this study is to investigate the extent to which collective intelligence could be utilized to accurately predict weather events, and in particular rainfall. Our analyses employ metrics of group intelligence, as well as compare the accuracy of groups' predictions against the predictions of the standard model used by the National Meteorological Services. We report on preliminary results from a study conducted over the 2013-2014 and 2014-2015 winters. We have built a web site that allows people to make predictions on precipitation levels at certain locations. During each competition, participants were allowed to enter their precipitation forecasts (i.e. 'bets') at three locations, and these locations changed between competitions. A precipitation competition was defined as a 48-96 hour period (depending on the expected weather conditions), bets were open 24-48 hours prior to the competition, and during the betting period participants were allowed to change their bets without limitation. In order to explore the effect of transparency, betting mechanisms varied across the study's sites: full transparency (participants able to see each other's bets); partial transparency (participants see the group's average bet); and no transparency (no information about others' bets is made available). Several interesting findings emerged from

  8. The 16 August 1997 Novaya Zemlya seismic event as viewed from GSN stations KEV and KBS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartse, H.E.

    1997-11-01

    Using current and historic seismic records from Global Seismic Network stations KEV and KBS, the authors find that S minus P arrival time comparisons between nuclear explosions and the 16 August 1997 seismic event (mb ~ 3.6) from near Novaya Zemlya clearly indicate that (relative to KEV) the 16 August event occurred at least 80 km east of the Russian test site. Including S minus P arrival times from KBS constrains the location to beneath the Kara Sea and in good agreement with previously reported locations, over 100 km southeast of the test site. From an analysis of Pn/Sn waveform ratios at frequencies above 4 Hz, they find that the 16 August event falls within the population of regional earthquakes and is distinctly separated from Novaya Zemlya and other northern Eurasian nuclear explosion populations. Thus, given its location and waveform characteristics, they conclude the 16 August event was an earthquake. The 16 August event was not detected at teleseismic distances, and thus, this event provides a good example of the regional detection, location, and identification efforts that will be required to monitor the Comprehensive Test Ban Treaty below mb ~ 4.

  9. Microseismic Image-domain Velocity Inversion: Case Study From The Marcellus Shale

    NASA Astrophysics Data System (ADS)

    Shragge, J.; Witten, B.

    2017-12-01

    Seismic monitoring at injection wells relies on generating accurate location estimates of detected (micro-)seismicity. Event location estimates assist in optimizing well and stage spacings, assessing potential hazards, and establishing causation of larger events. The largest impediment to generating accurate location estimates is an accurate velocity model. For surface-based monitoring the model should capture 3D velocity variation, yet, rarely is the laterally heterogeneous nature of the velocity field captured. Another complication for surface monitoring is that the data often suffer from low signal-to-noise levels, making velocity updating with established techniques difficult due to uncertainties in the arrival picks. We use surface-monitored field data to demonstrate that a new method requiring no arrival picking can improve microseismic locations by jointly locating events and updating 3D P- and S-wave velocity models through image-domain adjoint-state tomography. This approach creates a complementary set of images for each chosen event through wave-equation propagation and correlating combinations of P- and S-wavefield energy. The method updates the velocity models to optimize the focal consistency of the images through adjoint-state inversions. We demonstrate the functionality of the method using a surface array of 192 three-component geophones over a hydraulic stimulation in the Marcellus Shale. Applying the proposed joint location and velocity-inversion approach significantly improves the estimated locations. To assess event location accuracy, we propose a new measure of inconsistency derived from the complementary images. By this measure the location inconsistency decreases by 75%. The method has implications for improving the reliability of microseismic interpretation with low signal-to-noise data, which may increase hydrocarbon extraction efficiency and improve risk assessment from injection related seismicity.

  10. Event Locations in Extra-Familial Child Sexual Molestation Cases: The Istanbul Example.

    PubMed

    Gönültaş, Burak M; Sahin, Bahadir

    2018-04-01

    A great deal of attention has been devoted to sexual molestation cases, both in theory and in practice. Child molesters are versatile and are not easily identified. Various theories and tactics, the most contemporary of which is environmental criminology, have been developed to find those criminals. Locations of victims, crime scenes, and distances among them as well as other situational variables are used to predict possible future offences in environmental criminology. This study applies the theory to sexual molestation crimes in Istanbul. Dependent distance variables are found to be correlated with several situational variables in a selected sample of 127 extra-familial child sexual molestation cases.

  11. A climatology of extreme wave height events impacting eastern Lake Ontario shorelines

    NASA Astrophysics Data System (ADS)

    Grieco, Matthew B.; DeGaetano, Arthur T.

    2018-05-01

    Model-derived wave height data for points along the eastern Lake Ontario shoreline provide the basis for a 36-year climatology of extreme wave heights. The most extreme wave heights exceed 6 m at all locations, except for those along the extreme northeastern shoreline of the Lake. Typically, extreme wave events are a regional phenomenon, affecting multiple locations along the eastern and southeastern shoreline. A pronounced seasonal cycle in wave event occurrence is characterized by peaks in autumn and spring, with an absence of 99.9th percentile wave heights during summer. Less extreme events (90th percentile heights) occur in all months, with a peak in winter. Extreme wave events are most often associated with a low pressure center tracking to the north of Lake Ontario from the Ohio Valley. This track produces the strong winds (greater than 10 m/s) and predominantly west-to-east wind fetch that characterize high wave height events. The seasonal frequency of the wave events exceeding the historical 95th percentile has shown a statistically significant increase at most locations since 1979. This has been partially offset by declines in the frequency of events with wave heights between the 90th and 95th percentiles. Seasonal extreme wave height frequency is also found to be related to the occurrence of El Niño. During El Niño winters, there are significantly fewer events with wave heights exceeding 2.5 m than would be expected by chance. A corresponding relationship to La Niña occurrence is not evident.

  12. State of the Universe of Astronomy on Tap Public Outreach Events

    NASA Astrophysics Data System (ADS)

    Rice, Emily; Constellation of Astronomy on Tap Host Stars

    2018-01-01

    Astronomy on Tap (AoT, http://astronomyontap.org) is a series of free public outreach events featuring engaging science presentations combined with music, games, and prizes in a fun, interactive atmosphere. AoT events feature one or more presentations given primarily by local professional scientists and graduate students, but also by visiting scientists, undergraduate students, educators, amateur astronomers, writers, artists, and other astronomy enthusiasts. Events are held at social venues like bars, coffee shops, and art galleries in order to bring science, the stories behind the research, and updates on the latest astronomy news directly to the public in a relaxed, informal atmosphere. Since the first New York City event in April 2013, nearly 400 AoT-affiliated events have been held in over 30 locations worldwide and the expansion is accelerating. The casual, social nature of AoT events provides important professional development opportunities in networking and in science communication, which we describe in a separate poster. The flexible format and content of a typical AoT event is easy to adapt and expand based on the priorities, resources, and interests of local organizers. We present the 2017 launches, including the first events in Europe and the first events conducted in French and Spanish, summarize the Universe of ongoing AoT events, and share recommendations for launching new satellite locations, also described in detail in our “Launch Manifesto” available upon request.

  13. Detection and location of small aftershocks using waveform cross correlation

    NASA Astrophysics Data System (ADS)

    Kitov, Ivan; Sanina, Irina; Sergeev, Sergey

    2017-04-01

    Aftershock sequences of earthquakes with magnitudes 5.0 and lower are difficult to detect and locate with sparse regional networks. Signals from aftershocks with magnitudes 2 to 3 are usually below the detection thresholds of standard 3-C seismic stations at near-regional distances. For seismic events close in space, the waveform cross correlation (WCC) method makes it possible to reduce the detection threshold by at least a unit of magnitude and to improve location precision to a few kilometers. Therefore, the WCC method is directly applicable to weak aftershock sequences. Here, we recover the seismic activity after the earthquake that occurred near the town of Mariupol (Ukraine) on August 7, 2016. The main shock was detected by many stations of the International Monitoring System (IMS), including the closest primary IMS array stations AKASG (6.62 deg.) and BRTR (7.81), as well as the 3-C station KBZ (5.00). The International Data Centre located this event (47.0013N, 37.5427E) and estimated its origin time (08:15:4.1 UTC), magnitude (mb=4.5), and depth (6.8 km). This event was also detected by two array stations of the Institute for Dynamics of Geospheres (IDG) of the Russian Academy of Sciences: the portable 3-C array RDON (3.28), which is the closest station, and MHVAR (7.96). Using signals from the main shock at five stations as waveform templates, we calculated continuous traces of the cross correlation coefficient (CC) from the 7th to the 11th of August. We found that the best templates should include all regional phases and thus have lengths from 80 s to 180 s. For detection, we used a standard STA/LTA method with a station-dependent threshold. The accuracy of onset time estimation by the STA/LTA detector based on CC traces is close to one sample, which varies from 0.05 s at BRTR to 0.005 s for RDON and MHVAR. Arrival times of all detected signals were reduced to origin times using the observed travel times from the main shock. Clusters of origin times are considered as event hypotheses in the
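
    A generic STA/LTA trigger of the kind applied to the cross-correlation traces described above can be sketched as follows; the window lengths, threshold, and synthetic CC trace are illustrative assumptions rather than the values used in the study.

```python
# Generic STA/LTA detector applied to a (synthetic) cross-correlation trace.
# Window lengths and the trigger threshold are illustrative assumptions.
import numpy as np

def sta_lta(x, fs, sta_win=1.0, lta_win=30.0):
    """Return the STA/LTA ratio of a characteristic function (here x**2)."""
    cf = x ** 2
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(cf, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(cf, np.ones(nlta) / nlta, mode="same")
    lta[lta <= 0] = np.finfo(float).tiny      # avoid division by zero
    return sta / lta

def trigger_times(ratio, fs, threshold=8.0):
    above = ratio > threshold
    onsets = np.flatnonzero(np.diff(above.astype(int)) == 1) + 1
    return onsets / fs                        # onset times in seconds

if __name__ == "__main__":
    fs = 40.0                                 # Hz
    rng = np.random.default_rng(1)
    cc = 0.05 * rng.standard_normal(int(600 * fs))   # 10 min of CC "noise"
    cc[12_000:12_040] += 0.8                         # synthetic correlation detection
    print(trigger_times(sta_lta(cc, fs), fs))        # ~ [300] s
```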

  14. Recurrence Interval and Event Age Data for Type A Faults

    USGS Publications Warehouse

    Dawson, Timothy E.; Weldon, Ray J.; Biasi, Glenn P.

    2008-01-01

    This appendix summarizes available recurrence interval, event age, and timing of most recent event data for Type A faults considered in the Earthquake Rate Model 2 (ERM 2) and used in the ERM 2 Appendix C analysis as well as Appendix N (time-dependent probabilities). These data have been compiled into an Excel workbook named Appendix B A-fault event ages_recurrence_V5.0 (herein referred to as the Appendix B workbook). For convenience, the Appendix B workbook is attached to the end of this document as a series of tables. The tables within the Appendix B workbook include site locations, event ages, and recurrence data, and in some cases, the interval of time between earthquakes is also reported. The Appendix B workbook is organized as individual worksheets, with each worksheet named by fault and paleoseismic site. Each worksheet contains the site location in latitude and longitude, as well as information on event ages, and a summary of recurrence data. Because the data has been compiled from different sources with different presentation styles, descriptions of the contents of each worksheet within the Appendix B spreadsheet are summarized.

  15. Learning from today's extreme weather events to increase our resilience to climate change

    NASA Astrophysics Data System (ADS)

    Ruin, I.; Lutoff, C.; Borga, M.; Creutin, J.-D.; Anquetin, S.; Gruntfest, E.; Scolobig, A.

    2009-04-01

    According to the IPCC, flooding is the most widespread serious potential impact of climate change on human settlement. Vulnerability to floods can be thought of as a function of exposure and adaptive capacity, and all three entities have been increasing in many areas. Therefore, in order to inform decision-makers, it is crucial to better understand not only what the vulnerability factors are, but also to what extent individuals and societies are capable of adapting their way of life to their changing environment. In this perspective, flash flood events offer a good example of the kind of extremes that our societies may have to face more often in the future. Characterized by their suddenness, fast and violent movement, rarity and small scale, they are particularly difficult to forecast accurately and leave very little lead time for warnings. In this context, our interdisciplinary team conducts research focusing on individual and organizational responses to warning and crisis situations by using a comprehensive, coupled natural-human system approach over time and space scales. The objective is to understand i) what cognitive and situational factors help individuals and communities shift from normal daily activities to an adapted crisis response and ii) what the dynamics of this process are compared with those of the natural phenomenon. In this regard, our research draws both on individual perception and behavioral intent surveys ("what if" type surveys) and on actual behavioral data gathered in the context of post-event investigations. The review of the literature shows that behavioral intent surveys do not predict warning and crisis response as accurately as behavioral data do. Knowing that, the difficulty is to obtain consistent and accurate spatio-temporal behavioral data. In our experience, this is particularly difficult in the context of crisis situations. Behavioral verification requires real-time observations and data collection of indicators

  16. Evaluation of Tsunami-HySEA for tsunami forecasting at selected locations in U.S.

    NASA Astrophysics Data System (ADS)

    Gonzalez Vida, J. M., Sr.; Ortega, S.; Castro, M. J.; de la Asuncion, M.; Arcas, D.

    2017-12-01

    The GPU-based Tsunami-HySEA model (Macias, J. et al., Pure and Applied Geophysics, 1-37, 2017; Lynett, P. et al., Ocean Modeling, 114, 2017) is used to test four tsunami events: the January 13, 2007 earthquake in the Kuril Islands (Mw 8.1), the September 29, 2009 earthquake in Samoa (Mw 8.3), the February 27, 2010 earthquake in Chile (Mw 8.8) and the March 11, 2011 earthquake in Tohoku (Mw 9.0). Initial conditions were provided by the NOAA Center for Tsunami Research (NCTR), obtained from DART inversion results. All simulations were performed using a global 4 arc-min grid of the Pacific Ocean and three nested-mesh levels around the selected locations. Wave amplitude time series were computed at selected tide gauges at each location, and maximum amplitudes were compared with both MOST model results and observations where available. In addition, inundation was also computed at selected U.S. locations for the 2011 Tohoku and 2009 Samoa events under the assumption of a steady mean high water level. Finally, computational time is also evaluated in order to study the operational capabilities of Tsunami-HySEA for these kinds of events. Acknowledgements: This work has been funded by contract WE133R16SE1418 between PMEL (NOAA) and the Universidad de Málaga (Spain).

  17. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    PubMed

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  18. Identifying elements of the plumbing system beneath Kilauea Volcano, Hawaii, from the source locations of very-long-period signals

    USGS Publications Warehouse

    Almendros, J.; Chouet, B.; Dawson, P.; Bond, T.

    2002-01-01

    We analyzed 16 seismic events recorded by the Hawaiian broad-band seismic network at Kilauea Volcano during the period September 9-26, 1999. Two distinct types of event are identified based on their spectral content, very-long-period (VLP) waveform, amplitude decay pattern and particle motion. We locate the VLP signals with a method based on analyses of semblance and particle motion. Different source regions are identified for the two event types. One source region is located at depths of ~1 km beneath the northeast edge of the Halemaumau pit crater. A second region is located at depths of ~8 km below the northwest quadrant of Kilauea caldera. Our study represents the first time that such deep sources have been identified in VLP data at Kilauea. This discovery opens the possibility of obtaining a detailed image of the location and geometry of the magma plumbing system beneath this volcano based on source locations and moment tensor inversions of VLP signals recorded by a permanent, large-aperture broad-band network.
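
    The semblance measure underlying such a location method can be sketched generically: traces are shifted by the travel times predicted for a candidate source and scored for coherence. The station delays, wavelet, and noise level below are illustrative assumptions, not values from the Kilauea analysis.

```python
# Generic semblance scoring of travel-time-corrected traces, of the kind used
# to grid-search candidate source locations. All numbers are illustrative.
import numpy as np

def semblance(aligned):
    """aligned: (n_stations, n_samples) array of travel-time-corrected traces."""
    num = np.sum(np.sum(aligned, axis=0) ** 2)        # energy of the stack
    den = aligned.shape[0] * np.sum(aligned ** 2)     # total trace energy
    return num / den                                  # 1 = perfectly coherent

def shift_and_score(traces, fs, delays):
    """Shift each trace back by its predicted delay (s) and return semblance."""
    shifted = [np.roll(tr, -int(round(d * fs))) for tr, d in zip(traces, delays)]
    return semblance(np.array(shifted))

if __name__ == "__main__":
    fs, n = 100.0, 2000
    t = np.arange(n) / fs
    wavelet = np.exp(-((t - 10.0) ** 2) / 2.0) * np.sin(2 * np.pi * 0.3 * t)
    true_delays = [0.0, 0.7, 1.3, 2.1]                # seconds, per station
    rng = np.random.default_rng(0)
    traces = [np.roll(wavelet, int(d * fs)) + 0.02 * rng.standard_normal(n)
              for d in true_delays]
    print(shift_and_score(traces, fs, true_delays))       # close to 1
    print(shift_and_score(traces, fs, [0.0, 0.0, 0.0, 0.0]))  # noticeably lower
```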

  19. Emergency Physicians as Good Samaritans: Survey of Frequency, Locations, Supplies and Medications

    PubMed Central

    Burkholder, Taylor W.; King, Renee A.

    2016-01-01

    Introduction Little is known about the frequency and locations in which emergency physicians (EPs) are bystanders to an accident or emergency; equally uncertain is which contents of an “emergency kit” may be useful during such events. The aim of this study was to describe the frequency and locations of Good Samaritan acts by EPs and also determine which emergency kit supplies and medications were most commonly used by Good Samaritans. Methods We conducted an electronic survey among a convenience sample of EPs in Colorado. Results Respondents reported a median frequency of 2.0 Good Samaritan acts per five years of practice, with the most common locations being sports and entertainment events (25%), road traffic accidents (21%), and wilderness settings (19%). Of those who had acted as Good Samaritans, 86% reported that at least one supply would have been useful during the most recent event, and 66% reported at least one medication would have been useful. The most useful supplies were gloves (54%), dressings (34%), and a stethoscope (20%), while the most useful medications were oxygen (19%), intravenous fluids (17%), and epinephrine (14%). Conclusion The majority of EPs can expect to provide Good Samaritan care during their careers and would be better prepared by carrying a kit with common supplies and medications where they are most likely to use them. PMID:26823924

  20. 78 FR 29023 - Safety Zones; Annual Firework Displays Within the Captain of the Port, Puget Sound Area of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... Responsibility AGENCY: Coast Guard, DHS. ACTION: Final rule. SUMMARY: The Coast Guard is adding three new fireworks events and correcting the location of five existing events outlined in 33 CFR 165.1332 to ensure... also adds three new firework display locations, and changes the title of the rule to accurately reflect...

  1. Automated vehicle location, data recording, friction measurement and applicator control for winter road maintenance.

    DOT National Transportation Integrated Search

    2010-02-01

    The first part of this project conducted a detailed evaluation of the ability of a new friction measurement system to provide an accurate measure of road conditions. A system that records friction coefficient as a function of road location was de...

  2. Trends in Attendance at Scoliosis Research Society Annual Meetings (SRS AM) and International Meeting on Advanced Spine Techniques (IMAST): Location, Location, Location.

    PubMed

    Chen, Foster; Cho, Woojin; Kim, Han Jo; Levine, David B

    2017-07-01

    Descriptive, retrospective. Although overall membership in Scoliosis Research Society (SRS) has grown over the years, we were curious to see the effects of changing event venue location and timing on conference attendance. Every year, the SRS hosts two major meetings: the Annual Meeting (SRS AM) in the autumn, and the International Meeting on Advanced Spine Techniques (IMAST) in the summer. Sites have alternated from within and outside North America. Often, these meetings have also overlapped with several holidays in certain countries. This was an observational study of attendance from past SRS AM and IMAST meetings. Fourteen years of AM and 8 years of IMAST data were made available from the SRS. Participation based on delegate type and countries was tallied. Details from the 10 most represented nations and host nations per year were also tallied, and their national holidays were reviewed for overlaps with the AM. Membership in AM and IMAST increased from 820 in 2003 to 1,323 in 2016. Attendance at the AM has increased, whereas attendance at IMAST has declined, even after adjusting for membership size. Trends in participation were highly influenced by location. Participation by attendees from the host continent, and especially the host country, is generally high. The negative impact of distant meetings is profoundly seen with North Americans, whereas the positive impact of a nearby meeting was most clearly demonstrated by South Americans. Although SRS AM overlapped with holidays in China, Japan, or Korea nearly 50% of the time, this did not influence participation by delegates from these countries. Participation in the AM is highly influenced by location. Although North Americans represented the largest constituency, their presence was not needed to drive total attendance and was not sufficient to turn around the downturn in IMAST attendance. Choice of location can encourage the participation of delegates from the host and neighboring nations; through strategic

  3. Modelling optimal location for pre-hospital helicopter emergency medical services.

    PubMed

    Schuurman, Nadine; Bell, Nathaniel J; L'Heureux, Randy; Hameed, Syed M

    2009-05-09

    Increasing the range and scope of early activation/auto launch helicopter emergency medical services (HEMS) may alleviate unnecessary injury mortality that disproportionately affects rural populations. To date, attempts to develop a quantitative framework for the optimal location of HEMS facilities have been absent. Our analysis used five years of critical care data from tertiary health care facilities, spatial data on origin of transport and accurate road travel time catchments for tertiary centres. A location optimization model was developed to identify where the expansion of HEMS would cover the greatest population among those currently underserved. The protocol was developed using geographic information systems (GIS) to measure populations, distances and accessibility to services. Our model determined Royal Inland Hospital (RIH) was the optimal site for an expanded HEMS - based on denominator population, distance to services and historical usage patterns. GIS based protocols for location of emergency medical resources can provide supportive evidence for allocation decisions - especially when resources are limited. In this study, we were able to demonstrate conclusively that a logical choice exists for location of additional HEMS. This protocol could be extended to location analysis for other emergency and health services.
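
    A minimal illustration of the kind of coverage-maximization logic described above: a greedy maximal-coverage selection over hypothetical candidate HEMS bases, population centroids, and a travel-range threshold. This is only a sketch under assumed inputs, not the authors' GIS-based protocol.

```python
# Minimal greedy maximal-coverage sketch (not the authors' GIS protocol).
# Candidate bases, population centroids, and weights below are hypothetical inputs.
import math

def coverage_sets(sites, pop_points, max_km):
    """For each candidate site, return the indices of population points within range."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return [{i for i, p in enumerate(pop_points) if dist(s, p) <= max_km} for s in sites]

def greedy_select(sites, pop_points, weights, max_km, n_new_bases=1):
    """Pick bases that add the most (weighted) newly covered population."""
    covered, chosen = set(), []
    sets_ = coverage_sets(sites, pop_points, max_km)
    for _ in range(n_new_bases):
        gains = [sum(weights[i] for i in s - covered) for s in sets_]
        best = max(range(len(sites)), key=lambda k: gains[k])
        chosen.append(best)
        covered |= sets_[best]
    return chosen, sum(weights[i] for i in covered)

# Toy example: coordinates in km, weights = population counts.
sites = [(0, 0), (50, 10), (120, 40)]
pop = [(5, 5), (45, 12), (48, 8), (118, 42), (130, 35)]
w = [1000, 400, 350, 900, 700]
print(greedy_select(sites, pop, w, max_km=25, n_new_bases=1))
```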

  4. Location, Location, Location: Where Do Location-Based Services Fit into Your Institution's Social Media Mix?

    ERIC Educational Resources Information Center

    Nekritz, Tim

    2011-01-01

    Foursquare is a location-based social networking service that allows users to share their location with friends. Some college administrators have been thinking about whether and how to take the leap into location-based services, which are also known as geosocial networking services. These platforms, which often incorporate gaming elements like…

  5. Long Period (LP) volcanic earthquake source location at Merapi volcano by using dense array technics

    NASA Astrophysics Data System (ADS)

    Metaxian, Jean Philippe; Budi Santoso, Agus; Laurin, Antoine; Subandriyo, Subandriyo; Widyoyudo, Wiku; Arshab, Ghofar

    2015-04-01

    Since 2010, Merapi has shown unusual activity compared to previous decades. Powerful phreatic explosions are observed; some of them are preceded by LP signals. In the literature, LP seismicity is thought to originate within the fluid, and therefore to be representative of the pressurization state of the volcano plumbing system. Another model suggests that LP events are caused by slow, quasi-brittle, low stress-drop failure driven by transient upper-edifice deformations. Knowledge of the spatial distribution of LP events is fundamental for better understanding the physical processes occurring in the conduit, as well as for the monitoring and the improvement of eruption forecasting. LP events recorded at Merapi have a spectral content dominated by frequencies between 0.8 and 3 Hz. To locate the source of these events, we installed a seismic antenna composed of 4 broadband CMG-6TD Güralp stations. This network has an aperture of 300 m. It is located on the site of Pasarbubar, between 500 and 800 m from the crater rim. Two multi-parameter stations (seismic, tiltmeter, S-P) located in the same area, equipped with broadband CMG-40T Güralp sensors, may also be used to complement the antenna data. The source of LP events is located by using different approaches. In the first one, we used a method based on the measurement of the time delays between the early beginnings of LP events for each array receiver. The observed differences of time delays obtained for each pair of receivers are compared to theoretical values calculated from the travel times computed between grid nodes, which are positioned in the structure, and each receiver. In a second approach, we estimate the slowness vector by using the MUSIC algorithm applied to 3-component data. From the slowness vector, we deduce the back-azimuth and the incident angle, which give an estimation of LP source depth in the conduit. This work is part of the Domerapi project funded by French Agence Nationale de la Recherche (https
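
    As a rough illustration of the first approach (matching observed inter-station delays against predicted ones), the sketch below grid-searches a plane-wave slowness vector for a small array. Station coordinates and the test signal are synthetic, and a plane-wave model stands in for the study's travel-time grid and MUSIC processing.

```python
# Illustrative plane-wave slowness grid search for a small-aperture array
# (synthetic geometry; the study uses travel-time grids and MUSIC on 3-component data).
import numpy as np

# Hypothetical station coordinates (metres east/north) for a ~300 m aperture array.
stations = np.array([[0.0, 0.0], [250.0, 50.0], [120.0, 280.0], [-60.0, 180.0]])

def predicted_delays(sx, sy, stations):
    """Arrival-time delays (s), relative to station 0, for a plane wave with horizontal slowness (sx, sy)."""
    t = stations @ np.array([sx, sy])      # t_i = s . r_i
    return t - t[0]

def grid_search(observed_delays, stations, s_max=2e-3, n=201):
    """Return the slowness vector whose predicted delays best fit the observations (L2 misfit)."""
    s_vals = np.linspace(-s_max, s_max, n)
    best, best_misfit = None, np.inf
    for sx in s_vals:
        for sy in s_vals:
            misfit = np.sum((predicted_delays(sx, sy, stations) - observed_delays) ** 2)
            if misfit < best_misfit:
                best, best_misfit = (sx, sy), misfit
    return best

# Synthetic test: wave propagating toward azimuth 120 deg with apparent velocity 1500 m/s.
true_s = np.array([np.sin(np.radians(120.0)), np.cos(np.radians(120.0))]) / 1500.0
sx, sy = grid_search(predicted_delays(*true_s, stations), stations)
az = np.degrees(np.arctan2(sx, sy)) % 360          # recovered propagation azimuth
print("propagation azimuth:", round(az, 1), "deg; back-azimuth:", round((az + 180) % 360, 1), "deg")
```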

  6. ACCURATE CHEMICAL MASTER EQUATION SOLUTION USING MULTI-FINITE BUFFERS

    PubMed Central

    Cao, Youfang; Terebus, Anna; Liang, Jie

    2016-01-01

    The discrete chemical master equation (dCME) provides a fundamental framework for studying stochasticity in mesoscopic networks. Because of the multi-scale nature of many networks where reaction rates have large disparity, directly solving dCMEs is intractable due to the exploding size of the state space. It is important to truncate the state space effectively with quantified errors, so accurate solutions can be computed. It is also important to know if all major probabilistic peaks have been computed. Here we introduce the Accurate CME (ACME) algorithm for obtaining direct solutions to dCMEs. With multi-finite buffers for reducing the state space by O(n!), exact steady-state and time-evolving network probability landscapes can be computed. We further describe a theoretical framework of aggregating microstates into a smaller number of macrostates by decomposing a network into independent aggregated birth and death processes, and give an a priori method for rapidly determining steady-state truncation errors. The maximal sizes of the finite buffers for a given error tolerance can also be pre-computed without costly trial solutions of dCMEs. We show exactly computed probability landscapes of three multi-scale networks, namely, a 6-node toggle switch, 11-node phage-lambda epigenetic circuit, and 16-node MAPK cascade network, the latter two with no known solutions. We also show how probabilities of rare events can be computed from first-passage times, another class of unsolved problems challenging for simulation-based techniques due to large separations in time scales. Overall, the ACME method enables accurate and efficient solutions of the dCME for a large class of networks. PMID:27761104
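
    To make the idea of solving a truncated dCME concrete, the toy sketch below computes the exact steady state of a single birth-death process on a finite buffer of N states by solving the master-equation rate matrix with a normalization constraint. It is only a one-species illustration of state-space truncation, not the multi-buffer ACME algorithm; rate values and the buffer size are assumptions.

```python
# Toy truncated chemical master equation: steady state of a birth-death process
# (constant synthesis rate k, per-copy degradation rate g) on a finite buffer of N states.
# This illustrates state-space truncation only; it is not the ACME multi-buffer method.
import math
import numpy as np

def birth_death_steady_state(k, g, N):
    """Steady state of dP/dt = A P on copy numbers n = 0..N."""
    A = np.zeros((N + 1, N + 1))
    for n in range(N + 1):            # column n = source state
        if n < N:
            A[n + 1, n] += k          # birth n -> n+1
            A[n, n] -= k
        if n > 0:
            A[n - 1, n] += g * n      # death n -> n-1
            A[n, n] -= g * n
    # Solve A p = 0 subject to sum(p) = 1 via an augmented least-squares system.
    M = np.vstack([A, np.ones(N + 1)])
    b = np.zeros(N + 2)
    b[-1] = 1.0
    p, *_ = np.linalg.lstsq(M, b, rcond=None)
    return p

p = birth_death_steady_state(k=5.0, g=1.0, N=40)
print("P(n=5) from truncated CME:", round(p[5], 4),
      "; analytic Poisson(5) value:", round(math.exp(-5) * 5**5 / math.factorial(5), 4))
```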

  7. Accurate chemical master equation solution using multi-finite buffers

    DOE PAGES

    Cao, Youfang; Terebus, Anna; Liang, Jie

    2016-06-29

    Here, the discrete chemical master equation (dCME) provides a fundamental framework for studying stochasticity in mesoscopic networks. Because of the multiscale nature of many networks where reaction rates have a large disparity, directly solving dCMEs is intractable due to the exploding size of the state space. It is important to truncate the state space effectively with quantified errors, so accurate solutions can be computed. It is also important to know if all major probabilistic peaks have been computed. Here we introduce the accurate CME (ACME) algorithm for obtaining direct solutions to dCMEs. With multifinite buffers for reducing the state space by $O(n!)$, exact steady-state and time-evolving network probability landscapes can be computed. We further describe a theoretical framework of aggregating microstates into a smaller number of macrostates by decomposing a network into independent aggregated birth and death processes and give an a priori method for rapidly determining steady-state truncation errors. The maximal sizes of the finite buffers for a given error tolerance can also be precomputed without costly trial solutions of dCMEs. We show exactly computed probability landscapes of three multiscale networks, namely, a 6-node toggle switch, 11-node phage-lambda epigenetic circuit, and 16-node MAPK cascade network, the latter two with no known solutions. We also show how probabilities of rare events can be computed from first-passage times, another class of unsolved problems challenging for simulation-based techniques due to large separations in time scales. Overall, the ACME method enables accurate and efficient solutions of the dCME for a large class of networks.

  8. Accurate chemical master equation solution using multi-finite buffers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Youfang; Terebus, Anna; Liang, Jie

    Here, the discrete chemical master equation (dCME) provides a fundamental framework for studying stochasticity in mesoscopic networks. Because of the multiscale nature of many networks where reaction rates have a large disparity, directly solving dCMEs is intractable due to the exploding size of the state space. It is important to truncate the state space effectively with quantified errors, so accurate solutions can be computed. It is also important to know if all major probabilistic peaks have been computed. Here we introduce the accurate CME (ACME) algorithm for obtaining direct solutions to dCMEs. With multifinite buffers for reducing the state space by $O(n!)$, exact steady-state and time-evolving network probability landscapes can be computed. We further describe a theoretical framework of aggregating microstates into a smaller number of macrostates by decomposing a network into independent aggregated birth and death processes and give an a priori method for rapidly determining steady-state truncation errors. The maximal sizes of the finite buffers for a given error tolerance can also be precomputed without costly trial solutions of dCMEs. We show exactly computed probability landscapes of three multiscale networks, namely, a 6-node toggle switch, 11-node phage-lambda epigenetic circuit, and 16-node MAPK cascade network, the latter two with no known solutions. We also show how probabilities of rare events can be computed from first-passage times, another class of unsolved problems challenging for simulation-based techniques due to large separations in time scales. Overall, the ACME method enables accurate and efficient solutions of the dCME for a large class of networks.

  9. The Subread aligner: fast, accurate and scalable read mapping by seed-and-vote

    PubMed Central

    Liao, Yang; Smyth, Gordon K.; Shi, Wei

    2013-01-01

    Read alignment is an ongoing challenge for the analysis of data from sequencing technologies. This article proposes an elegantly simple multi-seed strategy, called seed-and-vote, for mapping reads to a reference genome. The new strategy chooses the mapped genomic location for the read directly from the seeds. It uses a relatively large number of short seeds (called subreads) extracted from each read and allows all the seeds to vote on the optimal location. When the read length is <160 bp, overlapping subreads are used. More conventional alignment algorithms are then used to fill in detailed mismatch and indel information between the subreads that make up the winning voting block. The strategy is fast because the overall genomic location has already been chosen before the detailed alignment is done. It is sensitive because no individual subread is required to map exactly, nor are individual subreads constrained to map close by other subreads. It is accurate because the final location must be supported by several different subreads. The strategy extends easily to find exon junctions, by locating reads that contain sets of subreads mapping to different exons of the same gene. It scales up efficiently for longer reads. PMID:23558742
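
    The voting step can be pictured with a toy exact-match version: subreads are looked up in a hash index of the reference and each hit votes for an implied read start position. The reference, read, and parameters below are made up, and the local alignment that Subread performs to fill in mismatches and indels is omitted.

```python
# Toy seed-and-vote sketch: evenly spaced subreads vote on the read's genomic offset.
# Synthetic reference/read; the detailed alignment step of the real aligner is omitted.
import random
from collections import Counter, defaultdict

def build_index(reference, k=16):
    index = defaultdict(list)
    for pos in range(len(reference) - k + 1):
        index[reference[pos:pos + k]].append(pos)
    return index

def seed_and_vote(read, index, k=16, n_subreads=10):
    """Extract evenly spaced subreads; exact hits vote on the implied read start position."""
    step = max(1, (len(read) - k) // max(1, n_subreads - 1))
    votes = Counter()
    for s in range(0, len(read) - k + 1, step):
        for pos in index.get(read[s:s + k], []):
            votes[pos - s] += 1        # candidate start of the whole read
    return votes.most_common(1)[0] if votes else None

random.seed(0)
reference = "".join(random.choice("ACGT") for _ in range(300))
read = reference[120:220]              # a perfectly matching 100 bp read
print(seed_and_vote(read, build_index(reference)))   # -> (120, number_of_supporting_subreads)
```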

  10. Event models and the fan effect.

    PubMed

    Radvansky, G A; O'Rear, Andrea E; Fisher, Jerry S

    2017-08-01

    The current study explored the persistence of event model organizations and how this influences the experience of interference during retrieval. People in this study memorized lists of sentences about objects in locations, such as "The potted palm is in the hotel." Previous work has shown that such information can either be stored in separate event models, thereby producing retrieval interference, or integrated into common event models, thereby eliminating retrieval interference. Unlike prior studies, the current work explored the impact of forgetting up to 2 weeks later on this pattern of performance. We explored three possible outcomes across the various retention intervals. First, consistent with research showing that longer delays reduce proactive and retroactive interference, any retrieval interference effects of competing event models could be reduced over time. Second, the binding of information into event models may weaken over time, causing interference effects to emerge when they had previously been absent. Third, and finally, the organization of information into event models could remain stable over long periods of time. The results reported here are most consistent with the last outcome. While there were some minor variations across the various retention intervals, the basic pattern of event model organization remained preserved over the two-week retention period.

  11. NASA EM Followup of LIGO-Virgo Candidate Events

    NASA Technical Reports Server (NTRS)

    Blackburn, Lindy L.

    2011-01-01

    We present a strategy for a follow-up of LIGO-Virgo candidate events using offline survey data from several NASA high-energy photon instruments aboard RXTE, Swift, and Fermi. Time and sky-location information provided by the GW trigger allows for a targeted search for prompt and afterglow EM signals. In doing so, we expect to be sensitive to signals which are too weak to be publicly reported as astrophysical EM events.

  12. Analyzing Electric Field Morphology Through Data-Model Comparisons of the GEM IM/S Assessment Challenge Events

    NASA Technical Reports Server (NTRS)

    Liemohn, Michael W.; Ridley, Aaron J.; Kozyra, Janet U.; Gallagher, Dennis L.; Thomsen, Michelle F.; Henderson, Michael G.; Denton, Michael H.; Brandt, Pontus C.; Goldstein, Jerry

    2006-01-01

    The storm-time inner magnetospheric electric field morphology and dynamics are assessed by comparing numerical modeling results of the plasmasphere and ring current with many in situ and remote sensing data sets. Two magnetic storms are analyzed, April 22, 2001 and October 21-23, 2001, which are the events selected for the Geospace Environment Modeling (GEM) Inner Magnetosphere/Storms (IM/S) Assessment Challenge (IMSAC). The IMSAC seeks to quantify the accuracy of inner magnetospheric models as well as synthesize our understanding of this region. For each storm, the ring current-atmosphere interaction model (RAM) and the dynamic global core plasma model (DGCPM) were run together with various settings for the large-scale convection electric field and the nightside ionospheric conductance. DGCPM plasmaspheric parameters were compared with IMAGE-EUV plasmapause extractions and LANL-MPA plume locations and velocities. RAM parameters were compared with Dst*, LANL-MPA fluxes and moments, IMAGE-MENA images, and IMAGE-HENA images. Both qualitative and quantitative comparisons were made to determine the electric field morphology that allows the model results to best fit the plasma data at various times during these events. The simulations with self-consistent electric fields were, in general, better than those with prescribed field choices. This indicates that the time-dependent modulation of the inner magnetospheric electric fields by the nightside ionosphere is quite significant for accurate determination of these fields (and their effects). It was determined that a shielded Volland-Stern field description driven by the 3-hour Kp index yields accurate results much of the time, but can be quite inconsistent. The modified McIlwain field description clearly lagged in overall accuracy compared to the other fields, but matched some data sets (like Dst*) quite well. The rankings between the simulations varied depending on the storm and the individual data sets, indicating that

  13. Online location of a break in water distribution systems

    NASA Astrophysics Data System (ADS)

    Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei

    2003-08-01

    Breaks often occur in urban water distribution systems under severely cold weather, or due to corrosion of pipes, deformation of ground, etc., and the breaks cannot easily be located, especially immediately after the events. This paper develops a methodology to locate a break in a water distribution system by monitoring water pressure online at some nodes in the water distribution system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can well be used. A neural network-based inverse analysis method is constructed for locating the break based on the variation of water pressure. The neural network is trained by using analytically simulated data from the water distribution system, and validated by using a set of data that have never been used in the training. It is found that the methodology provides a quick, effective, and practical way in which a break in a water distribution system can be located.
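
    A hedged sketch of the inverse-analysis idea: train a small neural network to map monitored pressure variations to the identity of the broken node. Here random synthetic "signatures" stand in for the analytically simulated hydraulic data, and all sizes and names are illustrative rather than taken from the paper.

```python
# Hedged sketch: classify the break node from nodal pressure variations with an MLP,
# trained on simulated data (random synthetic patterns stand in for a hydraulic model).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_nodes, n_monitored, n_samples = 12, 5, 600

# Hypothetical "signature" of each break location: expected pressure drop at monitored nodes.
signatures = rng.uniform(0.2, 1.0, size=(n_nodes, n_monitored))

labels = rng.integers(0, n_nodes, size=n_samples)                              # which node broke
X = signatures[labels] + rng.normal(0, 0.05, size=(n_samples, n_monitored))    # noisy observations

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X[:500], labels[:500])
print("held-out accuracy:", clf.score(X[500:], labels[500:]))
```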

  14. Supervised machine learning on a network scale: application to seismic event classification and detection

    NASA Astrophysics Data System (ADS)

    Reynen, Andrew; Audet, Pascal

    2017-09-01

    A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States are used to automatically detect 25 times more events than the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with low false alarm, allowing for a larger automatic event catalogue with a high degree of trust.
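
    A minimal sketch of the classification step, assuming two generalized waveform attributes (a polarization measure and a dominant frequency) per event. The attribute distributions below are invented for illustration only; the study extracts the attributes around predicted P and S arrivals across the whole network.

```python
# Hedged sketch: separate blasts from earthquakes with a classifier on two generalized
# waveform attributes (polarization, dominant frequency). Feature values are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400
# Assumed, illustrative attribute distributions for the two event classes.
quakes = np.column_stack([rng.normal(0.4, 0.15, n), rng.normal(4.0, 1.5, n)])
blasts = np.column_stack([rng.normal(0.7, 0.15, n), rng.normal(8.0, 2.0, n)])
X = np.vstack([quakes, blasts])
y = np.concatenate([np.zeros(n), np.ones(n)])

clf = LogisticRegression()
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```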

  15. The effects of vent location, event scale and time forecasts on pyroclastic density current hazard maps at Campi Flegrei caldera (Italy)

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Bisson, Marina; Esposti Ongaro, Tomaso; Flandoli, Franco; Isaia, Roberto; Rosi, Mauro; Vitale, Stefano

    2017-09-01

    This study presents a new method for producing long-term hazard maps for pyroclastic density currents (PDC) originating at Campi Flegrei caldera. This method is based on a doubly stochastic approach and is able to combine the uncertainty assessments on the spatial location of the volcanic vent, the size of the flow and the expected time of such an event. The results are obtained by using a Monte Carlo approach and adopting a simplified invasion model based on the box model integral approximation. Temporal assessments are modelled through a Cox-type process including self-excitement effects, based on the eruptive record of the last 15 kyr. Mean and percentile maps of PDC invasion probability are produced, exploring their sensitivity to some sources of uncertainty and to the effects of the dependence between PDC scales and the caldera sector where they originated. Conditional maps representative of PDC originating inside limited zones of the caldera, or of PDC with a limited range of scales are also produced. Finally, the effect of assuming different time windows for the hazard estimates is explored, also including the potential occurrence of a sequence of multiple events. Assuming that the last eruption of Monte Nuovo (A.D. 1538) marked the beginning of a new epoch of activity similar to the previous ones, results of the statistical analysis indicate a mean probability of PDC invasion above 5% in the next 50 years on almost the entire caldera (with a probability peak of 25% in the central part of the caldera). In contrast, probability values reduce by a factor of about 3 if the entire eruptive record is considered over the last 15 kyr, i.e. including both eruptive epochs and quiescent periods.
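
    A toy Monte Carlo of the mapping idea: sample a vent location and a flow runout, mark invaded cells, and average over simulations to obtain an invasion-probability grid. The circular footprint, Gaussian vent scatter, and lognormal runout below are crude stand-ins for the study's vent-opening probability maps and box-model invasion simulations.

```python
# Toy Monte Carlo PDC invasion-probability map: sample a vent location and a runout,
# mark invaded grid cells, and average over samples. The circular footprint stands in
# for the box-model integral approximation used in the study.
import numpy as np

rng = np.random.default_rng(42)
nx = ny = 100                          # 100 x 100 grid over a 20 km x 20 km domain
x = np.linspace(-10, 10, nx)           # km, caldera-centred coordinates (hypothetical)
X, Y = np.meshgrid(x, x)

n_sim = 5000
prob = np.zeros((ny, nx))
for _ in range(n_sim):
    vent = rng.normal(0.0, 2.0, size=2)                   # vent location scatter (km), assumed
    runout = rng.lognormal(mean=np.log(3.0), sigma=0.5)   # flow runout (km), assumed
    prob += (np.hypot(X - vent[0], Y - vent[1]) <= runout)

prob /= n_sim
print("peak invasion probability:", prob.max())
```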

  16. Accurate interlaminar stress recovery from finite element analysis

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Riggs, H. Ronald

    1994-01-01

    The accuracy and robustness of a two-dimensional smoothing methodology is examined for the problem of recovering accurate interlaminar shear stress distributions in laminated composite and sandwich plates. The smoothing methodology is based on a variational formulation which combines discrete least-squares and penalty-constraint functionals in a single variational form. The smoothing analysis utilizes optimal strains computed at discrete locations in a finite element analysis. These discrete strain data are smoothed with a smoothing element discretization, producing superior accuracy strains and their first gradients. The approach enables the resulting smooth strain field to be practically C1-continuous throughout the domain of smoothing, exhibiting superconvergent properties of the smoothed quantity. The continuous strain gradients are also obtained directly from the solution. The recovered strain gradients are subsequently employed in the integration of equilibrium equations to obtain accurate interlaminar shear stresses. The problem is a simply-supported rectangular plate under a doubly sinusoidal load. The problem has an exact analytic solution which serves as a measure of goodness of the recovered interlaminar shear stresses. The method has the versatility of being applicable to the analysis of rather general and complex structures built of distinct components and materials, such as found in aircraft design. For these types of structures, the smoothing is achieved with 'patches', each patch covering the domain in which the smoothed quantity is physically continuous.

  17. Aftershocks and triggered events of the Great 1906 California earthquake

    USGS Publications Warehouse

    Meltzner, A.J.; Wald, D.J.

    2003-01-01

    The San Andreas fault is the longest fault in California and one of the longest strike-slip faults in the world, yet little is known about the aftershocks following the most recent great event on the San Andreas, the Mw 7.8 San Francisco earthquake on 18 April 1906. We conducted a study to locate and to estimate magnitudes for the largest aftershocks and triggered events of this earthquake. We examined existing catalogs and historical documents for the period April 1906 to December 1907, compiling data on the first 20 months of the aftershock sequence. We grouped felt reports temporally and assigned modified Mercalli intensities for the larger events based on the descriptions judged to be the most reliable. For onshore and near-shore events, a grid-search algorithm (derived from empirical analysis of modern earthquakes) was used to find the epicentral location and magnitude most consistent with the assigned intensities. For one event identified as far offshore, the event's intensity distribution was compared with those of modern events, in order to constrain the event's location and magnitude. The largest aftershock within the study period, an M ∼6.7 event, occurred ∼100 km west of Eureka on 23 April 1906. Although not within our study period, another M ∼6.7 aftershock occurred near Cape Mendocino on 28 October 1909. Other significant aftershocks included an M ∼5.6 event near San Juan Bautista on 17 May 1906 and an M ∼6.3 event near Shelter Cove on 11 August 1907. An M ∼4.9 aftershock occurred on the creeping segment of the San Andreas fault (southeast of the mainshock rupture) on 6 July 1906. The 1906 San Francisco earthquake also triggered events in southern California (including separate events in or near the Imperial Valley, the Pomona Valley, and Santa Monica Bay), in western Nevada, in southern central Oregon, and in western Arizona, all within 2 days of the mainshock. Of these triggered events, the largest were an M ∼6.1 earthquake near Brawley

  18. An event database for rotational seismology

    NASA Astrophysics Data System (ADS)

    Salvermoser, Johannes; Hadziioannou, Celine; Hable, Sarah; Chow, Bryant; Krischer, Lion; Wassermann, Joachim; Igel, Heiner

    2016-04-01

    The ring laser sensor (G-ring) located at Wettzell, Germany, has routinely observed earthquake-induced rotational ground motions around a vertical axis since its installation in 2003. Here we present results from a recently installed event database, which is the first to provide ring laser event data in an open-access format. Based on the GCMT event catalogue and some search criteria, seismograms from the ring laser and the collocated broadband seismometer are extracted and processed. The ObsPy-based processing scheme generates plots showing waveform fits between rotation rate and transverse acceleration and extracts characteristic wavefield parameters such as peak ground motions, noise levels, Love wave phase velocities and waveform coherence. For each event, these parameters are stored in a text file (json dictionary) which is easily readable and accessible on the website. The database contains >10000 events starting in 2007 (Mw>4.5). It is updated daily and therefore provides recent events at a time lag of max. 24 hours. The user interface allows users to filter events by epoch, magnitude, and source area, whereupon the events are displayed on a zoomable world map. We investigate how well the rotational motions are compatible with the expectations from the surface wave magnitude scale. In addition, the website offers some python source code examples for downloading and processing the openly accessible waveforms.

  19. Location Sensitive Deep Convolutional Neural Networks for Segmentation of White Matter Hyperintensities.

    PubMed

    Ghafoorian, Mohsen; Karssemeijer, Nico; Heskes, Tom; van Uden, Inge W M; Sanchez, Clara I; Litjens, Geert; de Leeuw, Frank-Erik; van Ginneken, Bram; Marchiori, Elena; Platel, Bram

    2017-07-11

    The anatomical location of imaging features is of crucial importance for accurate diagnosis in many medical tasks. Convolutional neural networks (CNN) have had huge successes in computer vision, but they lack the natural ability to incorporate the anatomical location in their decision making process, hindering success in some medical image analysis tasks. In this paper, to integrate the anatomical location information into the network, we propose several deep CNN architectures that consider multi-scale patches or take explicit location features while training. We apply and compare the proposed architectures for segmentation of white matter hyperintensities in brain MR images on a large dataset. As a result, we observe that the CNNs that incorporate location information substantially outperform a conventional segmentation method with handcrafted features as well as CNNs that do not integrate location information. On a test set of 50 scans, the best configuration of our networks obtained a Dice score of 0.792, compared to 0.805 for an independent human observer. Performance levels of the machine and the independent human observer were not statistically significantly different (p-value = 0.06).
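
    One of the strategies described, feeding explicit location features alongside patch features, can be sketched as a small PyTorch module in which convolutional patch features are concatenated with normalized (x, y, z) coordinates before the classification head. Layer sizes and names are illustrative assumptions, not the architectures evaluated in the paper.

```python
# Minimal PyTorch sketch of a location-aware patch classifier: convolutional features from
# an image patch are concatenated with explicit spatial coordinates before the final layers.
import torch
import torch.nn as nn

class LocationAwareCNN(nn.Module):
    def __init__(self, n_loc_features=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Sequential(
            nn.Linear(32 * 4 * 4 + n_loc_features, 64), nn.ReLU(),
            nn.Linear(64, 2),                      # lesion vs. background
        )

    def forward(self, patch, loc):
        z = self.features(patch).flatten(1)        # (batch, 512) patch features
        return self.classifier(torch.cat([z, loc], dim=1))

model = LocationAwareCNN()
patches = torch.randn(8, 1, 32, 32)                # 8 hypothetical 32x32 MR patches
coords = torch.rand(8, 3)                          # normalised (x, y, z) patch locations
print(model(patches, coords).shape)                # -> torch.Size([8, 2])
```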

  20. A Double-difference Earthquake location algorithm: Method and application to the Northern Hayward Fault, California

    USGS Publications Warehouse

    Waldhauser, F.; Ellsworth, W.L.

    2000-01-01

    We have developed an efficient method to determine high-resolution hypocenter locations over large distances. The location method incorporates ordinary absolute travel-time measurements and/or cross-correlation P- and S-wave differential travel-time measurements. Residuals between observed and theoretical travel-time differences (or double-differences) are minimized for pairs of earthquakes at each station while linking together all observed event-station pairs. A least-squares solution is found by iteratively adjusting the vector difference between hypocentral pairs. The double-difference algorithm minimizes errors due to unmodeled velocity structure without the use of station corrections. Because catalog and cross-correlation data are combined into one system of equations, interevent distances within multiplets are determined to the accuracy of the cross-correlation data, while the relative locations between multiplets and uncorrelated events are simultaneously determined to the accuracy of the absolute travel-time data. Statistical resampling methods are used to estimate data accuracy and location errors. Uncertainties in double-difference locations are improved by more than an order of magnitude compared to catalog locations. The algorithm is tested, and its performance is demonstrated on two clusters of earthquakes located on the northern Hayward fault, California. There it collapses the diffuse catalog locations into sharp images of seismicity and reveals horizontal lineations of hypocenters that define the narrow regions on the fault where stress is released by brittle failure.
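
    A hedged single-pair illustration of the double-difference idea: in a homogeneous medium, iteratively adjust the second event's position so that predicted differential travel times match the observed ones in a least-squares sense. The constant velocity, station geometry, and event coordinates are invented; the full algorithm links many event pairs, combines catalog and cross-correlation data, and solves a damped system.

```python
# Hedged sketch of the double-difference idea for one event pair in a homogeneous medium:
# adjust the relative location of event 2 w.r.t. event 1 so that predicted differential
# travel times match observations (the full algorithm links many pairs across a network).
import numpy as np

V = 5.0  # assumed constant P velocity, km/s

def travel_time(src, sta):
    return np.linalg.norm(sta - src) / V

def relocate_pair(stations, dt_obs, x1, x2_init, n_iter=10):
    """Least-squares update of event 2's position given double differences dt_obs = t2 - t1."""
    x2 = x2_init.copy()
    for _ in range(n_iter):
        resid = np.array([dt_obs[i] - (travel_time(x2, s) - travel_time(x1, s))
                          for i, s in enumerate(stations)])
        # Partial derivatives of t2 with respect to event-2 coordinates (x1 held fixed here).
        G = np.array([(x2 - s) / (np.linalg.norm(x2 - s) * V) for s in stations])
        dx, *_ = np.linalg.lstsq(G, resid, rcond=None)
        x2 += dx
    return x2

stations = np.array([[0, 0, 0], [10, 2, 0], [3, 12, 0], [-8, 6, 0], [5, -9, 0]], float)
x1 = np.array([1.0, 2.0, 8.0])            # reference event (km)
x2_true = np.array([1.3, 1.6, 8.4])       # true location of the second event
dt_obs = np.array([travel_time(x2_true, s) - travel_time(x1, s) for s in stations])
print(relocate_pair(stations, dt_obs, x1, x2_init=x1.copy()))   # converges toward x2_true
```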

  1. Location of early aftershocks of the 2004 Mid-Niigata Prefecture Earthquake (M = 6.8) in central Japan using seismogram envelopes as templates

    NASA Astrophysics Data System (ADS)

    Kosuga, M.

    2013-12-01

    The location of early aftershocks is very important for obtaining information on the mainshock fault; however, it is often difficult due to the long-lasting coda wave of the mainshock and the successive occurrence of aftershocks. To overcome this difficulty, we developed a method of location using seismogram envelopes as templates, and applied the method to the early aftershock sequence of the 2004 Mid-Niigata Prefecture (Chuetsu) Earthquake (M = 6.8) in central Japan. The location method consists of three steps. The first step is the calculation of cross-correlation coefficients between the continuous (target) envelope and template envelopes. We prepare envelopes by taking the logarithm of root-mean-squared amplitude of band-pass filtered seismograms. We perform the calculation by shifting the time window to obtain a set of cross-correlation values for each template. The second step is event detection (selection of templates) and magnitude estimation. We search for the events in descending order of cross-correlation in a time window excluding the dead times around the previously detected events. Magnitude is calculated by the amplitude ratio of target and template envelopes. The third step is the relative location of the event with respect to the selected template. We applied this method to the Chuetsu earthquake, a large inland earthquake with extensive aftershock activity. The number of detected events depends on the number of templates, the frequency range, and the threshold value of cross-correlation. We set the threshold as 0.5 by referring to the histogram of cross-correlation. During the one-hour period after the mainshock, we could detect more events than are listed in the JMA catalog. The locations of detected events are generally close to the catalog locations. Though we should improve the methods of relative location and magnitude estimation, we conclude that the proposed method works adequately even just after the mainshock of a large inland earthquake. Acknowledgement: We thank JMA, NIED, and the University of Tokyo for
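
    The first step (envelope cross-correlation) can be sketched on synthetic data: build log-RMS envelopes, slide a template envelope along the continuous envelope with normalized cross-correlation, and flag windows exceeding the 0.5 threshold used in the study. The window length and the synthetic traces below are assumptions for illustration.

```python
# Hedged sketch of envelope template matching on synthetic data: build log-RMS envelopes,
# slide a template along the continuous envelope with normalized cross-correlation, and
# declare a detection where the coefficient exceeds the 0.5 threshold used in the study.
import numpy as np

def log_rms_envelope(trace, win=50):
    """Logarithm of the running root-mean-square amplitude."""
    sq = np.convolve(trace ** 2, np.ones(win) / win, mode="same")
    return np.log10(np.sqrt(sq) + 1e-12)

def sliding_correlation(cont_env, tmpl_env):
    """Normalized cross-correlation of the template against every window of the envelope."""
    n = len(tmpl_env)
    t = (tmpl_env - tmpl_env.mean()) / tmpl_env.std()
    cc = np.empty(len(cont_env) - n + 1)
    for i in range(len(cc)):
        w = cont_env[i:i + n]
        cc[i] = np.dot(t, (w - w.mean()) / (w.std() + 1e-12)) / n
    return cc

rng = np.random.default_rng(3)
template_wave = rng.normal(0, 1, 400) * np.exp(-np.linspace(0, 5, 400))   # decaying burst
continuous = rng.normal(0, 0.1, 5000)
continuous[2000:2400] += 0.7 * template_wave                               # hidden event

cc = sliding_correlation(log_rms_envelope(continuous), log_rms_envelope(template_wave))
print("best match at sample", cc.argmax(), "cc =", round(cc.max(), 2), "(detection if cc > 0.5)")
```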

  2. 78 FR 8063 - Safety Zones; Annual Firework Displays Within the Captain of the Port, Puget Sound Area of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... proposes to add three new fireworks events and to correct the location of five existing events to ensure... display locations have been added to area, and the title of the rule does not accurately reflect what is.... This rule proposes to add the following firework displays: Port Ludlow Fireworks, latitude 47[deg] 55...

  3. Heavy precipitation events in northern Switzerland

    NASA Astrophysics Data System (ADS)

    Giannakaki, Paraskevi; Martius, Olivia

    2013-04-01

    Heavy precipitation events in the Alpine region often cause floods, rock-falls and mud slides with severe consequences for population and economy. Breaking synoptic Rossby waves located over western Europe play a central role in triggering such heavy rain events in southern Switzerland (e.g. Massacand et al. 1998). In contrast, synoptic scale structures triggering heavy precipitation on the north side of the Swiss Alps and orographic effects have so far not been studied comprehensively. An observation based high resolution precipitation data set for Switzerland and the Alps (MeteoSwiss) is used to identify heavy precipitation events affecting the north side of the Swiss Alps for the time period 1961-2010. For these events a detailed statistical and dynamical analysis of the upper level flow is conducted using ECMWF's ERA-40 and ERA-Interim reanalysis data sets. For the analysis, the north side of the Swiss Alps is divided into two investigation areas, north-eastern and western Switzerland, following the Swiss climate change scenarios (Bey et al. 2011). A subjective classification of upper level structures triggering heavy precipitation events in the areas of interest is presented. Four classes are defined based on the orientation and formation of the dynamical tropopause during extreme events in the northern part of Switzerland and its sub-regions. The analysis is extended by a climatology of breaking waves and cut-offs following the method of Wernli and Sprenger (2007) to examine their presence and location during extreme events. References Bey I., Croci-Maspoli M., Fuhrer J., Kull C, Appenzeller C., Knutti R. and Schär C. Swiss Climate Change Scenarios CH2011, C2SM, MeteoSwiss, ETH, NCCR Climate, OcCC (2011), http://dx.doi.org/10.3929/ethz-a-006720559 Massacand A., H. Wernli, and H.C. Davies, 1998. Heavy precipitation on the Alpine South side: An upper-level precursor. Geophys. Res. Lett., 25, 1435-1438. MeteoSwiss 2011. Documentation of Meteoswiss grid-data products

  4. Improved ultrasound transducer positioning by fetal heart location estimation during Doppler based heart rate measurements.

    PubMed

    Hamelmann, Paul; Vullings, Rik; Schmitt, Lars; Kolen, Alexander F; Mischi, Massimo; van Laar, Judith O E H; Bergmans, Jan W M

    2017-09-21

    Doppler ultrasound (US) is the most commonly applied method to measure the fetal heart rate (fHR). When the fetal heart is not properly located within the ultrasonic beam, fHR measurements often fail. As a consequence, clinical staff need to reposition the US transducer on the maternal abdomen, which can be a time consuming and tedious task. In this article, a method is presented to aid clinicians with the positioning of the US transducer to produce robust fHR measurements. A maximum likelihood estimation (MLE) algorithm is developed, which provides information on fetal heart location using the power of the Doppler signals received in the individual elements of a standard US transducer for fHR recordings. The performance of the algorithm is evaluated with simulations and in vitro experiments performed on a beating-heart setup. Both the experiments and the simulations show that the heart location can be accurately determined with an error of less than 7 mm within the measurement volume of the employed US transducer. The results show that the developed algorithm can be used to provide accurate feedback on fetal heart location for improved positioning of the US transducer, which may lead to improved measurements of the fHR.
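
    A simplified stand-in for the estimator: grid-search the heart position that best explains the per-element Doppler powers under an assumed Gaussian element-sensitivity model. The element layout, beam width, and noise level below are hypothetical, and this least-squares fit only approximates the maximum likelihood estimator derived in the paper.

```python
# Simplified sketch of locating the fetal heart from per-element Doppler power: grid-search
# the source position that best explains the powers under an assumed Gaussian sensitivity model.
import numpy as np

elements = np.array([[x, y] for x in (-30, -10, 10, 30) for y in (-30, -10, 10, 30)], float)  # mm
SIGMA = 35.0  # assumed beam-sensitivity width (mm)

def predicted_power(src, gain=1.0):
    d = np.linalg.norm(elements - src, axis=1)
    return gain * np.exp(-d ** 2 / (2 * SIGMA ** 2))

def locate(powers, grid=np.linspace(-40, 40, 81)):
    best, best_err = None, np.inf
    for x in grid:
        for y in grid:
            pred = predicted_power(np.array([x, y]))
            gain = powers @ pred / (pred @ pred)          # least-squares amplitude fit
            err = np.sum((powers - gain * pred) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

true_pos = np.array([12.0, -5.0])
rng = np.random.default_rng(7)
observed = predicted_power(true_pos, gain=2.0) + rng.normal(0, 0.02, len(elements))
print("estimated heart position (mm):", locate(observed))   # close to (12, -5)
```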

  5. Error properties of Argos satellite telemetry locations using least squares and Kalman filtering.

    PubMed

    Boyd, Janice D; Brightsmith, Donald J

    2013-01-01

    Study of animal movements is key for understanding their ecology and facilitating their conservation. The Argos satellite system is a valuable tool for tracking species which move long distances, inhabit remote areas, and are otherwise difficult to track with traditional VHF telemetry and are not suitable for GPS systems. Previous research has raised doubts about the magnitude of position errors quoted by the satellite service provider CLS. In addition, no peer-reviewed publications have evaluated the usefulness of the CLS supplied error ellipses nor the accuracy of the new Kalman filtering (KF) processing method. Using transmitters hung from towers and trees in southeastern Peru, we show the Argos error ellipses generally contain <25% of the true locations and therefore do not adequately describe the true location errors. We also find that KF processing does not significantly increase location accuracy. The errors for both LS and KF processing methods were found to be lognormally distributed, which has important repercussions for error calculation, statistical analysis, and data interpretation. In brief, "good" positions (location codes 3, 2, 1, A) are accurate to about 2 km, while 0 and B locations are accurate to about 5-10 km. However, due to the lognormal distribution of the errors, larger outliers are to be expected in all location codes and need to be accounted for in the user's data processing. We evaluate five different empirical error estimates and find that 68% lognormal error ellipses provided the most useful error estimates. Longitude errors are larger than latitude errors by a factor of 2 to 3, supporting the use of elliptical error ellipses. Numerous studies over the past 15 years have also found fault with the CLS-claimed error estimates yet CLS has failed to correct their misleading information. We hope this will be reversed in the near future.

  6. Dynamics of pollutant indicators during flood events in a small river under strong anthropogenic pressures

    NASA Astrophysics Data System (ADS)

    Brion, Natacha; Carbonnel, Vincent; Elskens, Marc; Claeys, Philippe; Verbanck, Michel A.

    2017-04-01

    In densely populated regions, human activities profoundly modify natural water circulation as well as water quality, with increased hydrological risks (floods, droughts,…) and chemical hazards (untreated sewage releases, industrial pollution,…) as a consequence. In order to assess water and pollutants dynamics and their mass-balance in strongly modified river systems, it is important to take into account high flow events, as a significant fraction of water and pollutants loads may occur during these short events, which are generally underrepresented in classical mass balance studies. A good example of strongly modified river systems is the Zenne river in and around the city of Brussels (Belgium). The Zenne River (Belgium) is a rather small but dynamic rain-fed river (about 10 m3/s on average) that is under the influence of strong contrasting anthropogenic pressures along its stretch. While the upstream part of its basin is rather characterized by agricultural land-use, urban and industrial areas dominate the downstream part. In particular, the city of Brussels (1.1M inhabitants) discharges in the Zenne River amounts of wastewater that are large compared to the natural riverine flow. In order to assess water and pollutants dynamics and their mass-balance in the Zenne hydrographic network, we followed water flows and concentrations of several water quality tracers during several flood episodes with an hourly frequency and at different locations along the stretch of the River. These parameters were chosen as indicators of a whole range of pollutions and anthropogenic activities. Knowledge of the high-frequency pollutants dynamics during floods is required for establishing accurate mass-balances of these elements. We thus report here the dynamics of selected parameters during entire flood events, from the baseline to the decreasing phase and at hourly frequency. Dynamics at contrasting locations, in agricultural or urban environments are compared. In particular, the

  7. Is GPS telemetry location error screening beneficial?

    USGS Publications Warehouse

    Ironside, Kirsten E.; Mattson, David J.; Arundel, Terry; Hansen, Jered R.

    2017-01-01

    The accuracy of global positioning system (GPS) locations obtained from study animals tagged with GPS monitoring devices has been a concern as to the degree it influences assessments of movement patterns, space use, and resource selection estimates. Many methods have been proposed for screening data to retain the most accurate positions for analysis, based on dilution of precision (DOP) measures, and whether the position is a two-dimensional or three-dimensional fix. Here we further explore the utility of these measures by testing a Telonics GEN3 GPS collar's positional accuracy across a wide range of environmental conditions. We found the relationship between location error and fix dimension and DOP metrics extremely weak (adjusted r² ∼ 0.01) in our study area. Environmental factors such as topographic exposure, canopy cover, and vegetation height explained more of the variance (adjusted r² = 15.08%). Our field testing covered sites where sky-view was so limited it affected GPS performance to the degree fix attempts failed frequently (fix success rates ranged 0.00–100.00% over 67 sites). Screening data using PDOP did not effectively reduce the location error in the remaining dataset. Removing two-dimensional fixes reduced the mean location error by 10.95 meters, but also resulted in a 54.50% data reduction. Therefore, screening data under the range of conditions sampled here would reduce information on animal movement with minor improvements in accuracy and potentially introduce bias towards more open terrain and vegetation.

  8. Automated identification and modeling aseismic slip events on Kilauea Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Desmarais, E. K.; Segall, P.; Miklius, A.

    2006-12-01

    .5 - 8.5 km, thus constrain the depth of the slow earthquakes to comparable depths. The triggering of microearthquakes implies that there is a finite probability that a larger earthquake could be triggered, given appropriate stress conditions. In order to better constrain the locations of the slow slip events based solely on geodetic observations, we expand on the simple uniform slip models by adding the effects of distributed slip, layered elastic structure, and topography. There are many difficulties in observing slow slip events on Kilauea volcano. The GPS network only provides displacements on land, which is primarily to the north of the largest slip. The vertical displacement field is essential to understanding the northward extent of the slip; however, slow slip events are observed primarily in the horizontal components, which have smaller noise levels (~ 3 mm). The maximum vertical deformation from the largest event (2005) was very small (± 9 mm), about the same size as the typical vertical noise. We are exploring the possibility that tiltmeters will allow sufficiently accurate measurements to help identify the northern extent of the slip surface.

  9. Cascading events in linked ecological and socioeconomic systems

    USGS Publications Warehouse

    Peters, Debra P.C.; Sala, O.E.; Allen, Craig D.; Covich, A.; Brunson, M.

    2007-01-01

    Cascading events that start at small spatial scales and propagate non-linearly through time to influence larger areas often have major impacts on ecosystem goods and services. Events such as wildfires and hurricanes are increasing in frequency and magnitude as systems become more connected through globalization processes. We need to improve our understanding of these events in order to predict their occurrence, minimize potential impacts, and allow for strategic recovery. Here, we synthesize information about cascading events in systems located throughout the Americas. We discuss a variety of examples of cascading events that share a common feature: they are often driven by linked ecological and human processes across scales. In this era of globalization, we recommend studies that explicitly examine connections across scales and examine the role of connectivity among non-contiguous as well as contiguous areas.

  10. Identification and location of catenary insulator in complex background based on machine vision

    NASA Astrophysics Data System (ADS)

    Yao, Xiaotong; Pan, Yingli; Liu, Li; Cheng, Xiao

    2018-04-01

    Locating the insulator precisely is an important prerequisite for fault detection. Because current algorithms for locating insulators in catenary inspection images are not accurate, a target recognition and localization method based on binocular vision combined with SURF features is proposed. First, since the insulator lies in a complex environment, SURF features are used to achieve coarse positioning of the target; then the binocular vision principle is used to calculate the 3D coordinates of the coarsely located object, achieving target recognition and fine location; finally, the 3D coordinate of the object's center of mass is preserved and transferred to the inspection robot to control its detection position. Experimental results demonstrate that the proposed method has better recognition efficiency and accuracy, can successfully identify the target, and has definite application value.
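
    The binocular step can be sketched with OpenCV: match features between the left and right images and triangulate the matched points to obtain 3D coordinates and a centre of mass. ORB is used here in place of SURF (which sits in OpenCV's non-free contrib module), and the projection matrices P1 and P2 are assumed to come from a prior stereo calibration; image paths are placeholders.

```python
# Hedged sketch of the binocular-vision step: match features between left and right images
# and triangulate 3D points. ORB stands in for SURF here; P1 and P2 are assumed to be the
# calibrated camera projection matrices of the stereo rig.
import cv2
import numpy as np

def triangulate_matches(img_left, img_right, P1, P2, max_matches=50):
    orb = cv2.ORB_create(nfeatures=1000)
    k1, d1 = orb.detectAndCompute(img_left, None)
    k2, d2 = orb.detectAndCompute(img_right, None)
    matches = sorted(cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2),
                     key=lambda m: m.distance)[:max_matches]
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches]).T   # 2 x N
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches]).T
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)             # 4 x N homogeneous points
    X = (X_h[:3] / X_h[3]).T                                     # N x 3 world coordinates
    return X, X.mean(axis=0)                                     # points and centre of mass

# Usage (image files and projection matrices are placeholders for calibrated stereo data):
# left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# points3d, centroid = triangulate_matches(left, right, P1, P2)
```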

  11. An Accurate Co-registration Method for Airborne Repeat-pass InSAR

    NASA Astrophysics Data System (ADS)

    Dong, X. T.; Zhao, Y. H.; Yue, X. J.; Han, C. M.

    2017-10-01

    Interferometric Synthetic Aperture Radar (InSAR) technology plays a significant role in topographic mapping and surface deformation detection. Compared with spaceborne repeat-pass InSAR, airborne repeat-pass InSAR solves the problems of long revisit times and low-resolution images. Because it can obtain abundant information flexibly, accurately, and quickly, airborne repeat-pass InSAR is valuable for deformation monitoring of the shallow ground. To obtain precise ground elevation information and the interferometric coherence needed for deformation monitoring from master and slave images, accurate co-registration must be ensured. Because of the side-looking geometry, repeated observation paths, and long baselines, the initial slant ranges and flight heights differ considerably between repeat flight paths. These differences cause pixels located at identical coordinates in the master and slave images to correspond to ground resolution cells of different sizes. The mismatch is most pronounced in the long-slant-range parts of the master and slave images. To resolve the differing pixel sizes and obtain accurate co-registration results, a new method based on the Range-Doppler (RD) imaging model is proposed. VV-polarization C-band airborne repeat-pass InSAR images were used in the experiment. The experimental results show that the proposed method leads to superior co-registration accuracy.

  12. Event-Based User Classification in Weibo Media

    PubMed Central

    Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as a real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some events. Users who post different contents and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events has been a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235

  13. Large arterial occlusive strokes as a medical emergency: need to accurately predict clot location.

    PubMed

    Vanacker, Peter; Faouzi, Mohamed; Eskandari, Ashraf; Maeder, Philippe; Meuli, Reto; Michel, Patrik

    2017-10-01

    Endovascular treatment for acute ischemic stroke with a large intracranial occlusion was recently shown to be effective. Timely knowledge of the presence, site, and extent of arterial occlusions in the ischemic territory has the potential to influence patient selection for endovascular treatment. We aimed to find predictors of large vessel occlusive strokes, on the basis of available demographic, clinical, radiological, and laboratory data in the emergency setting. Patients enrolled in ASTRAL registry with acute ischemic stroke and computed tomography (CT)-angiography within 12 h of stroke onset were selected and categorized according to occlusion site. Easily accessible variables were used in a multivariate analysis. Of 1645 patients enrolled, a significant proportion (46.2%) had a large vessel occlusion in the ischemic territory. The main clinical predictors of any arterial occlusion were in-hospital stroke [odds ratios (OR) 2.1, 95% confidence interval 1.4-3.1], higher initial National Institute of Health Stroke Scale (OR 1.1, 1.1-1.2), presence of visual field defects (OR 1.9, 1.3-2.6), dysarthria (OR 1.4, 1.0-1.9), or hemineglect (OR 2.0, 1.4-2.8) at admission and atrial fibrillation (OR 1.7, 1.2-2.3). Further, the following radiological predictors were identified: time-to-imaging (OR 0.9, 0.9-1.0), early ischemic changes (OR 2.3, 1.7-3.2), and silent lesions on CT (OR 0.7, 0.5-1.0). The area under the curve for this analysis was 0.85. Looking at different occlusion sites, National Institute of Health Stroke Scale and early ischemic changes on CT were independent predictors in all subgroups. Neurological deficits, stroke risk factors, and CT findings accurately identify acute ischemic stroke patients at risk of symptomatic vessel occlusion. Predicting the presence of these occlusions may impact emergency stroke care in regions with limited access to noninvasive vascular imaging.

  14. Out of place, out of mind: Schema-driven false memory effects for object-location bindings.

    PubMed

    Lew, Adina R; Howe, Mark L

    2017-03-01

    Events consist of diverse elements, each processed in specialized neocortical networks, with temporal lobe memory systems binding these elements to form coherent event memories. We provide a novel theoretical analysis of an unexplored consequence of the independence of memory systems for elements and their bindings, one that raises the paradoxical prediction that schema-driven false memories can act solely on the binding of event elements despite the superior retrieval of individual elements. This is because if two or more schema-relevant elements are bound together in unexpected conjunctions, the unexpected conjunction will increase attention during encoding to both the elements and their bindings, but only the bindings will receive competition with evoked schema-expected bindings. We test our model by examining memory for object-location bindings in recognition (Study 1) and recall (Studies 2 and 3) tasks. After studying schema-relevant objects in unexpected locations (e.g., pan on a stool in a kitchen scene), participants who then viewed these objects in expected locations (e.g., pan on stove) at test were more likely to falsely remember this object-location pairing as correct, compared with participants that viewed a different unexpected object-location pairing (e.g., pan on floor). In recall, participants were more likely to correctly remember individual schema-relevant objects originally viewed in unexpected, as opposed to expected locations, but were then more likely to misplace these items in the original room scene to expected places, relative to control schema-irrelevant objects. Our theoretical analysis and novel paradigm provide a tool for investigating memory distortions acting on binding processes. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the current most challenging topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work is aimed at decreasing this limitation through integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort through discovering events with lower signatures by exploring the sensors mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Accurate Sample Time Reconstruction of Inertial FIFO Data.

    PubMed

    Stieber, Sebastian; Dorsch, Rainer; Haubelt, Christian

    2017-12-13

    In the context of modern cyber-physical systems, the accuracy of underlying sensor data plays an increasingly important role in sensor data fusion and feature extraction. The raw events of multiple sensors have to be aligned in time to enable high-quality sensor fusion results. However, the growing number of simultaneously connected sensor devices makes energy-saving data acquisition and processing more and more difficult. Hence, most modern sensors offer a first-in-first-out (FIFO) interface to store multiple data samples and to relax timing constraints when handling multiple sensor devices. However, using the FIFO interface increases the negative influence of individual clock drifts (introduced by fabrication inaccuracies, temperature changes, and wear-out effects) on the sampling data reconstruction. Furthermore, additional timing offset errors due to communication and software latencies increase with a growing number of sensor devices. In this article, we present an approach for accurate sample time reconstruction, independent of the actual clock drift, with the help of an internal sensor timer. Such timers are already available in modern sensors manufactured in micro-electromechanical systems (MEMS) technology. The presented approach focuses on calculating accurate time stamps using the sensor FIFO interface in a forward-only processing manner as a robust and energy-saving solution. The proposed algorithm is able to lower the overall standard deviation of reconstructed sampling periods below 40 μs, while run-time savings of up to 42% are achieved compared to single sample acquisition.
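
    As a rough illustration of forward-only sample time reconstruction from FIFO batches, the sketch below distributes the samples of one batch between two internal sensor-timer readings; the function, its arguments, and the even spacing within a batch are assumptions for illustration, not the published algorithm.

    ```python
    def reconstruct_timestamps(prev_timer, curr_timer, n_samples, timer_tick_s):
        """Distribute n_samples FIFO entries evenly between two sensor-timer readings.

        prev_timer, curr_timer : internal sensor timer counts captured at the
                                 previous and current FIFO read-out.
        timer_tick_s           : duration of one timer count in seconds.
        Returns one timestamp per sample, in the sensor's own time base.
        """
        elapsed = (curr_timer - prev_timer) * timer_tick_s
        period = elapsed / n_samples          # measured average sampling period of this batch
        # Forward-only: each sample gets the previous timer reading plus an integer
        # multiple of the measured period, so the result does not depend on the
        # nominal output data rate or on host-side read latency.
        t0 = prev_timer * timer_tick_s
        return [t0 + (i + 1) * period for i in range(n_samples)]

    # Toy usage: 10 samples read between two timer snapshots of a 25 kHz timer.
    print(reconstruct_timestamps(prev_timer=1000, curr_timer=1250, n_samples=10,
                                 timer_tick_s=1 / 25_000))
    ```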

  17. Engineering assessment of low-level liquid waste disposal caisson locations at the 618-11 Burial Grounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, S.J.; Fischer, D.D.; Crawford, R.C.

    1982-06-01

    Rockwell Hanford Operations is currently involved in an extensive effort to perform interim ground surface stabilization activities at retired low-level waste burial grounds located at the Hanford Site, Richland, Washington. The principal objective of these activities is to promote increased occupational and radiological safety at burial grounds. Interim stabilization activities include: (1) load testing (traversing burial ground surfaces with heavy equipment to promote incipient collapse of void spaces within the disposal structure and overburden), (2) barrier placement (placement of a ≥0.6 m soil barrier over existing overburden), and (3) revegetation (establishment of shallow-rooted vegetation on the barrier to mitigate deep-rooted plant growth and to reduce erosion). Low-level waste disposal caissons were used in 300 Area Burial Grounds as interment structures for containerized liquid wastes. These caissons, by virtue of their contents, design, and methods of closure, require long-term performance evaluation. As an initial activity to evaluate long-term performance, the accurate location of these structures is required. This topical report summarizes engineering activities used to locate caissons in the subsurface environment at the Burial Ground. Activities were conducted to locate caissons during surface stabilization activities. The surface locations were marked, photographed, and recorded on an as-built engineering drawing. The recorded location of these caissons will augment long-term observations of confinement structure and engineered surface barrier performance. In addition, accurate caisson location will minimize occupational risk during monitoring and observation activities periodically conducted at the burial ground.

  18. Location, Location, Location!

    ERIC Educational Resources Information Center

    Ramsdell, Kristin

    2004-01-01

    Of prime importance in real estate, location is also a key element in the appeal of romances. Popular geographic settings and historical periods sell, unpopular ones do not--not always with a logical explanation, as the author discovered when she conducted a survey on this topic last year. (Why, for example, are the French Revolution and the…

  19. The cause of outliers in electromagnetic pulse (EMP) locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenimore, Edward E.

    2014-10-02

    We present methods to calculate the location of EMP pulses when observed by 5 or more satellites. Simulations show that, even with a good initial guess and fitting a location to all of the data, there are sometimes outlier results whose locations are much worse than in most cases. By comparing simulations using different ionospheric transfer functions (ITFs), it appears that the outliers are caused by not including the additional path length due to refraction, rather than by not including higher order terms in the Appleton-Hartree equation. We suggest ways that the outliers can be corrected. These correction methods require one to use an electron density profile along the line of sight from the event to the satellite rather than using the total electron content (TEC) to characterize the ionosphere.
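
    A simplified sketch of the kind of multi-satellite least-squares fit discussed above; the satellite geometry, the single constant ionospheric delay term, and the solver choice are illustrative assumptions, and no refraction path-length correction of the kind the paper recommends is included.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    C = 299_792.458  # vacuum speed of light, km/s

    def locate_emp(sat_pos, arrival_times, x0, iono_delay_s=0.0):
        """Fit an event location and emission time to arrival times at >= 5 satellites.

        sat_pos       : (n, 3) satellite positions in km.
        arrival_times : (n,) observed arrival times in seconds.
        x0            : initial guess [x, y, z, t_emit].
        iono_delay_s  : crude per-path ionospheric delay placeholder (the paper
                        argues for a full electron-density profile along each
                        line of sight instead).
        """
        def residuals(p):
            pos, t_emit = p[:3], p[3]
            ranges = np.linalg.norm(sat_pos - pos, axis=1)
            predicted = t_emit + ranges / C + iono_delay_s
            return predicted - arrival_times

        return least_squares(residuals, x0).x

    # Toy usage with noise-free synthetic data from 5 satellites; the fit should
    # recover approximately the true values.
    sats = np.array([[20_000, 0, 10_000], [-15_000, 12_000, 18_000],
                     [5_000, -20_000, 15_000], [0, 18_000, 22_000],
                     [-8_000, -9_000, 25_000]], dtype=float)
    true_xyz, true_t = np.array([3_000.0, 3_000.0, 4_800.0]), 0.02
    times = true_t + np.linalg.norm(sats - true_xyz, axis=1) / C
    print(locate_emp(sats, times, x0=np.array([0.0, 0.0, 0.0, 0.0])))
    ```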

  20. Influence of ionotropic receptor location on their dynamics at glutamatergic synapses.

    PubMed

    Allam, Sushmita L; Bouteiller, Jean-Marie C; Hu, Eric; Greget, Renaud; Ambert, Nicolas; Bischoff, Serge; Baudry, Michel; Berger, Theodore W

    2012-01-01

    In this paper we study the effects of the location of ionotropic receptors, especially AMPA and NMDA receptors, on their function at excitatory glutamatergic synapses. Because few computational models allow evaluating the influence of receptor location on state transitions and receptor dynamics, we present an elaborate computational model of a glutamatergic synapse that takes into account detailed parametric models of ionotropic receptors along with glutamate diffusion within the synaptic cleft. Our simulation results underscore the importance of the widespread distribution of AMPA receptors, which is required to avoid massive desensitization of these receptors following a single glutamate release event, while NMDA receptor location is potentially optimal relative to the glutamate release site, thus emphasizing the contribution of location-dependent effects of the two major ionotropic receptors to synaptic efficacy.

  1. Improving prospective memory performance with future event simulation in traumatic brain injury patients.

    PubMed

    Mioni, Giovanna; Bertucci, Erica; Rosato, Antonella; Terrett, Gill; Rendell, Peter G; Zamuner, Massimo; Stablum, Franca

    2017-06-01

    Previous studies have shown that traumatic brain injury (TBI) patients have difficulties with prospective memory (PM). Considering that PM is closely linked to independent living, it is of primary interest to develop strategies that can improve PM performance in TBI patients. This study employed the Virtual Week task as a measure of PM, and we included future event simulation to boost PM performance. Study 1 evaluated the efficacy of the strategy and investigated possible practice effects. Twenty-four healthy participants performed Virtual Week in a no-strategy condition, and 24 healthy participants performed it in a mixed condition (no strategy, then future event simulation). In Study 2, 18 TBI patients completed the mixed condition of Virtual Week and were compared with the 24 healthy controls who undertook the mixed condition of Virtual Week in Study 1. All participants also completed a neuropsychological evaluation to characterize the groups on level of cognitive functioning. Study 1 showed that participants in the future event simulation condition outperformed participants in the no-strategy condition, and these results were not attributable to practice effects. Results of Study 2 showed that TBI patients performed PM tasks less accurately than controls, but that future event simulation can substantially reduce TBI-related deficits in PM performance. The future event simulation strategy also improved the controls' PM performance. These studies showed the value of the future event simulation strategy in improving PM performance in healthy participants as well as in TBI patients. TBI patients performed PM tasks less accurately than controls, confirming prospective memory impairment in these patients. Participants in the future event simulation condition outperformed participants in the no-strategy condition. Future event simulation can substantially reduce TBI-related deficits in PM performance. The future event simulation strategy also improved the controls' PM performance.

  2. Performance of a Micro-Strip Gas Chamber for event wise, high rate thermal neutron detection with accurate 2D position determination

    NASA Astrophysics Data System (ADS)

    Mindur, B.; Alimov, S.; Fiutowski, T.; Schulz, C.; Wilpert, T.

    2014-12-01

    A two-dimensional (2D) position sensitive detector for neutron scattering applications based on low-pressure gas amplification and micro-strip technology was built and tested with innovative readout electronics and a data acquisition system. This detector contains a thin solid neutron converter and was developed for time- and thus wavelength-resolved neutron detection in single-event counting mode, which improves the image contrast in comparison with integrating detectors. The prototype detector of a Micro-Strip Gas Chamber (MSGC) was built with a solid natGd/CsI thermal neutron converter for spatial resolutions of about 100 μm and counting rates up to 10^7 neutrons/s. For attaining very high spatial resolutions and counting rates via micro-strip readout with centre-of-gravity evaluation of the signal amplitude distributions, a fast, channel-wise, self-triggering ASIC was developed. The front-end chips (MSGCROCs), which are the very first signal processing components, are read out into powerful ADC-FPGA boards for on-line data processing and thereafter via a Gigabit Ethernet link into the data receiving PC. The workstation PC is controlled by a modular, high-performance dedicated software suite. Such a fast and accurate system is crucial for efficient radiography/tomography, diffraction or imaging applications based on a high-flux thermal neutron beam. In this paper a brief description of the detector concept with its operation principles, readout electronics requirements and design, together with the signal processing stages performed in hardware and software, is presented. The neutron test beam conditions and measurement results are reported in more detail. The focus of this paper is on the system integration, the two-dimensional spatial resolution, the time resolution of the readout system and the imaging capabilities of the overall setup. The detection efficiency of the detector prototype is estimated as well.
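
    The centre-of-gravity evaluation of strip amplitude distributions mentioned above amounts to a weighted centroid; the short sketch below illustrates the idea, with the strip pitch and baseline handling chosen arbitrarily rather than taken from the MSGCROC/FPGA implementation.

    ```python
    def centre_of_gravity(amplitudes, pitch_um=100.0, baseline=0.0):
        """Estimate hit position from the amplitude distribution on adjacent strips.

        amplitudes : per-strip pulse heights of one cluster (pedestal-corrected,
                     or corrected here via a constant `baseline`).
        pitch_um   : strip pitch in micrometres.
        Returns the charge-centroid position in micrometres, measured from strip 0.
        """
        weights = [max(a - baseline, 0.0) for a in amplitudes]
        total = sum(weights)
        if total == 0.0:
            raise ValueError("empty cluster")
        return pitch_um * sum(i * w for i, w in enumerate(weights)) / total

    # Toy cluster spread over three strips: centroid lands near strip 1.12, ~112 um.
    print(centre_of_gravity([5.0, 40.0, 12.0]))
    ```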

  3. Lightning location system supervising Swedish power transmission network

    NASA Technical Reports Server (NTRS)

    Melin, Stefan A.

    1991-01-01

    For electric utilities, the ability to prevent or minimize lightning damage to personnel and power systems is of great importance. Therefore, the Swedish State Power Board has been using data since 1983 from a nationwide lightning location system (LLS) for accurately locating lightning ground strikes. Lightning data are distributed and presented on color graphic displays at regional power network control centers as well as at the national power system control center for optimal data use. The main objectives for the use of LLS data are: supervising the power system for optimal and safe use of the transmission and generating capacity during periods of thunderstorms; providing a warning service to maintenance and service crews at power lines and substations so that operations that are hazardous during lightning can be suspended; rapidly positioning emergency crews to locate network damage in areas of detected lightning; and post-analysis of power outages and transmission faults in relation to lightning, using archived lightning data to determine appropriate design and insulation levels of equipment. Staff have found LLS data useful and economically justified, since the availability of the power system has increased, as has the level of personnel safety.

  4. Seismotectonic analysis of the Andaman Sea region from high-precision teleseismic double-difference locations

    NASA Astrophysics Data System (ADS)

    Diehl, T.; Waldhauser, F.; Schaff, D. P.; Engdahl, E. R.

    2009-12-01

    The Andaman Sea region in the Northeast Indian Ocean is characterized by a complex extensional back-arc basin, which connects the Sumatra Fault System in the south with the Sagaing fault in the north. The Andaman back-arc is generally classified as a convergent pull-apart basin (leaky transform) rather than a typical extensional back-arc basin. Oblique subduction of the Indian-Australian plate results in strike-slip faulting parallel to the trench axis, formation of a sliver plate and back-arc pull-apart extension. Active spreading occurs predominantly along a NE-SW oriented ridge segment bisecting the Central Andaman basin at the SW end of the back-arc. Existing models of the Andaman back-arc system are mainly derived from bathymetry maps, seismic surveys, magnetic anomalies, and seismotectonic analyses. The latter are typically based on global bulletin locations provided by the NEIC or ISC. These bulletin locations, however, usually have low spatial resolution (especially in focal depth), which hampers a detailed seismotectonic interpretation. In order to better study the seismotectonic processes of the Andaman Sea region, specifically its role during the recent 2004 M9.3 earthquake, we improve on existing hypocenter locations by applying the double-difference algorithm to regional and teleseismic data. Differential times used for the relocation process are computed from phase picks listed in the ISC and NEIC bulletins, and from cross-correlating regional and teleseismic waveforms. EHB hypocenter solutions are used as reference locations to improve the initial locations in the ISC/NEIC catalog during double-difference processing. The final DD solutions show significantly reduced scatter in event locations along the back-arc ridge. The various observed focal mechanisms tend to cluster by type and, in addition, the structure and orientation of individual clusters are generally consistent with available CMT solutions for individual events and reveal the detailed
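
    For context, the double-difference residual minimized in such relocations is commonly written as below (the standard formulation after Waldhauser and Ellsworth, quoted from general practice rather than from this abstract):

    ```latex
    % Double-difference residual for event pair (i, j) observed at station k:
    \[
      dr_k^{ij} = \left(t_k^{i} - t_k^{j}\right)^{\mathrm{obs}}
                - \left(t_k^{i} - t_k^{j}\right)^{\mathrm{cal}},
    \]
    % linearized in terms of perturbations of each hypocentre and origin time,
    % \Delta\mathbf{m} = (\Delta x, \Delta y, \Delta z, \Delta\tau):
    \[
      dr_k^{ij} = \frac{\partial t_k^{i}}{\partial \mathbf{m}^{i}}\,\Delta\mathbf{m}^{i}
                - \frac{\partial t_k^{j}}{\partial \mathbf{m}^{j}}\,\Delta\mathbf{m}^{j}.
    \]
    ```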

  5. Calculating the Motion and Direction of Flux Transfer Events with Cluster

    NASA Technical Reports Server (NTRS)

    Collado-Vega, Y. M.; Sibeck, D. G.

    2012-01-01

    For many years now, the interactions of the solar wind plasma with the Earth's magnetosphere have been one of the most important problems in Space Physics. It is very important that we understand these processes, because the high-energy particles and the solar wind energy that cross the magnetosphere could be responsible for serious damage to our technological systems. The solar wind is inherently a dynamic medium, and the particles' interaction with the Earth's magnetosphere can be steady or unsteady. Unsteady interactions include transient processes like bursty magnetic reconnection. Flux Transfer Events (FTEs) are magnetopause signatures that usually occur during transient times of reconnection. They exhibit bipolar signatures in the normal component of the magnetic field. We use multi-point timing analysis to determine the orientation and motion of flux transfer events (FTEs) detected by the four Cluster spacecraft on the high-latitude dayside and flank magnetopause during 2002 and 2003. During these years, the distances between the Cluster spacecraft were greater than 1000 km, providing the tetrahedral configuration needed to select events and determine velocities. Each velocity and location will be examined in detail and compared to the velocities and locations determined by the predictions of the component and antiparallel reconnection models for event formation, orientation, motion, and acceleration for a wide range of spacecraft locations and solar wind conditions.
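
    A minimal sketch of the multi-point timing analysis mentioned above: assuming a planar boundary moving at constant velocity, the spacecraft separations and crossing times determine a slowness vector whose direction gives the orientation and whose inverse magnitude gives the speed. The positions and times in the example are made-up numbers, not Cluster data.

    ```python
    import numpy as np

    def timing_analysis(positions, times):
        """Four-spacecraft timing: solve (r_i - r_0) . m = t_i - t_0 for slowness m.

        positions : (4, 3) spacecraft positions in km.
        times     : (4,) boundary (e.g. FTE) crossing times in seconds.
        Returns (unit normal, speed in km/s) of the assumed planar structure.
        """
        dr = positions[1:] - positions[0]          # 3 x 3 matrix of baselines
        dt = times[1:] - times[0]
        m = np.linalg.solve(dr, dt)                # slowness vector, s/km
        speed = 1.0 / np.linalg.norm(m)
        normal = m * speed                         # unit vector along the motion
        return normal, speed

    # Toy usage: a regular tetrahedron crossed by a plane moving at 50 km/s along +x.
    r = np.array([[0, 0, 0], [1000, 0, 0], [0, 1000, 0], [0, 0, 1000]], dtype=float)
    t = r @ (np.array([1.0, 0.0, 0.0]) / 50.0)
    print(timing_analysis(r, t))   # -> (array([1., 0., 0.]), 50.0)
    ```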

  6. Refinements to the method of epicentral location based on surface waves from ambient seismic noise: introducing Love waves

    USGS Publications Warehouse

    Levshin, Anatoli L.; Barmin, Mikhail P.; Moschetti, Morgan P.; Mendoza, Carlos; Ritzwoller, Michael H.

    2012-01-01

    The purpose of this study is to develop and test a modification to a previous method of regional seismic event location based on Empirical Green’s Functions (EGFs) produced from ambient seismic noise. Elastic EGFs between pairs of seismic stations are determined by cross-correlating long ambient noise time-series recorded at the two stations. The EGFs principally contain Rayleigh- and Love-wave energy on the vertical and transverse components, respectively, and we utilize these signals between about 5 and 12 s period. The previous method, based exclusively on Rayleigh waves, may yield biased epicentral locations for certain event types with hypocentral depths between 2 and 5 km. Here we present theoretical arguments that show how Love waves can be introduced to reduce or potentially eliminate the bias. We also present applications of Rayleigh- and Love-wave EGFs to locate 10 reference events in the western United States. The separate Rayleigh and Love epicentral locations and the joint locations using a combination of the two waves agree to within 1 km distance, on average, but confidence ellipses are smallest when both types of waves are used.

  7. Destructive Single-Event Effects in Diodes

    NASA Technical Reports Server (NTRS)

    Casey, Megan C.; Lauenstein, Jean-Marie; Campola, Michael J.; Wilcox, Edward P.; Phan, Anthony M.; Label, Kenneth A.

    2017-01-01

    In this work, we discuss the observed single-event effects in a variety of types of diodes. In addition, we conduct failure analysis on several Schottky diodes that were heavy-ion irradiated. High- and low-magnification optical microscope images, infrared camera images, and scanning electron microscope images are used to identify and describe the failure locations.

  8. Time Accurate Unsteady Pressure Loads Simulated for the Space Launch System at a Wind Tunnel Condition

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, Bil; Streett, Craig L; Glass, Christopher E.; Schuster, David M.

    2015-01-01

    Using the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics code, an unsteady, time-accurate flow field about a Space Launch System configuration was simulated at a transonic wind tunnel condition (Mach = 0.9). Delayed detached eddy simulation combined with Reynolds-Averaged Navier-Stokes and a Spalart-Allmaras turbulence model was employed for the simulation. A second-order accurate time evolution scheme was used to simulate the flow field, with a minimum of 0.2 seconds of simulated time to as much as 1.4 seconds. Data were collected at 480 pressure tap locations, 139 of which matched those on a 3% wind tunnel model tested in the Transonic Dynamics Tunnel (TDT) facility at NASA Langley Research Center. Comparisons between computation and experiment showed agreement within 5% in terms of location for peak RMS levels, and 20% for frequency and magnitude of power spectral densities. Grid resolution and time step sensitivity studies were performed to identify methods for improved accuracy in comparisons to wind tunnel data. With limited computational resources, accurate trends for reduced vibratory loads on the vehicle were observed. Exploratory methods, such as determining minimized computed errors based on CFL number and sub-iterations, as well as evaluating the frequency content of the unsteady pressures and the oscillatory shock structures, were used in this study to enhance computational efficiency and solution accuracy. These techniques enabled development of a set of best practices for the evaluation of future flight vehicle designs in terms of vibratory loads.

  9. List-mode reconstruction for the Biograph mCT with physics modeling and event-by-event motion correction

    NASA Astrophysics Data System (ADS)

    Jin, Xiao; Chan, Chung; Mulnix, Tim; Panin, Vladimir; Casey, Michael E.; Liu, Chi; Carson, Richard E.

    2013-08-01

    Whole-body PET/CT scanners are important clinical and research tools to study tracer distribution throughout the body. In whole-body studies, respiratory motion results in image artifacts. We have previously demonstrated for brain imaging that, when provided with accurate motion data, event-by-event correction has better accuracy than frame-based methods. Therefore, the goal of this work was to develop a list-mode reconstruction with novel physics modeling for the Siemens Biograph mCT with event-by-event motion correction, based on the MOLAR platform (Motion-compensation OSEM List-mode Algorithm for Resolution-Recovery Reconstruction). Application of MOLAR for the mCT required two algorithmic developments. First, in routine studies, the mCT collects list-mode data in 32 bit packets, where averaging of lines-of-response (LORs) by axial span and angular mashing reduced the number of LORs so that 32 bits are sufficient to address all sinogram bins. This degrades spatial resolution. In this work, we proposed a probabilistic LOR (pLOR) position technique that addresses axial and transaxial LOR grouping in 32 bit data. Second, two simplified approaches for 3D time-of-flight (TOF) scatter estimation were developed to accelerate the computationally intensive calculation without compromising accuracy. The proposed list-mode reconstruction algorithm was compared to the manufacturer's point spread function + TOF (PSF+TOF) algorithm. Phantom, animal, and human studies demonstrated that MOLAR with pLOR gives slightly faster contrast recovery than the PSF+TOF algorithm that uses the average 32 bit LOR sinogram positioning. Moving phantom and a whole-body human study suggested that event-by-event motion correction reduces image blurring caused by respiratory motion. We conclude that list-mode reconstruction with pLOR positioning provides a platform to generate high quality images for the mCT, and to recover fine structures in whole-body PET scans through event-by-event motion

  10. Full On-Device Stay Points Detection in Smartphones for Location-Based Mobile Applications.

    PubMed

    Pérez-Torres, Rafael; Torres-Huitzil, César; Galeana-Zapién, Hiram

    2016-10-13

    The tracking of frequently visited places, also known as stay points, is a critical feature in location-aware mobile applications as a way to adapt the information and services provided to smartphone users according to their moving patterns. Location-based applications usually employ the GPS receiver along with Wi-Fi hot-spots and cellular cell tower mechanisms for estimating user location. Typically, fine-grained GPS location data are collected by the smartphone and transferred to dedicated servers for trajectory analysis and stay points detection. Such a Mobile Cloud Computing approach has been successfully employed for extending the smartphone's battery lifetime by offloading computation costs, assuming that on-device stay points detection is prohibitive. In this article, we propose and validate the feasibility of an alternative event-driven mechanism for stay points detection that is executed fully on-device, and that provides higher energy savings by avoiding communication costs. Our solution is encapsulated in a sensing middleware for Android smartphones, where a stream of GPS location updates is collected in the background, supporting duty cycling schemes, and incrementally analyzed following an event-driven paradigm for stay points detection. To evaluate the performance of the proposed middleware, real-world experiments were conducted under different stress levels, validating its power efficiency when compared against a Mobile Cloud Computing oriented solution.
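
    An illustrative, event-driven stay-point detector in the same spirit is sketched below; the distance and dwell-time thresholds, the anchor-based bookkeeping, and the class interface are assumptions rather than the middleware's actual code.

    ```python
    import math

    def haversine_m(p, q):
        """Great-circle distance in metres between two (lat, lon) points in degrees."""
        R = 6_371_000.0
        phi1, phi2 = math.radians(p[0]), math.radians(q[0])
        dphi, dlmb = math.radians(q[0] - p[0]), math.radians(q[1] - p[1])
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    class StayPointDetector:
        """Emit a stay point when the user remains within `radius_m` of an anchor
        fix for at least `min_dwell_s` seconds, processing one GPS fix at a time."""

        def __init__(self, radius_m=200.0, min_dwell_s=600.0):
            self.radius_m, self.min_dwell_s = radius_m, min_dwell_s
            self.anchor = None        # (lat, lon, timestamp) of the current candidate

        def on_fix(self, lat, lon, ts):
            """Process one fix; return (lat, lon, start, end) of a stay point or None."""
            if self.anchor is None:
                self.anchor = (lat, lon, ts)
                return None
            a_lat, a_lon, a_ts = self.anchor
            if haversine_m((a_lat, a_lon), (lat, lon)) <= self.radius_m:
                return None                      # still dwelling near the anchor
            stay = None
            if ts - a_ts >= self.min_dwell_s:    # left the region after a long dwell
                stay = (a_lat, a_lon, a_ts, ts)
            self.anchor = (lat, lon, ts)         # start a new candidate anchor
            return stay
    ```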

  11. Full On-Device Stay Points Detection in Smartphones for Location-Based Mobile Applications

    PubMed Central

    Pérez-Torres, Rafael; Torres-Huitzil, César; Galeana-Zapién, Hiram

    2016-01-01

    The tracking of frequently visited places, also known as stay points, is a critical feature in location-aware mobile applications as a way to adapt the information and services provided to smartphone users according to their moving patterns. Location-based applications usually employ the GPS receiver along with Wi-Fi hot-spots and cellular cell tower mechanisms for estimating user location. Typically, fine-grained GPS location data are collected by the smartphone and transferred to dedicated servers for trajectory analysis and stay points detection. Such a Mobile Cloud Computing approach has been successfully employed for extending the smartphone’s battery lifetime by offloading computation costs, assuming that on-device stay points detection is prohibitive. In this article, we propose and validate the feasibility of an alternative event-driven mechanism for stay points detection that is executed fully on-device, and that provides higher energy savings by avoiding communication costs. Our solution is encapsulated in a sensing middleware for Android smartphones, where a stream of GPS location updates is collected in the background, supporting duty cycling schemes, and incrementally analyzed following an event-driven paradigm for stay points detection. To evaluate the performance of the proposed middleware, real-world experiments were conducted under different stress levels, validating its power efficiency when compared against a Mobile Cloud Computing oriented solution. PMID:27754388

  12. Prediction of Exposure Level of Energetic Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, M. H. Y.; Blattnig, S.

    2016-12-01

    The potential for exposure to large solar particle events (SPEs) with fluxes that extend to high energies is a major concern during interplanetary transfer and extravehicular activities (EVAs) on the lunar and Martian surfaces. The sporadic occurrence of SPEs cannot be predicted accurately on near- or long-term scales, although the expected frequency of such events is strongly influenced by solar cycle activity. In the development of NASA's operational strategies, real-time estimation of exposure to SPEs has been considered so that adequate responses can be applied in a timely manner to reduce exposures to well below the exposure limits. Previously, the organ doses of large historical SPEs had been calculated by using the complete energy spectra of each event and then developing a prediction model for blood-forming organ (BFO) dose based solely on an assumed value of integrated fluence above 30 MeV (Φ30) for an otherwise unspecified future SPE. Because BFO dose is determined primarily by solar protons with high energies, it was reasoned that more accurate BFO dose prediction models could be developed using integrated fluence above 60 MeV (Φ60) and above 100 MeV (Φ100) as predictors instead of Φ30. In the current study, a re-analysis of major SPEs (in which the proton spectra of the ground level enhancement [GLE] events since 1956 are correctly described by Band functions) has been used to evaluate exposure levels. More accurate prediction models for BFO dose and NASA effective dose are then developed using integrated fluence above 200 MeV (Φ200), which by far has the most weight in the calculation of doses for deep-seated organs from exposure to extreme SPEs (GLEs or sub-GLEs). The unconditional probability of a BFO dose exceeding a pre-specified BFO dose limit is simultaneously calculated by taking into account the distribution of the predictor (Φ30, Φ60, Φ100, or Φ200) as estimated from historical SPEs. These results can be applied to the development of
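
    For reference, the Band function mentioned above is commonly parameterized for event-integrated proton fluence as a double power law in rigidity R joined smoothly through an exponential rollover; one standard form (quoted from general usage, not from this abstract) is:

    ```latex
    % Band-function fit to event-integrated proton fluence versus rigidity R
    \[
      J(>R) =
      \begin{cases}
        J_0\, R^{-\gamma_a} \exp\!\left(-R/R_0\right), & R \le (\gamma_b - \gamma_a) R_0,\\[4pt]
        J_0\, R^{-\gamma_b} \left[(\gamma_b - \gamma_a) R_0\right]^{\gamma_b - \gamma_a}
              \exp\!\left(\gamma_a - \gamma_b\right), & R > (\gamma_b - \gamma_a) R_0,
      \end{cases}
    \]
    % where J_0 is a normalization, \gamma_a and \gamma_b are the low- and
    % high-rigidity spectral indices, and R_0 sets the spectral break.
    ```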

  13. Visual tracking using neuromorphic asynchronous event-based cameras.

    PubMed

    Ni, Zhenjiang; Ieng, Sio-Hoi; Posch, Christoph; Régnier, Stéphane; Benosman, Ryad

    2015-04-01

    This letter presents a novel, computationally efficient and robust pattern tracking method based on time-encoded, frame-free visual data. Recent interdisciplinary developments, combining inputs from engineering and biology, have yielded a novel type of camera that encodes visual information into a continuous stream of asynchronous, temporal events. These events encode temporal contrast and intensity locally in space and time. We show that this sparse yet accurately timed information is well suited as a computational input for object tracking. In this letter, visual data processing is performed for each incoming event at the time it arrives. The method provides a continuous and iterative estimation of the geometric transformation between the model and the events representing the tracked object. It can handle isometries, similarities, and affine distortions and allows for unprecedented real-time performance at equivalent frame rates in the kilohertz range on a standard PC. Furthermore, by using the dimension of time, which is currently underexploited by most artificial vision systems, the method we present is able to solve ambiguous cases of object occlusion that classical frame-based techniques handle poorly.
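
    A deliberately simplified, per-event tracking loop is sketched below to illustrate event-by-event processing; it updates only a translational estimate with a gated exponential step, whereas the letter's method estimates full isometric, similarity, and affine transformations, which is not attempted here.

    ```python
    class EventTracker:
        """Track an object's centre from a stream of (x, y, t) events, one event at a time."""

        def __init__(self, x0, y0, gate_px=20.0, gain=0.05):
            self.x, self.y = float(x0), float(y0)
            self.gate_px = gate_px   # ignore events too far from the current estimate
            self.gain = gain         # per-event update weight

        def on_event(self, x, y, t):
            """Process one event; return the updated (x, y) estimate."""
            dx, dy = x - self.x, y - self.y
            if dx * dx + dy * dy > self.gate_px ** 2:
                return self.x, self.y            # likely background or another object
            # Incremental update: nudge the estimate toward each accepted event.
            self.x += self.gain * dx
            self.y += self.gain * dy
            return self.x, self.y

    # Toy usage: events drifting to the right pull the estimate along with them.
    tracker = EventTracker(100.0, 100.0)
    for i in range(50):
        tracker.on_event(100.0 + 0.2 * i, 100.0, t=i * 1e-3)
    print(tracker.x, tracker.y)
    ```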

  14. Defining Extreme Events: A Cross-Disciplinary Review

    NASA Astrophysics Data System (ADS)

    McPhillips, Lauren E.; Chang, Heejun; Chester, Mikhail V.; Depietri, Yaella; Friedman, Erin; Grimm, Nancy B.; Kominoski, John S.; McPhearson, Timon; Méndez-Lázaro, Pablo; Rosi, Emma J.; Shafiei Shiva, Javad

    2018-03-01

    Extreme events are of interest worldwide given their potential for substantial impacts on social, ecological, and technical systems. Many climate-related extreme events are increasing in frequency and/or magnitude due to anthropogenic climate change, and there is increased potential for impacts due to the location of urbanization and the expansion of urban centers and infrastructures. Many disciplines are engaged in research and management of these events. However, a lack of coherence exists in what constitutes and defines an extreme event across these fields, which impedes our ability to holistically understand and manage these events. Here, we review 10 years of academic literature and use text analysis to elucidate how six major disciplines—climatology, earth sciences, ecology, engineering, hydrology, and social sciences—define and communicate extreme events. Our results highlight critical disciplinary differences in the language used to communicate extreme events. Additionally, we found a wide range in definitions and thresholds, with more than half of examined papers not providing an explicit definition, and disagreement over whether impacts are included in the definition. We urge distinction between extreme events and their impacts, so that we can better assess when responses to extreme events have actually enhanced resilience. Additionally, we suggest that all researchers and managers of extreme events be more explicit in their definition of such events as well as be more cognizant of how they are communicating extreme events. We believe clearer and more consistent definitions and communication can support transdisciplinary understanding and management of extreme events.

  15. Effects of event valence on long-term memory for two baseball championship games.

    PubMed

    Breslin, Carolyn W; Safer, Martin A

    2011-11-01

    We investigated how event valence affected accuracy and vividness of long-term memory for two comparable public events. In 2008, 1,563 fans answered questions about objective details concerning two decisive baseball championship games between the Yankees (2003 winners) and the Red Sox (2004 winners). Both between- and within-groups analyses indicated that fans remembered the game their team won significantly more accurately than the game their team lost. Fans also reported more vividness and more rehearsal for the game their team won. We conclude that individuals rehearse positive events more than comparable negative events, and that this additional rehearsal increases both vividness and accuracy of memories about positive events. Our results differ from those of prior studies involving memories for negative events that may have been unavoidably rehearsed; such rehearsal may have kept those memories from fading. Long-term memory for an event is determined not only by the valence of the event, but also by experiences after the event.

  16. Transfer of location-specific control to untrained locations.

    PubMed

    Weidler, Blaire J; Bugg, Julie M

    2016-11-01

    Recent research highlights a seemingly flexible and automatic form of cognitive control that is triggered by potent contextual cues, as exemplified by the location-specific proportion congruence effect: reduced compatibility effects in locations associated with a high as compared to low likelihood of conflict. We investigated just how flexible location-specific control is by examining whether novel locations effectively cue control for congruency-unbiased stimuli. In two experiments, biased (mostly compatible or mostly incompatible) training stimuli appeared in distinct locations. During a final block, unbiased (50% compatible) stimuli appeared in novel untrained locations spatially linked to biased locations. The flanker compatibility effect was reduced for unbiased stimuli in novel locations linked to a mostly incompatible compared to a mostly compatible location, indicating transfer. Transfer was observed when stimuli appeared along a linear function (Experiment 1) or in rings of a bullseye (Experiment 2). The novel transfer effects imply that location-specific control is more flexible than previously reported and further counter the complex stimulus-response learning account of location-specific proportion congruence effects. We propose that the representation and retrieval of control settings in untrained locations may depend on environmental support and the presentation of stimuli in novel locations that fall within the same categories of space as trained locations.

  17. Estimation of distributed Fermat-point location for wireless sensor networking.

    PubMed

    Huang, Po-Hsian; Chen, Jiann-Liang; Larosa, Yanuarius Teofilus; Chiang, Tsui-Lien

    2011-01-01

    This work presents a localization scheme for use in wireless sensor networks (WSNs) that is based on a proposed connectivity-based RF localization strategy called the distributed Fermat-point location estimation algorithm (DFPLE). DFPLE estimates location using the triangle formed by the intersections of three neighboring beacon nodes. The Fermat point is determined as the point with the shortest total path to the three vertices of the triangle. The estimated location area is then refined using the Fermat point to achieve minimum error in estimating sensor node locations. DFPLE solves the problems of large errors and poor performance encountered by localization schemes that are based on a bounding box algorithm. Performance analysis of a 200-node development environment reveals that, when the number of sensor nodes is below 150, the mean error decreases rapidly as the node density increases, and when the number of sensor nodes exceeds 170, the mean error remains below 1% as the node density increases. Second, when the number of beacon nodes is less than 60, normal nodes lack sufficient beacon nodes to enable their locations to be estimated. However, the mean error changes only slightly as the number of beacon nodes increases above 60. Simulation results revealed that the proposed algorithm for estimating sensor positions is more accurate than existing algorithms and improves upon conventional bounding box strategies.
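
    The Fermat point of the beacon triangle (the point minimizing the summed distance to the three vertices) can be computed numerically, for example with Weiszfeld's iteration as sketched below; this is a generic illustration, not the DFPLE implementation.

    ```python
    import math

    def fermat_point(beacons, iters=100, eps=1e-9):
        """Geometric median (Fermat point) of beacon positions via Weiszfeld's algorithm.

        beacons : list of (x, y) coordinates of the three neighbouring beacon nodes.
        Returns the (x, y) point minimizing the summed distance to all beacons.
        """
        # Start from the centroid of the triangle.
        x = sum(b[0] for b in beacons) / len(beacons)
        y = sum(b[1] for b in beacons) / len(beacons)
        for _ in range(iters):
            num_x = num_y = denom = 0.0
            for bx, by in beacons:
                d = math.hypot(x - bx, y - by)
                if d < eps:              # iterate coincides with a beacon; stop to
                    return bx, by        # avoid division by zero
                num_x += bx / d
                num_y += by / d
                denom += 1.0 / d
            x, y = num_x / denom, num_y / denom
        return x, y

    # Toy usage with three beacon nodes.
    print(fermat_point([(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]))
    ```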

  18. Observation of Long Ionospheric Recoveries from Lightning-induced Electron Precipitation Events

    NASA Astrophysics Data System (ADS)

    Mohammadpour Salut, M.; Cohen, M.

    2015-12-01

    Lightning strokes induce nighttime disturbances in the lower ionosphere which can be detected through Very Low Frequency (VLF) remote sensing via at least two means: (1) direct heating and ionization, known as an Early event, and (2) triggered precipitation of energetic electrons from the radiation belts, known as Lightning-induced Electron Precipitation (LEP). For each, the ionospheric recovery time is typically a few minutes or less. A small class of Early events has been identified as having unusually long ionospheric recoveries (tens of minutes), with the underlying mechanism still in question. Our study shows for the first time that some LEP events also demonstrate unusually long recovery. The VLF events were detected by visual inspection of the recorded data in both the North-South and East-West magnetic fields. Data from the National Lightning Detection Network (NLDN) are used to determine the location and peak current of the lightning responsible for each lightning-associated VLF perturbation. LEP and Early VLF events are distinguished by measuring the time delay between the causative lightning discharges and the onset of all lightning-associated perturbations. LEP events typically possess an onset delay greater than ~200 msec following the causative lightning discharges, while the onset of Early VLF events is time-aligned (<20 msec) with the lightning return stroke. Nonducted LEP events are distinguished from ducted events based on the location of the causative lightning relative to the precipitation region. From 15 March to 20 April and 15 October to 15 November 2011, a total of 385 LEP events were observed at Indiana, Montana, Colorado and Oklahoma VLF sites, on the NAA, NLK and NML transmitter signals. 46 of these events exhibited a long recovery. It has been found that the occurrence rate of ducted long recovery LEP events is higher than that of nonducted events. Of the 46 long recovery LEP events, 33 events were induced by ducted whistlers, and 13 events were associated with

  19. Lost in translation: preclinical studies on 3,4-methylenedioxymethamphetamine provide information on mechanisms of action, but do not allow accurate prediction of adverse events in humans

    PubMed Central

    Green, AR; King, MV; Shortall, SE; Fone, KCF

    2012-01-01

    3,4-Methylenedioxymethamphetamine (MDMA) induces both acute adverse effects and long-term neurotoxic loss of brain 5-HT neurones in laboratory animals. However, when choosing doses, most preclinical studies have paid little attention to the pharmacokinetics of the drug in humans or animals. The recreational use of MDMA and current clinical investigations of the drug for therapeutic purposes demand better translational pharmacology to allow accurate risk assessment of its ability to induce adverse events. Recent pharmacokinetic studies on MDMA in animals and humans are reviewed and indicate that the risks following MDMA ingestion should be re-evaluated. Acute behavioural and body temperature changes result from rapid MDMA-induced monoamine release, whereas long-term neurotoxicity is primarily caused by metabolites of the drug. Therefore acute physiological changes in humans are fairly accurately mimicked in animals by appropriate dosing, although allometric dosing calculations have little value. Long-term changes require MDMA to be metabolized in a similar manner in experimental animals and humans. However, the rate of metabolism of MDMA and its major metabolites is slower in humans than in rats or monkeys, potentially allowing endogenous neuroprotective mechanisms to function in a species-specific manner. Furthermore, acute hyperthermia in humans probably limits the chance of recreational users ingesting sufficient MDMA to produce neurotoxicity, unlike in the rat. MDMA also inhibits the major enzyme responsible for its metabolism in humans, thereby also assisting in preventing neurotoxicity. These observations question whether MDMA alone produces long-term 5-HT neurotoxicity in human brain, although when taken in combination with other recreational drugs it may induce neurotoxicity. LINKED ARTICLES This article is commented on by Parrott, pp. 1518–1520 of this issue. To view this commentary visit http://dx.doi.org/10.1111/j.1476-5381.2012.01941.x and to view the

  20. An emergency medical planning guide for commercial spaceflight events.

    PubMed

    Law, Jennifer; Vanderploeg, James

    2012-09-01

    Commercial spaceflight events transporting paying passengers into space will begin to take place at various spaceports around the country within the next few years. Many spaceports are located in remote areas that are far from major hospitals and trauma centers. Spaceport medical directors should develop emergency medical plans (EMPs) to prepare for potential medical contingencies that may occur during commercial spaceflight events. The aim of this article is to guide spaceport medical directors in emergency medical planning for commercial spaceflight events. This guide is based on our experience and a recently developed EMP for Spaceport America which incorporated a literature review of mass gathering medicine, existing planning guides for mass gathering events, and EMPs for analogous aerospace events. We propose a multipronged approach to emergency medical planning, consisting of event planning, medical reconnaissance, medical personnel, protocols, physical facility and hardware, and documentation. Medical directors should use this guide to develop an emergency medical plan tailored to the resources and constraints specific to their events.