Sample records for earthquake numbers consequence

  1. Statistical distributions of earthquake numbers: consequence of branching process

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2010-03-01

    We discuss various statistical distributions of earthquake numbers. Previously, we derived several discrete distributions to describe earthquake numbers for the branching model of earthquake occurrence: the Poisson, geometric, logarithmic, and negative binomial (NBD) distributions. The theoretical model is the 'birth and immigration' population process. The first three distributions above can be considered special cases of the NBD. In particular, a point branching process along the magnitude (or log seismic moment) axis with independent events (immigrants) explains the magnitude/moment-frequency relation and the NBD of earthquake counts in large time/space windows, as well as the dependence of the NBD parameters on the magnitude threshold (the completeness magnitude of an earthquake catalogue). We discuss applying these distributions, especially the NBD, to approximate event numbers in earthquake catalogues. There are many different representations of the NBD. Most can be traced either to the Pascal distribution or to the mixture of the Poisson distribution with the gamma law. We discuss the advantages and drawbacks of both representations for statistical analysis of earthquake catalogues. We also consider applying the NBD to earthquake forecasts and describe the limits of application of the given equations. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrence, the NBD has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We determine the parameter values and their uncertainties for several local and global catalogues, and for their subdivisions in various time intervals, magnitude thresholds, spatial windows, and tectonic categories. The theoretical model of how the clustering parameter depends on the corner (maximum) magnitude can be used to predict the future earthquake number distribution in regions where very large earthquakes have not yet occurred.
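    As a quick illustration of the Poisson-gamma (mixed Poisson) representation mentioned above, the sketch below simulates overdispersed window counts and recovers NBD parameters by the method of moments. This is a hedged, illustrative example, not the author's code; the gamma shape and scale values are arbitrary assumptions.

```python
# Hedged sketch (not the author's code): the NBD as a Poisson-gamma mixture,
# plus a method-of-moments fit of the NBD parameters to simulated counts.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate overdispersed counts: Poisson rates drawn from a gamma law
# (assumed shape tau and scale theta; marginally the counts are NBD).
tau, theta = 2.0, 15.0
rates = rng.gamma(shape=tau, scale=theta, size=1000)
counts = rng.poisson(rates)

# Method-of-moments estimates of the NBD parameters from the counts.
m, v = counts.mean(), counts.var(ddof=1)
p_hat = m / v                        # success probability; v > m signals overdispersion
tau_hat = m * p_hat / (1.0 - p_hat)  # size (clustering) parameter

# Compare the fitted NBD pmf with the empirical frequencies at a few counts.
k = np.arange(5)
print("fitted pmf:", stats.nbinom.pmf(k, tau_hat, p_hat).round(4))
print("empirical :", np.array([(counts == ki).mean() for ki in k]).round(4))
```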

  2. Earthquake number forecasts testing

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
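    A minimal sketch of the moment comparison described above (illustrative only, not the authors' implementation): it contrasts the empirical skewness and excess kurtosis of per-window earthquake counts with the values implied by fitted Poisson and negative-binomial models; the synthetic counts stand in for a real catalogue.

```python
# Illustrative sketch: compare empirical skewness/kurtosis of per-window counts
# with the values implied by fitted Poisson and negative-binomial (NBD) models.
import numpy as np
from scipy import stats

def moment_comparison(counts):
    counts = np.asarray(counts)
    m, v = counts.mean(), counts.var(ddof=1)

    # Empirical higher moments (excess kurtosis, as scipy reports it).
    emp = stats.skew(counts), stats.kurtosis(counts)

    # Poisson fit: a single parameter, the mean rate.
    pois = stats.poisson.stats(m, moments="sk")

    # NBD fit by the method of moments (requires overdispersion, v > m).
    p = m / v
    n = m * p / (1.0 - p)
    nbd = stats.nbinom.stats(n, p, moments="sk")
    return {"empirical": emp, "poisson": pois, "nbd": nbd}

# Toy usage with synthetic overdispersed counts standing in for a catalogue.
rng = np.random.default_rng(1)
toy = rng.poisson(rng.gamma(3.0, 10.0, size=500))
print(moment_comparison(toy))
```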

  3. Extreme Magnitude Earthquakes and their Economical Consequences

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Perea, N.; Emerson, D.; Salazar, A.; Moulinec, C.

    2011-12-01

    The frequency of occurrence of extreme magnitude earthquakes varies from tens to thousands of years, depending on the seismotectonic region of the world considered. However, the human and economic losses when their hypocenters are located in the neighborhood of heavily populated and/or industrialized regions can be very large, as recently observed for the 1985 Mw 8.01 Michoacan, Mexico, and the 2011 Mw 9 Tohoku, Japan, earthquakes. Here, a methodology is proposed to estimate the probability of exceedance of the intensities of extreme magnitude earthquakes (PEI) and of their direct economic consequences (PEDEC). The PEI are obtained by using supercomputing facilities to generate samples of the 3D propagation of plausible extreme earthquake scenarios and by enlarging those samples through Monte Carlo simulation. The PEDEC are computed by combining appropriate vulnerability functions with the scenario intensity samples, again using Monte Carlo simulation. An example of the application of the methodology to the potential occurrence of extreme Mw 8.5 subduction earthquakes affecting Mexico City is presented.
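    The exceedance-probability step of such a methodology can be sketched as follows. This is an assumed, simplified stand-in: the lognormal intensity samples, the vulnerability-function parameters, and the exposed value are all illustrative, not values from the study.

```python
# Assumed, simplified sketch: Monte Carlo exceedance probabilities for scenario
# intensities (PEI) and for direct economic losses via a vulnerability function
# (PEDEC). Intensity model, vulnerability parameters and exposure are illustrative.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Stand-in for the scenario intensity samples (e.g., PGA in g) that the study
# obtains from 3D wave-propagation runs enlarged by Monte Carlo simulation.
intensities = rng.lognormal(mean=np.log(0.3), sigma=0.5, size=100_000)

def vulnerability(pga, theta=0.6, beta=0.7):
    """Assumed lognormal vulnerability: mean damage ratio as a function of PGA."""
    return norm.cdf(np.log(pga / theta) / beta)

exposure = 5e9                          # assumed exposed value, arbitrary monetary units
losses = exposure * vulnerability(intensities)

def exceedance(samples, thresholds):
    """P(X >= x) estimated from Monte Carlo samples."""
    return [(samples >= x).mean() for x in thresholds]

print("PEI  :", exceedance(intensities, [0.2, 0.4, 0.8]))
print("PEDEC:", exceedance(losses, [1e9, 2e9, 4e9]))
```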

  4. Rapid estimation of the economic consequences of global earthquakes

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensities. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system, it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is

  5. Economic consequences of earthquakes: bridging research and practice with HayWired

    NASA Astrophysics Data System (ADS)

    Wein, A. M.; Kroll, C.

    2016-12-01

    The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

  6. Knowledge base about earthquakes as a tool to minimize strong events consequences

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of a knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used to calibrate near real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings, and casualty estimates. Such calibration compensates for some of the factors that influence the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains descriptions of the consequences of past earthquakes for the area under study. It also includes the distribution of the built environment and population at the time of event occurrence. Computer simulation of the events recorded in the knowledge base allows sets of regional calibration coefficients to be determined, including ratings of seismological surveys, peculiarities of shaking intensity attenuation, and changes in building stock and population distribution, in order to minimize the error of loss estimates for damaging earthquakes in "emergency" mode. References: 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS 2010 Conference, Beijing, China, 2010. 3. Frolova, N. I., Larionov, V. I., Bonnin, J., Sushchev, S. P., Ugarov, A. N., Kozlov, M. A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653

  7. The Long-Run Socio-Economic Consequences of a Large Disaster: The 1995 Earthquake in Kobe.

    PubMed

    duPont, William; Noy, Ilan; Okuyama, Yoko; Sawada, Yasuyuki

    2015-01-01

    We quantify the 'permanent' socio-economic impacts of the Great Hanshin-Awaji (Kobe) earthquake in 1995 by employing a large-scale panel dataset of 1,719 cities, towns, and wards from Japan over three decades. In order to estimate the counterfactual--i.e., the Kobe economy without the earthquake--we use the synthetic control method. Three important empirical patterns emerge: First, the population size and especially the average income level in Kobe have been lower than the counterfactual level without the earthquake for over fifteen years, indicating a permanent negative effect of the earthquake. Such a negative impact can be found especially in the central areas which are closer to the epicenter. Second, the surrounding areas experienced some positive permanent impacts in spite of short-run negative effects of the earthquake. Much of this is associated with movement of people to East Kobe, and consequent movement of jobs to the metropolitan center of Osaka, that is located immediately to the East of Kobe. Third, the furthest areas in the vicinity of Kobe seem to have been insulated from the large direct and indirect impacts of the earthquake.
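    The counterfactual construction described above can be sketched with a bare-bones synthetic control: donor weights are chosen to reproduce the treated unit's pre-event outcome path, and the weighted donor average after the event serves as the counterfactual. This is an illustrative sketch with toy data and assumed variable shapes, not the paper's code.

```python
# Illustrative synthetic-control sketch (assumed setup, not the paper's code).
import numpy as np
from scipy.optimize import minimize

def synthetic_control(y_treated_pre, Y_donors_pre, Y_donors_post):
    """y_treated_pre: (T_pre,); Y_donors_pre: (T_pre, J); Y_donors_post: (T_post, J)."""
    J = Y_donors_pre.shape[1]

    def loss(w):
        return np.sum((y_treated_pre - Y_donors_pre @ w) ** 2)

    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * J
    w0 = np.full(J, 1.0 / J)
    w = minimize(loss, w0, bounds=bounds, constraints=cons).x
    return Y_donors_post @ w, w        # counterfactual path and donor weights

# Toy usage with random data standing in for city-level income panels.
rng = np.random.default_rng(3)
pre, post, J = 15, 15, 20
Y_pre, Y_post = rng.normal(100, 5, (pre, J)), rng.normal(100, 5, (post, J))
y_kobe_pre = Y_pre[:, :3].mean(axis=1) + rng.normal(0, 1, pre)
counterfactual, weights = synthetic_control(y_kobe_pre, Y_pre, Y_post)
print(counterfactual[:3].round(1), weights.round(2))
```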

  8. The Long-Run Socio-Economic Consequences of a Large Disaster: The 1995 Earthquake in Kobe

    PubMed Central

    duPont IV, William; Noy, Ilan; Okuyama, Yoko; Sawada, Yasuyuki

    2015-01-01

    We quantify the ‘permanent’ socio-economic impacts of the Great Hanshin-Awaji (Kobe) earthquake in 1995 by employing a large-scale panel dataset of 1,719 cities, towns, and wards from Japan over three decades. In order to estimate the counterfactual—i.e., the Kobe economy without the earthquake—we use the synthetic control method. Three important empirical patterns emerge: First, the population size and especially the average income level in Kobe have been lower than the counterfactual level without the earthquake for over fifteen years, indicating a permanent negative effect of the earthquake. Such a negative impact can be found especially in the central areas which are closer to the epicenter. Second, the surrounding areas experienced some positive permanent impacts in spite of short-run negative effects of the earthquake. Much of this is associated with movement of people to East Kobe, and consequent movement of jobs to the metropolitan center of Osaka, that is located immediately to the East of Kobe. Third, the furthest areas in the vicinity of Kobe seem to have been insulated from the large direct and indirect impacts of the earthquake. PMID:26426998

  9. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  10. What caused a large number of fatalities in the Tohoku earthquake?

    NASA Astrophysics Data System (ADS)

    Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.

    2012-04-01

    The Mw 9.0 earthquake left about 20,000 people dead or missing in northeastern Japan. In the 115 years prior to this event, three historical tsunamis struck the region, one of which, a "tsunami earthquake", resulted in a death toll of 22,000. Since then, numerous breakwaters have been constructed along the entire northeastern coast, tsunami evacuation drills have been carried out, and hazard maps have been distributed to local residents in numerous communities. Despite these constructions and preparedness efforts, the March 11 Tohoku earthquake caused numerous fatalities. The strong shaking lasted three minutes or longer, so all residents recognized it as the strongest and longest earthquake they had ever experienced. The tsunami inundated an enormous area of about 560 km2 across 35 cities along the coast of northeast Japan. To find out the reasons behind the high number of fatalities due to the March 11 tsunami, we interviewed 150 tsunami survivors at public evacuation shelters in 7 cities, mainly in Iwate prefecture, in mid-April and early June 2011. Interviews lasted about 30 minutes or longer and focused on the survivors' evacuation behavior and on the behavior they had observed. On the basis of the interviews, we found that residents' decisions not to evacuate immediately were partly due to, or influenced by, earthquake science results. Below are some of the factors that affected residents' decisions. 1. Earthquake hazard assessments turned out to be incorrect. Expected earthquake magnitudes and resultant hazards in northeastern Japan assessed and publicized by the government were significantly smaller than the actual Tohoku earthquake. 2. Many residents did not receive accurate tsunami warnings. The initial tsunami warnings were far smaller than the actual tsunami heights. 3. Previous frequent warnings with overestimated tsunami heights had influenced the behavior of the residents. 4. Many local residents above 55 years old experienced

  11. [Earthquakes--a historical review, environmental and health effects, and health care measures].

    PubMed

    Nola, Iskra Alexandra; Doko Jelinić, Jagoda; Žuškin, Eugenija; Kratohvil, Mladen

    2013-06-01

    Earthquakes are natural disasters that can occur at any time, regardless of the location. Their frequency is higher in the Circum-Pacific and Mediterranean/Trans-Asian seismic belts. A number of sophisticated methods define their magnitude using the Richter scale and their intensity using the Mercalli-Cancani-Sieberg scale. Recorded data show a number of devastating earthquakes that have killed many people and changed the environment dramatically. Croatia is located in a seismically active area that has endured a series of historical earthquakes, among which several occurred in the Zagreb area. The consequences of an earthquake depend mostly on the population density and the seismic resistance of buildings in the affected area. Environmental consequences often include air, water, and soil pollution. The effects of this kind of pollution can have long-term health effects. The most dramatic health consequences result from the demolition of buildings. Therefore, quick and efficient aid depends on well-organized health professionals as well as on the readiness of the civil defence, fire department, and Mountain Rescue Service members. Good coordination among these services can save many lives. Public health interventions must include effective control measures in the environment as secondary prevention methods for health problems caused by unfavourable environmental factors. The identification and control of long-term hazards can reduce chronic health effects. The reduction of earthquake-induced damage includes setting priorities in building seismically safe buildings.

  12. Loss Estimations due to Earthquakes and Secondary Technological Hazards

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2009-04-01

    Expected loss and damage assessments due to natural and technological disasters are of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive measure plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. The mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire and chemical hazardous facilities are considered; these models are used in geographical information systems designed for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of estimating the consequences of scenario earthquakes and of individual seismic risk assessment, taking into account secondary technological hazards at regional and urban levels. The individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants of the settlement considered and the probability of the natural and/or technological disaster.
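    One way to formalize this definition of individual risk (an assumed reading, not an equation taken from the paper) is as the expected annual number of fatalities normalized by the exposed population, summed over hazardous events k with annual occurrence probabilities P_k:

```latex
% Assumed formalization of the individual risk described above.
R_{\mathrm{ind}} = \frac{\mathbb{E}[N_{\mathrm{fatalities}}]}{N_{\mathrm{inhabitants}}}
                 = \frac{1}{N_{\mathrm{inhabitants}}} \sum_{k} P_k \,
                   \mathbb{E}\!\left[N_{\mathrm{fatalities}} \mid \text{event } k\right]
```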

  13. Earthquakes; January-February 1982

    USGS Publications Warehouse

    Person, W.J.

    1982-01-01

    In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine. 

  14. Organizational changes at Earthquakes & Volcanoes

    USGS Publications Warehouse

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  15. PAGER-CAT: A composite earthquake catalog for calibrating global fatality models

    USGS Publications Warehouse

    Allen, T.I.; Marano, K.D.; Earle, P.S.; Wald, D.J.

    2009-01-01

    highly uncertain, particularly the casualty numbers, which must be regarded as estimates rather than firm numbers for many earthquakes. Consequently, we encourage contributions from the seismology and earthquake engineering communities to further improve this resource via the Wikipedia page and personal communications, for the benefit of the whole community.

  16. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and to the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say M_λ ≥ 4, and one small, say M_σ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes. The cumulative probability distribution of these values is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results are applicable even if the level of induced seismicity varies in time. The application of natural time to the accumulation of the seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of M_σ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of M_λ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016 and the subsequent reduction in seismicity associated with a reduction in fluid injection. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
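    A hedged sketch of the earthquake potential score described above (assumptions, not the authors' implementation): count small events between successive large events, then place the current count on the empirical cumulative distribution of those interevent counts. The synthetic Gutenberg-Richter catalog below is only a stand-in for a real one.

```python
# Hedged sketch of the nowcasting earthquake potential score (EPS).
import numpy as np

def earthquake_potential_score(magnitudes, m_small=2.75, m_large=4.0):
    """magnitudes: catalog magnitudes in chronological order."""
    counts, n_small = [], 0
    for m in magnitudes:
        if m >= m_large:
            counts.append(n_small)   # interevent count closed by a large event
            n_small = 0
        elif m >= m_small:
            n_small += 1
    counts = np.array(counts)
    current = n_small                # small events since the last large event
    # EPS: where the current count falls on the empirical CDF of interevent counts.
    return (counts <= current).mean(), counts

# Toy usage with a synthetic catalog following Gutenberg-Richter scaling (b = 1).
rng = np.random.default_rng(4)
mags = 2.0 + rng.exponential(1.0 / np.log(10), size=20_000)
eps, interevent_counts = earthquake_potential_score(mags)
print(f"EPS ~ {eps:.2f} from {len(interevent_counts)} large-event intervals")
```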

  17. Earthquakes; January-February, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The first major earthquake (magnitude 7.0 to 7.9) of the year struck in southeastern Alaska in a sparsely populated area on February 28. On January 16, Iran experienced the first destructive earthquake of the year, causing a number of casualties and considerable damage. Peru was hit by a destructive earthquake on February 16 that left casualties and damage. A number of earthquakes were experienced in parts of the United States, but only minor damage was reported.

  18. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and from the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in roughly the same manner, although the number of casualties in the first scenario (inland dislocation) is about twice that of the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low risk. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, with 73 % of the expected damage concentrated there. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risk (building damage) is concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  19. Earthquakes, November-December 1973

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria. 

  20. Analog earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  1. Earthquake Loss Scenarios in the Himalayas

    NASA Astrophysics Data System (ADS)

    Wyss, M.; Gupta, S.; Rosset, P.; Chamlagain, D.

    2017-12-01

    We estimate quantitatively that in repeats of the 1555 and 1505 great Himalayan earthquakes the fatalities may range from 51K to 549K, the injured from 157K to 1,700K, and the strongly affected population (Intensity ≥ VI) from 15 to 75 million, depending on the details of the assumed earthquake parameters. For up-dip ruptures in the stressed segments of the M7.8 Gorkha 2015, the M7.9 Subansiri 1947 and the M7.8 Kangra 1905 earthquakes, we estimate 62K, 100K and 200K fatalities, respectively. We estimate the numbers of strongly affected people as 8, 12, and 33 million in these cases, respectively. These loss calculations are based on verifications of the QLARM algorithms and data set in the cases of the M7.8 Gorkha 2015, the M7.8 Kashmir 2005, the M6.6 Chamoli 1999, the M6.8 Uttarkashi 1991 and the M7.8 Kangra 1905 earthquakes. The requirement of verification that was fulfilled in these test cases was that the reported intensity field and the fatality count had to match approximately, using the known parameters of the earthquakes. The apparent attenuation factor was a free parameter and ranged within acceptable values. Numbers for population were adjusted for the years in question from the latest census. The hour of day was assumed to be at night with maximum occupation. The assumption that the upper half of the Main Frontal Thrust (MFT) will rupture in companion earthquakes to historic earthquakes in the down-dip half is based on the observations of several meters of displacement in trenches across the MFT outcrop. Among mitigation measures, awareness with training and adherence to construction codes rank highest. Retrofitting of schools and hospitals would save lives and prevent injuries. Preparation plans for helping millions of strongly affected people should be put in place. These mitigation efforts should focus on an approximately 7 km wide strip along the MFT on the up-thrown side because the strong motions are likely to be doubled. We emphasize that our estimates

  2. Economic Effects of 1978 Tabas Earthquake (Iran).

    PubMed

    Zandian, Elham; Rimaz, Shahnaz; Holakouie Naieni, Kourosh; Nedjat, Saharnaz; Naderimagham, Shohreh; Larijani, Bagher; Farzadfar, Farshad

    2016-06-01

    Natural disasters are one of the most important adverse health events. The earthquake that struck the city of Tabas in 1978 ranked third in terms of the number of deaths caused by natural disasters over the past 100 years in Iran. This study aimed to evaluate the economic and human capital consequences of the earthquake in the Tabas district. We used a two percent random sample of the 2006 Iran Census dataset to run a difference-in-difference study. The difference-in-difference methodology was used to evaluate (1) the mean changes in variables including years of schooling and wealth, and (2) the changes in the odds of primary school completion and literacy for people born (5 or 10 years) post-event versus (5 or 10 years) pre-event in Tabas, compared with the same values for those born in the same period in the control districts. The differential increase in years of schooling for being born 10 years after the earthquake versus 10 years before it in Tabas was one-third of a school year less than in the control districts. There were 89.5% and 65.4% decreases in the odds that an individual is literate, and average decreases of 0.26 and 0.104 in the SES index, for those born in Tabas within the 5- and 10-year periods, respectively, compared with the control districts. The Tabas earthquake had negative long-term effects on human capital and wealth. This study can help official authorities to promote educational and economic plans and to implement comprehensive reforms in earthquake-stricken areas.
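    A minimal sketch of a difference-in-difference estimate (assumed variable names and synthetic data, not the study's code): the treatment effect is the coefficient on the interaction of a treated-district indicator with a born-after-the-event indicator.

```python
# Difference-in-differences via an interaction term in OLS (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # Tabas vs control districts (assumed coding)
    "post": rng.integers(0, 2, n),      # born after vs before the event
})
# Synthetic outcome: years of schooling with an assumed negative DiD effect.
df["schooling"] = (8 + 1.5 * df["post"] + 0.5 * df["treated"]
                   - 0.33 * df["post"] * df["treated"] + rng.normal(0, 2, n))

model = smf.ols("schooling ~ treated * post", data=df).fit()
print(model.params["treated:post"])     # the difference-in-differences estimate
```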

  3. A classifying method analysis on the number of returns for given pulse of post-earthquake airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Wang, Jinxia; Dou, Aixia; Wang, Xiaoqing; Huang, Shusong; Yuan, Xiaoxiang

    2016-11-01

    Compared with remote sensing imagery, post-earthquake airborne Light Detection And Ranging (LiDAR) point cloud data contain high-precision three-dimensional information on earthquake damage, which can improve the accuracy of identifying destroyed buildings. However, damaged buildings show so many different characteristics after an earthquake that the most commonly used pre-processing methods cannot currently distinguish between tree points and damaged-building points. In this study, we analyse the number of returns per pulse for tree and damaged-building point clouds and explore methods to distinguish between them. We propose a new method that searches a neighbourhood of each point and calculates the ratio (R) of neighbourhood points whose number of returns per pulse is greater than 1, in order to separate trees from buildings. We select point clouds of typical undamaged buildings, collapsed buildings, and trees as samples from airborne LiDAR data acquired after the 2010 Mw 7.0 Haiti earthquake, by means of human-computer interaction. We determine the R-value threshold that distinguishes trees from buildings and apply it to test areas. The experimental results show that the proposed method can effectively distinguish building points (undamaged and damaged) from tree points, but it is limited in areas where buildings are varied, damage is complex, and trees are dense, so the method will need further improvement.
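    A hedged sketch of the ratio described above (assumed data layout and threshold, not the paper's code): for each LiDAR point, compute the fraction R of neighbours with more than one return per pulse, then label high-R points as vegetation.

```python
# Multi-return neighbourhood ratio R for separating tree points from buildings.
import numpy as np
from scipy.spatial import cKDTree

def multi_return_ratio(xyz, num_returns, k=30):
    """xyz: (N, 3) coordinates; num_returns: (N,) returns per pulse."""
    tree = cKDTree(xyz)
    _, idx = tree.query(xyz, k=k)               # indices of k nearest neighbours
    return (num_returns[idx] > 1).mean(axis=1)  # fraction of multi-return neighbours

# Toy usage: synthetic "building" points (single returns) vs "tree" points.
rng = np.random.default_rng(6)
building = rng.uniform(0, 10, (500, 3)); building_nr = np.ones(500, dtype=int)
trees = rng.uniform(20, 30, (500, 3)); tree_nr = rng.integers(1, 4, 500)
xyz = np.vstack([building, trees])
nr = np.concatenate([building_nr, tree_nr])

R = multi_return_ratio(xyz, nr)
is_tree = R > 0.5                               # assumed threshold; the paper calibrates it on samples
print("mean R (building, tree):", R[:500].mean().round(2), R[500:].mean().round(2))
```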

  4. Insignificant solar-terrestrial triggering of earthquakes

    USGS Publications Warehouse

    Love, Jeffrey J.; Thomas, Jeremy N.

    2013-01-01

    We examine the claim that solar-terrestrial interaction, as measured by sunspots, solar wind velocity, and geomagnetic activity, might play a role in triggering earthquakes. We count the number of earthquakes having magnitudes that exceed chosen thresholds in calendar years, months, and days, and we order these counts by the corresponding rank of annual, monthly, and daily averages of the solar-terrestrial variables. We measure the statistical significance of the difference between the earthquake-number distributions below and above the median of the solar-terrestrial averages by χ2 and Student's t tests. Across a range of earthquake magnitude thresholds, we find no consistent and statistically significant distributional differences. We also introduce time lags between the solar-terrestrial variables and the number of earthquakes, but again no statistically significant distributional difference is found. We cannot reject the null hypothesis of no solar-terrestrial triggering of earthquakes.
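    The statistical comparison described above can be sketched as follows (illustrative only, not the authors' code): split daily earthquake counts by whether the daily solar-terrestrial average is below or above its median, then test the two count distributions with Student's t and chi-squared tests.

```python
# Split earthquake counts by the median of a solar-terrestrial driver and test.
import numpy as np
from scipy import stats

def split_and_test(counts, driver):
    """counts: earthquakes per day above a threshold; driver: daily solar-terrestrial average."""
    counts, driver = np.asarray(counts), np.asarray(driver)
    lo = counts[driver <= np.median(driver)]
    hi = counts[driver > np.median(driver)]

    # Student's t test on the means of the two groups.
    t_stat, t_p = stats.ttest_ind(lo, hi, equal_var=False)

    # Chi-squared test on binned count distributions (shared bin edges).
    edges = np.histogram_bin_edges(counts, bins=10)
    obs = np.vstack([np.histogram(lo, edges)[0], np.histogram(hi, edges)[0]])
    obs = obs[:, obs.sum(axis=0) > 0]           # drop empty bins to keep the test valid
    chi2, chi_p, _, _ = stats.chi2_contingency(obs)
    return {"t_p": t_p, "chi2_p": chi_p}

# Toy usage: counts unrelated to the driver, so large p-values are expected.
rng = np.random.default_rng(7)
print(split_and_test(rng.poisson(3, 2000), rng.normal(size=2000)))
```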

  5. Earthquakes, November-December 1992

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.

  6. Earthquakes, May-June 1981

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was one major earthquake (7.0-7.9) off the west coast of South Island, New Zealand. The most destructive earthquake during this reporting period was in southern Iran on June 11 which caused fatalities and extensive damage. Peru also experienced a destructive earthquake on June 22 which caused fatalities and damage. In the United States, a number of earthquakes were experienced, but none caused significant damage. 

  7. Spatio-temporal Variations of Characteristic Repeating Earthquake Sequences along the Middle America Trench in Mexico

    NASA Astrophysics Data System (ADS)

    Dominguez, L. A.; Taira, T.; Hjorleifsdottir, V.; Santoyo, M. A.

    2015-12-01

    Repeating earthquake sequences are sets of events that are thought to rupture the same area on the plate interface and thus produce nearly identical waveforms. We systematically analyzed seismic records from 2001 through 2014 to identify repeating earthquakes with highly correlated waveforms occurring along the subduction zone of the Cocos plate. Using the correlation coefficient (cc) and spectral coherency (coh) of the vertical components as selection criteria, we found a set of 214 sequences whose waveforms satisfy cc ≥ 95% and coh ≥ 95%. Spatial clustering along the trench shows large variations in repeating earthquake activity. In particular, the rupture zone of the M8.1 1985 earthquake shows an almost complete absence of characteristic repeating earthquakes, whereas the Guerrero Gap zone and the segment of the trench close to the Guerrero-Oaxaca border show a significantly larger number of repeating earthquake sequences. Furthermore, temporal variations associated with stress changes due to major events show episodes of unlocking and healing of the interface. Understanding the different components that control the location and recurrence time of characteristic repeating sequences is a key factor in pinpointing areas where large megathrust earthquakes may nucleate and, consequently, in improving seismic hazard assessment.
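    The two selection criteria named above can be sketched as follows (a hedged example, not the authors' processing chain): a zero-lag correlation coefficient and a mean magnitude-squared coherence over an assumed 1-10 Hz analysis band, for a pair of vertical-component records.

```python
# Waveform similarity criteria: correlation coefficient (cc) and spectral coherency (coh).
import numpy as np
from scipy.signal import coherence

def cc_and_coh(trace1, trace2, fs=100.0):
    # Normalized zero-lag correlation coefficient of the two waveforms.
    cc = np.corrcoef(trace1, trace2)[0, 1]
    # Mean magnitude-squared coherence over an assumed 1-10 Hz analysis band.
    f, cxy = coherence(trace1, trace2, fs=fs, nperseg=256)
    band = (f >= 1.0) & (f <= 10.0)
    return cc, cxy[band].mean()

# Toy usage: two records dominated by a shared broadband signal should pass
# both 0.95 thresholds, while independent noise would not.
rng = np.random.default_rng(8)
common = rng.normal(size=4096)
a = common + 0.05 * rng.normal(size=4096)
b = common + 0.05 * rng.normal(size=4096)
cc, coh_val = cc_and_coh(a, b)
print(f"cc={cc:.3f}, coh={coh_val:.3f}, repeating pair: {cc >= 0.95 and coh_val >= 0.95}")
```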

  8. Prediction of earthquake-triggered landslide event sizes

    NASA Astrophysics Data System (ADS)

    Braun, Anika; Havenith, Hans-Balder; Schlögel, Romy

    2016-04-01

    Seismically induced landslides are a major environmental effect of earthquakes, which may significantly contribute to related losses. Moreover, in paleoseismology landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes, thus allowing us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake induced landslides. We present here a review of factors contributing to earthquake triggered slope failures based on an "event-by-event" classification approach. The objective of this analysis is to enable the short-term prediction of earthquake triggered landslide event sizes in terms of numbers and size of the affected area right after an earthquake occurs. Five main factors, 'Intensity', 'Fault', 'Topographic energy', 'Climatic conditions' and 'Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. The relative weight of these factors was extracted from published data for numerous past earthquakes; topographic inputs were checked in Google Earth and through geographic information systems. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970), the combination and relative weight of the factors was calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be cross-checked. One of our main findings is that the 'Fault' factor, which is based on characteristics of the fault, the surface rupture and its location with respect to mountain areas, has the most important

  9. Earthquake Safety Tips in the Classroom

    NASA Astrophysics Data System (ADS)

    Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M.

    2014-12-01

    The catastrophes induced by earthquakes are among the most devastating ones, causing a large number of human losses and great economic damage. But we have to keep in mind that earthquakes don't kill people, buildings do. Earthquakes can't be predicted, and the only way of dealing with their effects is to teach society how to be prepared for them, and how to deal with their consequences. In spite of being exposed to moderate and large earthquakes, most of the Portuguese are little aware of seismic risk, mainly due to the long recurrence intervals between strong events. The acquisition of safe and correct attitudes before, during and after an earthquake is relevant for human security. Children play a determinant role in the establishment of a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children assume correct behaviors, their relatives often change their incorrect behaviors to mimic the correct behaviors of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5 - 6 years old and 9 - 10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start with a short introduction to the Earth and to earthquakes through storytelling and simple science activities that trigger children's curiosity. For safety purposes, we focus on how crucial it is to know basic information about themselves and to define, with their families, an emergency communications plan, in case family members are separated. Using a shaking table we teach them how to protect themselves during an earthquake. We then finish with the preparation of an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in

  10. Earthquakes: Recurrence and Interoccurrence Times

    NASA Astrophysics Data System (ADS)

    Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.; Yakovlev, G.; Goltz, C.; Newman, W. I.

    2008-04-01

    The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution. The Weibull distribution is the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) The Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) an example of a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to make definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one million year “Virtual California” simulation of San Andreas earthquakes. Very good agreement with a Weibull distribution is found. We also obtain recurrence statistics for two other model studies. The first is a modified forest-fire model and the second is a slider-block model. In both cases good agreements with Weibull distributions are obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
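    A minimal sketch of the Weibull analysis described above (illustrative, not the authors' code): fit a two-parameter Weibull to a handful of recurrence times and inspect its hazard function, which for the Weibull is the power law h(t) = (β/η)(t/η)^(β-1). The recurrence times below are rough, Parkfield-like stand-ins, not data from the paper.

```python
# Fit a Weibull distribution to recurrence times and evaluate its hazard rate.
import numpy as np
from scipy import stats

# Stand-in recurrence times in years (roughly Parkfield-like intervals).
recurrence = np.array([24.0, 20.0, 21.0, 12.0, 32.0, 38.0])

# Two-parameter Weibull fit with the location fixed at zero.
beta, loc, eta = stats.weibull_min.fit(recurrence, floc=0.0)

def hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

print(f"shape beta={beta:.2f}, scale eta={eta:.1f} yr")
print("hazard at 10, 20, 30 yr:", [round(hazard(t, beta, eta), 4) for t in (10, 20, 30)])
```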

  11. Earthquakes, July-August, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

    In the United States, on August 6, central California experienced a moderately strong earthquake, which injured several people and caused some damage. A number of earthquakes occurred in other parts of the United States but caused very little damage. 

  12. The physics of an earthquake

    NASA Astrophysics Data System (ADS)

    McCloskey, John

    2008-03-01

    The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.

  13. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, the government-sponsored Turkish Catastrophic Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small, 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in south-east Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about 90 USD for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake-engineering-related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed where the expected earthquake performance of a housing unit (and consequently its insurance premium) can be assessed on the basis of the location of the unit (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.

  14. Attention bias in earthquake-exposed survivors: an event-related potential study.

    PubMed

    Zhang, Yan; Kong, Fanchang; Han, Li; Najam Ul Hasan, Abbasi; Chen, Hong

    2014-12-01

    The Chinese Wenchuan earthquake, which struck on 12 May 2008, may leave deep invisible scars in individuals. China has a large number of children and adolescents, who tend to be most vulnerable because they are at an early stage of human development, and possible post-traumatic psychological distress may have life-long consequences. Trauma survivors without post-traumatic stress disorder (PTSD) have received little attention in previous studies, especially in event-related potential (ERP) studies. We compared the attention bias to threat stimuli between an earthquake-exposed group and a control group in a masked version of the dot probe task. Trials in which the target probe appeared at the same spatial location as the earthquake-related word were congruent trials, while trials in which it appeared at the location of the neutral word were incongruent trials. Thirteen earthquake-exposed middle school students without PTSD and 13 matched controls were included in this investigation. The earthquake-exposed group showed significantly faster RTs to congruent trials than to incongruent trials. The earthquake-exposed group produced significantly shorter C1 and P1 latencies and larger C1, P1 and P2 amplitudes than the control group. In particular, enhanced P1 amplitude to threat stimuli was observed in the earthquake-exposed group. These findings are in agreement with the prediction that earthquake-exposed survivors have an attention bias to threat stimuli. The traumatic event had a much greater effect on earthquake-exposed survivors, even those who showed no PTSD symptoms, than on individuals in the control group. These results provide neurobiological evidence for effective intervention and prevention of post-traumatic mental problems. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Comparing population exposure to multiple Washington earthquake scenarios for prioritizing loss estimation studies

    USGS Publications Warehouse

    Wood, Nathan J.; Ratliff, Jamie L.; Schelling, John; Weaver, Craig S.

    2014-01-01

    Scenario-based, loss-estimation studies are useful for gauging potential societal impacts from earthquakes but can be challenging to undertake in areas with multiple scenarios and jurisdictions. We present a geospatial approach using various population data for comparing earthquake scenarios and jurisdictions to help emergency managers prioritize where to focus limited resources on data development and loss-estimation studies. Using 20 earthquake scenarios developed for the State of Washington (USA), we demonstrate how a population-exposure analysis across multiple jurisdictions based on Modified Mercalli Intensity (MMI) classes helps emergency managers understand and communicate where potential loss of life may be concentrated and where impacts may be more related to quality of life. Results indicate that certain well-known scenarios may directly impact the greatest number of people, whereas other, potentially lesser-known, scenarios impact fewer people but consequences could be more severe. The use of economic data to profile each jurisdiction’s workforce in earthquake hazard zones also provides additional insight on at-risk populations. This approach can serve as a first step in understanding societal impacts of earthquakes and helping practitioners to efficiently use their limited risk-reduction resources.
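    The core tabulation of such a population-exposure comparison can be sketched as follows (a hedged example with an assumed table layout and made-up scenario names, not the study's workflow): aggregate population by Modified Mercalli Intensity class for each scenario and jurisdiction.

```python
# Tabulate population exposure by MMI class per scenario and jurisdiction.
import pandas as pd

# Each row: one census block intersected with one scenario's shaking footprint
# (scenario names, jurisdictions, and populations below are illustrative).
blocks = pd.DataFrame({
    "scenario":     ["Seattle M7.2", "Seattle M7.2", "Tacoma M7.1", "Tacoma M7.1"],
    "jurisdiction": ["Seattle", "Bellevue", "Tacoma", "Seattle"],
    "mmi_class":    ["VIII", "VII", "IX", "VI"],
    "population":   [120_000, 60_000, 80_000, 150_000],
})

exposure = (blocks
            .groupby(["scenario", "jurisdiction", "mmi_class"])["population"]
            .sum()
            .unstack("mmi_class", fill_value=0))
print(exposure)
```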

  16. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 in each catalog and
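    The natural-time conversion and the Weibull shape estimate can be sketched as follows (a hedged example, not the authors' code): count small events between successive large events, then fit a Weibull distribution to the interevent counts; β near 1 indicates a temporally random sequence. The synthetic Gutenberg-Richter catalog and the handling of zero counts are assumptions of this sketch.

```python
# Natural-time interevent counts and the Weibull shape exponent beta.
import numpy as np
from scipy import stats

def interevent_counts(magnitudes, m_small=5.1, m_large=7.0):
    counts, n = [], 0
    for m in magnitudes:                 # catalog in chronological order
        if m >= m_large:
            counts.append(n)
            n = 0
        elif m >= m_small:
            n += 1
    return np.array(counts, dtype=float)

def weibull_beta(counts):
    # Two-parameter Weibull fit (location fixed at zero); zero counts are
    # dropped to keep the fit well behaved -- an assumption of this sketch.
    c = counts[counts > 0]
    beta, _, _ = stats.weibull_min.fit(c, floc=0.0)
    return beta

# Toy usage: a random Gutenberg-Richter catalog (b = 1) should give beta near 1.
rng = np.random.default_rng(9)
mags = 5.1 + rng.exponential(1.0 / np.log(10), size=200_000)
print(f"beta = {weibull_beta(interevent_counts(mags)):.2f}")
```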

  17. Volcanotectonic earthquakes induced by propagating dikes

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust

    2016-04-01

    Volcanotectonic earthquakes are of high frequency and mostly generated by slip on faults. During chamber expansion/contraction, earthquakes are distributed in the chamber roof. Following magma-chamber rupture and dike injection, however, earthquakes tend to concentrate around the dike and follow its propagation path, resulting in an earthquake swarm characterised by a number of earthquakes of similar magnitudes. I distinguish between two basic processes by which propagating dikes induce earthquakes. One is due to stress concentration in the process zone at the tip of the dike; the other relates to stresses induced in the walls and surrounding rocks on either side of the dike. As to the first process, some earthquakes generated at the dike tip are related to pure extension fracturing as the tip advances and the dike path forms. Formation of pure extension fractures normally induces non-double-couple earthquakes. There is also shear fracturing in the process zone, however, particularly normal faulting, which produces double-couple earthquakes. The second process relates primarily to slip on existing fractures in the host rock induced by the driving pressure of the propagating dike. Such pressures easily reach 5-20 MPa and induce compressive and shear stresses in the adjacent host rock, which already contains numerous fractures (mainly joints) of different attitudes. In piles of lava flows or sedimentary beds, the original joints are primarily vertical and horizontal. Similarly, the contacts between the layers/beds are originally horizontal. As the layers/beds become buried, the joints and contacts become gradually tilted, so that the joints and contacts become oblique to the horizontal compressive stress induced by the driving pressure of the (vertical) dike. Also, most of the hexagonal (or pentagonal) columnar joints in the lava flows are, from the beginning, oblique to an intrusive sheet of any attitude. Consequently, the joints and contacts function as potential shear

  18. Mitigating the consequences of future earthquakes in historical centres: what perspectives from the joined use of past information and geological-geophysical surveys?

    NASA Astrophysics Data System (ADS)

    Terenzio Gizzi, Fabrizio; Moscatelli, Massimiliano; Potenza, Maria Rosaria; Zotta, Cinzia; Simionato, Maurizio; Pileggi, Domenico; Castenetto, Sergio

    2015-04-01

    Mitigating the damaging effects of earthquakes in urban areas, and particularly in historical centres prone to high seismic hazard, is an important task to be pursued. As a matter of fact, seismic history throughout the world tells us that earthquakes have caused deep changes in ancient urban conglomerations due to their high building vulnerability. Furthermore, some quarters can be exposed to an increase in seismic actions compared with adjacent areas, due to the geological and/or topographical features of the site on which the historical centres lie. Usually, the strategies aimed at estimating the local seismic hazard make use only of geological-geophysical surveys. Through this approach we draw no lesson from what happened as a consequence of past earthquakes. With this in mind, we present the results of the joint use of historical data and the traditional geological-geophysical approach to analyse the effects of possible future earthquakes in historical centres. The research activity discussed here is organized as a joint collaboration between the Department of Civil Protection of the Presidency of the Council of Ministers, the Institute of Environmental Geology and Geoengineering, and the Institute of Archaeological and Monumental Heritage of the National (Italian) Research Council. In order to show the results, we discuss the preliminary achievements of the integrated study carried out on two historical towns located in the Southern Apennines, a portion of the Italian peninsula exposed to high seismic hazard. Taking advantage of these two test sites, we also discuss some methodological implications that could be taken as a reference in seismic microzonation studies.

  19. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise conducted by Bay Area County Offices of Emergency Services and based on the scenario of an earthquake on the Hayward Fault. Sixty other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U. S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  20. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
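
    The detection step described above is essentially a classical short-term-average/long-term-average (STA/LTA) trigger applied to a tweet-count time series instead of a seismogram. The Python sketch below is a minimal illustration of that idea only; the window lengths, the threshold, and the synthetic tweet counts are assumptions chosen for demonstration, not the USGS operational settings.

        import numpy as np

        def sta_lta_detect(counts, sta_win=60, lta_win=900, threshold=5.0):
            """Flag samples where the STA/LTA ratio of a per-second tweet-count
            series exceeds a threshold. Window lengths are in samples (seconds)."""
            counts = np.asarray(counts, dtype=float)
            csum = np.cumsum(np.insert(counts, 0, 0.0))   # prefix sums for running means
            detections = []
            for i in range(lta_win, len(counts) - sta_win):
                lta = (csum[i] - csum[i - lta_win]) / lta_win
                sta = (csum[i + sta_win] - csum[i]) / sta_win
                if lta > 0 and sta / lta >= threshold:
                    detections.append(i)
            return detections

        # Synthetic example: background chatter with a one-minute burst of "earthquake" tweets.
        rng = np.random.default_rng(0)
        series = rng.poisson(0.2, 3600)          # one hour of background, ~0.2 tweets/s
        series[1800:1860] += rng.poisson(8, 60)  # felt event: a surge of tweets
        print(sta_lta_detect(series)[:5])        # indices of the first few triggers

    On this synthetic series the surge pushes the STA/LTA ratio far above the threshold, mirroring the large tweet-frequency peaks the abstract describes for widely felt events.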

  1. 100 years after the Marsica earthquake: contribute of outreach activities

    NASA Astrophysics Data System (ADS)

    D'Addezio, Giuliana; Giordani, Azzurra; Valle, Veronica; Riposati, Daniela

    2015-04-01

    Many outreach events have been proposed by the scientific community to celebrate the centenary of the January 13, 1915 earthquake that devastated the Marsica territory, located in the Central Apennines. The Laboratorio Divulgazione Scientifica e Attività Museali of the Istituto Nazionale di Geofisica e Vulcanologia (INGV's Laboratory for Outreach and Museum Activities) in Rome has realised an interactive exhibition in the Castello Piccolomini, Celano (AQ), to retrace the many aspects of the earthquake disaster, in a region such as Abruzzo that has been affected by several destructive earthquakes during its history. The initiatives represent an ideal opportunity for the development of new programs of communication and training on seismic risk and for spreading the culture of prevention. The INGV is accredited with the Servizio Civile Nazionale (National Civic Service), and volunteers have been involved in the project "Science and Outreach: a comprehensive approach to the divulgation of knowledge of Earth Sciences" since 2014. In this context, volunteers had the opportunity to contribute fully to the exhibition, in particular by promoting and realising two panels concerning the social and environmental consequences of the Marsica earthquake. By describing the serious consequences of the earthquake, we may raise awareness about natural hazards and about the only effective action for earthquake defense: building with anti-seismic criteria. After studies and research conducted in libraries and via the web, two themes have been developed: the serious problem of orphans and the difficult reconstruction. Heavy snowfalls and the presence of wolves coming from the high and wild surrounding mountains complicated the scenario and slowed the rescue of the affected populations. It is important to underline that the earthquake was not the only devastating event in the country in 1915; another dramatic event was, in fact, the First World War. Whole families died, and the surviving infants and

  2. Do Earthquakes Shake Stock Markets?

    PubMed

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  3. Extreme Subduction Earthquake Scenarios and their Economical Consequences for Mexico City and Guadalajara, Jalisco, Mexico

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Perea, N.

    2007-05-01

    The destructive effects of large-magnitude, thrust subduction superficial (TSS) earthquakes on Mexico City (MC) and Guadalajara (G) have been shown in recent centuries. For example, on 7/04/1845 a TSS earthquake with Ms 7+ and an epicentral distance of about 250 km from MC occurred on the coast of the state of Guerrero; a maximum Modified Mercalli Intensity (MMI) of IX-X was reported in MC. Furthermore, on 19/09/1985 a Ms 8.1, Mw 8.01, TSS earthquake with an epicentral distance of about 340 km from MC occurred on the coast of the state of Michoacan; a maximum MMI of IX-X was reported in MC. Also, the largest instrumentally observed TSS earthquake in Mexico, Ms 8.2, occurred in the Colima-Jalisco region on 3/06/1932, with an epicentral distance of the order of 200 km from G in northwestern Mexico. On 9/10/1995 another similar event, Ms 7.4, Mw 8, with an epicentral distance of about 240 km from G, occurred in the same region and produced MMI IX in the epicentral zone and MMI up to VI in G. The frequency of occurrence of large TSS earthquakes in Mexico is poorly known, but it might vary from decades to centuries [1]. On the other hand, the first recordings of strong ground motion in MC date from the early 1960s, and most of them were recorded after the 19/09/1985 earthquake. In G there is only one recording of the latter event, and 13 for the one that occurred on 9/10/1995 [2]. In order to address the lack of strong ground motion records for large damaging TSS earthquakes, which could have an important economic impact on MC [3] and G, in this work we have modeled the broadband synthetics (obtained with a hybrid model that has already been satisfactorily compared with observations of the 9/10/1995 Colima-Jalisco Mw 8 earthquake [4]) expected in MC and G, associated with extreme-magnitude Mw 8.5 TSS scenario earthquakes with epicenters in the so-called Guerrero gap and in the Colima-Jalisco zone, respectively. The proposed scenarios are based on the seismic history and up

  4. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies from, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the view of the authors, the lack of consistency and the errors in other frequently cited earthquake loss databases used in analyses were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars), compared to the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  5. Do Earthquakes Shake Stock Markets?

    PubMed Central

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

  6. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity that has occurred can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (<6.5R) can be predicted within a very good accuracy window (±1 day). In this contribution we present a modification that improves on the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a trigger planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We have developed the FDL numbers for each of those seeds, and examined the earthquake hit rates (for a window of 3, i.e. ±1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with
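
    As described, the bookkeeping behind the FDL approach amounts to generating candidate dates by adding sequence values (in days) to a seed date and counting hits that fall within a ±1 day window of later events. The sketch below illustrates only that counting machinery under simplifying assumptions of my own: it uses Fibonacci and Lucas numbers alone as day offsets (the "Dual" series and the MFDL planetary-trigger seed selection are not reproduced here), and the seed and target dates are invented.

        from datetime import date, timedelta

        def fib_lucas_offsets(max_days):
            """Fibonacci and Lucas numbers up to max_days, used as day offsets."""
            def seq(a, b):
                vals = []
                while a <= max_days:
                    vals.append(a)
                    a, b = b, a + b
                return vals
            # Fibonacci starts (1, 1, ...), Lucas starts (2, 1, ...).
            return sorted(set(seq(1, 1) + seq(2, 1)))

        def hit_rate(seed_dates, target_dates, window_days=1, max_days=3650):
            """Fraction of target dates lying within +/-window_days of any
            seed date plus a Fibonacci/Lucas offset."""
            offsets = fib_lucas_offsets(max_days)
            candidates = {s + timedelta(days=o) for s in seed_dates for o in offsets}
            hits = sum(any(abs((t - c).days) <= window_days for c in candidates)
                       for t in target_dates)
            return hits / len(target_dates)

        # Invented dates, purely to exercise the bookkeeping.
        seeds = [date(2000, 1, 1), date(2003, 5, 21)]
        targets = [date(2000, 1, 14), date(2004, 9, 1)]
        print(hit_rate(seeds, targets))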

  7. Investigating Earthquake-induced Landslides­a Historical Review

    NASA Astrophysics Data System (ADS)

    Keefer, D. K.; Geological Survey, Us; Park, Menlo; Usa, Ca

    , extensive to relatively complete inventories of landslides have been prepared for a relatively small number of earthquakes. Through the 1960s and 1970s the best landslide inventories typically were complete only for a central affected area, although the first virtually complete inventory for a large earthquake was prepared for the M 7.6 Guatemala earthquake in 1976. Beginning in 1980, virtually complete landslide inventories have been prepared for several additional earthquakes in California, El Salvador, Japan, Italy, and Taiwan. Most of these used aerial photography in combination with ground field studies, although the studies of the most recent of these events, in Taiwan, have also used satellite imagery, and three of the others (including the two smallest) were compiled largely from ground-based field studies without aerial photography. Since 1989, digital mapping and GIS techniques have come into common use for mapping earthquake-induced landslides, and the use of these techniques has greatly enhanced the level of analysis that can be applied to earthquake-induced landslide occurrence. The first synthesis of data on earthquake-induced landslides, completed in 1984, defined the general characteristics of these landslides, derived relations between landslide occurrence on the one hand and geologic and seismic parameters on the other, and identified the types of hazards associated with them. Since then, additional syntheses of worldwide data (1999) and national data from New Zealand (1997), Greece (2000), and Italy (2000) have provided additional data on landslide characteristics and hazards and have extended, revised, and refined these relations. Recently completed studies have also identified areas with anomalous landslide distributions, have provided data for correlating the occurrence of landslides with a measure of local ground motion, have verified the occasional delayed triggering of landslides as a consequence of seismic shaking, and have identified

  8. Earthquakes; July-August 1982

    USGS Publications Warehouse

    Person, W.J.

    1983-01-01

    During this reporting period, there were three major (7.0-7.9) earthquakes, all in unpopulated areas. The quakes occurred north of Macquarie Island on July 7, in the Santa Cruz Islands on August 5, and south of Panama on August 19. In the United States, a number of earthquakes occurred, but no damage was reported.

  9. The Application of Speaker Recognition Techniques in the Detection of Tsunamigenic Earthquakes

    NASA Astrophysics Data System (ADS)

    Gorbatov, A.; O'Connell, J.; Paliwal, K.

    2015-12-01

    Tsunami warning procedures adopted by national tsunami warning centres largely rely on the classical approach of earthquake location, magnitude determination, and the consequent modelling of tsunami waves. Although this approach is based on known physical theories of earthquake and tsunami generation processes, it has a main shortcoming: the need to satisfy minimum seismic data requirements to estimate those physical parameters. At least four seismic stations are necessary to locate the earthquake, and a minimum of approximately 10 minutes of seismic waveform observation is needed to reliably estimate the magnitude of a large earthquake similar to the 2004 Indian Ocean Tsunami Earthquake of M9.2. Consequently, the total time to tsunami warning could be more than half an hour. In an attempt to reduce the time to tsunami alert, a new approach is proposed based on the classification of tsunamigenic and non-tsunamigenic earthquakes using speaker recognition techniques. A Tsunamigenic Dataset (TGDS) was compiled to promote the development of machine learning techniques for application to seismic trace analysis and, in particular, tsunamigenic event detection, and to compare them to existing seismological methods. The TGDS contains 227 offshore events (87 tsunamigenic and 140 non-tsunamigenic earthquakes with M≥6) from Jan 2000 to Dec 2011, inclusive. A Support Vector Machine classifier using a radial-basis function kernel was applied to spectral features derived from 400 s frames of three-component, 1-Hz broadband seismometer data. Ten-fold cross-validation was used during training to choose classifier parameters. Voting was applied to the classifier predictions provided by each station to form an overall prediction for an event. The F1 score (harmonic mean of precision and recall) was chosen to rate each classifier as it provides a compromise between type-I and type-II errors, and due to the imbalance between the representative number of events in the tsunamigenic and non
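
    The classification pipeline outlined above (spectral features per station, an RBF-kernel support vector machine with cross-validated parameters, and a vote over station-level predictions) can be sketched roughly as follows. This is a minimal stand-in rather than the authors' code: the feature extraction is reduced to band-averaged log power of a waveform frame, and the training arrays are random placeholders.

        import numpy as np
        from sklearn.model_selection import GridSearchCV
        from sklearn.svm import SVC

        def spectral_features(frame, n_bands=16):
            """Band-averaged log power of a 3-component, 400-sample (1 Hz) frame.
            A simplified stand-in for the spectral features used in the study."""
            power = np.abs(np.fft.rfft(frame, axis=1)) ** 2            # (3, n_freq)
            bands = np.array_split(power, n_bands, axis=1)
            return np.log10([b.mean() + 1e-12 for b in bands])          # (n_bands,)

        # Placeholder training set: one feature vector per (event, station) pair.
        rng = np.random.default_rng(1)
        X = np.vstack([spectral_features(rng.standard_normal((3, 400))) for _ in range(200)])
        y = rng.integers(0, 2, 200)    # 1 = tsunamigenic, 0 = not (labels are fake here)

        # RBF-kernel SVM with cross-validated hyper-parameters (10-fold, as in the paper).
        clf = GridSearchCV(SVC(kernel="rbf"),
                           {"C": [1, 10, 100], "gamma": ["scale", 0.01, 0.1]},
                           cv=10)
        clf.fit(X, y)

        def event_prediction(station_frames):
            """Majority vote over per-station predictions for one event."""
            votes = clf.predict(np.vstack([spectral_features(f) for f in station_frames]))
            return int(votes.mean() >= 0.5)

        print(event_prediction([rng.standard_normal((3, 400)) for _ in range(5)]))

    With real labelled waveforms, the same structure would be scored per event with the F1 measure mentioned above rather than with raw accuracy, given the class imbalance in the TGDS.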

  10. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on the rapid estimate of P-wave magnitude, which generally carries large uncertainties and suffers from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following updates of the magnitude even decreased to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes. This consequently led to underestimated tsunami heights. By using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that we would theoretically have been able to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture process. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s by using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique is helpful to reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  11. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    NASA Astrophysics Data System (ADS)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record by a wide range of processes that depends on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify the sediments they remobilized. In the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626 m on the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and in the trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics. Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  12. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    USGS Publications Warehouse

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  13. Hiding earthquakes from scrupulous monitoring eyes of dense local seismic networks

    NASA Astrophysics Data System (ADS)

    Bogiatzis, P.; Ishii, M.; Kiser, E.

    2012-12-01

    Accurate and complete cataloguing of aftershocks is essential for a variety of purposes, including the estimation of the mainshock rupture area, the identification of seismic gaps, and seismic hazard assessment. However, immediately following large earthquakes, the seismograms recorded by local networks are noisy, with energy arriving from hundreds of aftershocks, in addition to different seismic phases interfering with one another. This causes deterioration in the performance of detection and location of earthquakes using conventional methods such as the S-P approach. This is demonstrated by results of back-projection analysis of teleseismic data showing that a significant number of events are undetected by the Japan Meteorological Agency, within the first twenty-four hours after the Mw9.0 Tohoku-oki, Japan earthquake. The spatial distribution of the hidden events is not arbitrary. Most of these earthquakes are located close to the trench, while some are located at the outer rise. Furthermore, there is a relatively sharp trench-parallel boundary separating the detected and undetected events. We investigate the cause of these hidden earthquakes using forward modeling. The calculation of raypaths for various source locations and takeoff angles with the "shooting" method suggests that this phenomenon is a consequence of the complexities associated with subducting slab. Laterally varying velocity structure defocuses the seismic energy from shallow earthquakes located near the trench and makes the observation of P and S arrivals difficult at stations situated on mainland Japan. Full waveform simulations confirm these results. Our forward calculations also show that the probability of detection is sensitive to the depth of the event. Shallower events near the trench are more difficult to detect than deeper earthquakes that are located inside the subducting plate for which the shadow-zone effect diminishes. The modeling effort is expanded to include three

  14. Preliminary results on earthquake triggered landslides for the Haiti earthquake (January 2010)

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Gorum, Tolga

    2010-05-01

    This study presents the first results of an analysis of the landslides triggered by the Ms 7.0 Haiti earthquake that occurred on January 12, 2010 in the boundary region of the Caribbean Plate and the North American Plate. The fault is a left-lateral strike-slip fault with a clear surface expression. According to the USGS earthquake information, the Enriquillo-Plantain Garden fault system has not produced any major earthquake in the last 100 years, and historical earthquakes are known from 1860, 1770, 1761, 1751, 1684, 1673, and 1618, though none of these has been confirmed in the field as associated with this fault. We used high-resolution satellite imagery available for the pre- and post-earthquake situations, which was made freely available for the response and rescue operations. We made an interpretation of all co-seismic landslides in the epicentral area. We conclude that the earthquake mainly triggered landslides on the northern slope of the fault-related valley and in a number of isolated areas. The earthquake apparently did not trigger many visible landslides within the slum areas on the slopes in the southern part of Port-au-Prince and Carrefour. We also used ASTER DEM information to relate the landslide occurrences to DEM derivatives.

  15. Napa earthquake: An earthquake in a highly connected world

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate to study what social networks, wearable objects and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question to evaluate is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare with the publication time of messages on Twitter. Finally, we check whether the number of tweets and the number of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  16. Triggering of repeating earthquakes in central California

    USGS Publications Warehouse

    Wu, Chunquan; Gomberg, Joan; Ben-Naim, Eli; Johnson, Paul

    2014-01-01

    Dynamic stresses carried by transient seismic waves have been found capable of triggering earthquakes instantly in various tectonic settings. Delayed triggering may be even more common, but the mechanisms are not well understood. Catalogs of repeating earthquakes, earthquakes that recur repeatedly at the same location, provide ideal data sets to test the effects of transient dynamic perturbations on the timing of earthquake occurrence. Here we employ a catalog of 165 families containing ~2500 total repeating earthquakes to test whether dynamic perturbations from local, regional, and teleseismic earthquakes change recurrence intervals. The distance to the earthquake generating the perturbing waves is a proxy for the relative potential contributions of static and dynamic deformations, because static deformations decay more rapidly with distance. Clear changes followed the nearby 2004 Mw6 Parkfield earthquake, so we study only repeaters prior to its origin time. We apply a Monte Carlo approach to compare the observed number of shortened recurrence intervals following dynamic perturbations with the distribution of this number estimated for randomized perturbation times. We examine the comparison for a series of dynamic stress peak amplitude and distance thresholds. The results suggest a weak correlation between dynamic perturbations in excess of ~20 kPa and shortened recurrence intervals, for both nearby and remote perturbations.
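
    The Monte Carlo comparison described, counting how many recurrence intervals shortened after real perturbation times and comparing that count with counts obtained after randomizing the perturbation times, can be sketched as below. The interval data and the criterion for "shortened" (shorter than the family's unperturbed median) are placeholder assumptions of mine; the study's catalog and exact definitions are not reproduced.

        import numpy as np

        rng = np.random.default_rng(2)

        def count_shortened(intervals, perturbed_flags):
            """Number of perturbed recurrence intervals shorter than the median
            of the unperturbed intervals (simplified criterion)."""
            intervals = np.asarray(intervals, float)
            baseline = np.median(intervals[~perturbed_flags])
            return int(np.sum(intervals[perturbed_flags] < baseline))

        # Placeholder data: one family of repeaters, with some intervals flagged as
        # following a dynamic perturbation exceeding the stress threshold.
        intervals = rng.gamma(shape=4.0, scale=100.0, size=60)        # days
        perturbed = np.zeros(60, bool)
        perturbed[rng.choice(60, 12, replace=False)] = True

        observed = count_shortened(intervals, perturbed)

        # Null distribution: reassign the perturbation flags at random many times.
        null = np.array([count_shortened(intervals,
                                         np.isin(np.arange(60),
                                                 rng.choice(60, 12, replace=False)))
                         for _ in range(5000)])
        p_value = np.mean(null >= observed)
        print(observed, p_value)

    In the study itself this comparison is repeated over a series of dynamic stress amplitude and distance thresholds, which is what reveals the weak correlation above ~20 kPa.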

  17. Earthquake triggering in southeast Africa following the 2012 Indian Ocean earthquake

    NASA Astrophysics Data System (ADS)

    Neves, Miguel; Custódio, Susana; Peng, Zhigang; Ayorinde, Adebayo

    2018-02-01

    In this paper we present evidence of earthquake dynamic triggering in southeast Africa. We analysed seismic waveforms recorded at 53 broad-band and short-period stations in order to identify possible increases in the rate of microearthquakes and tremor due to the passage of teleseismic waves generated by the Mw8.6 2012 Indian Ocean earthquake. We found evidence of triggered local earthquakes and no evidence of triggered tremor in the region. We assessed the statistical significance of the increase in the number of local earthquakes using β-statistics. Statistically significant dynamic triggering of local earthquakes was observed at 7 out of the 53 analysed stations. Two of these stations are located on the northeast coast of Madagascar and the other five stations are located in the Kaapvaal Craton, southern Africa. We found no evidence of dynamically triggered seismic activity at stations located near the structures of the East African Rift System. Hydrothermal activity exists close to the stations that recorded dynamic triggering; however, it also exists near the East African Rift System structures where no triggering was observed. Our results suggest that factors other than solely tectonic regime and geothermalism are needed to explain the mechanisms that underlie earthquake triggering.
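
    The β-statistic used here to judge whether the post-arrival count of local earthquakes is anomalously high compares the observed count in the triggering window with the count expected from the station's background rate. Below is a minimal sketch of one common form of this statistic; the counts and window lengths are purely illustrative, and the significance threshold (values around β ≈ 2 are often quoted) should be taken from the study itself rather than from this sketch.

        import math

        def beta_statistic(n_after, n_total, t_after, t_total):
            """beta = (Na - N*p) / sqrt(N*p*(1-p)), where p is the fraction of the
            observation period occupied by the post-arrival window."""
            p = t_after / t_total
            expected = n_total * p
            variance = n_total * p * (1.0 - p)
            return (n_after - expected) / math.sqrt(variance)

        # Illustrative numbers: 9 local events in the 1 day after the surface-wave
        # arrival, 40 events in a 30-day window centred on the mainshock.
        print(beta_statistic(n_after=9, n_total=40, t_after=1.0, t_total=30.0))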

  18. Filling a gap: Public talks about earthquake preparation and the 'Big One'

    NASA Astrophysics Data System (ADS)

    Reinen, L. A.

    2013-12-01

    Residents of southern California are aware they live in a seismically active area, and earthquake drills have trained us to Duck-Cover-Hold On. While many of my acquaintances are familiar with what to do during an earthquake, few have made preparations for living with the aftermath of a large earthquake. The ShakeOut Scenario (Jones et al., USGS Open File Report 2008-1150) describes the physical, social, and economic consequences of a plausible M7.8 earthquake on the southernmost San Andreas Fault. While not detailing an actual event, the ShakeOut Scenario illustrates how individual and community preparation may mitigate the potential after-effects of a major earthquake in the region. To address the gap between earthquake drills and preparation in my community, for the past several years I have been giving public talks to promote understanding of: the science behind the earthquake predictions; why individual, as well as community, preparation is important; and ways in which individuals can prepare their home and work environments. The public presentations occur in an array of venues, including elementary school and college classes, a community forum linked with the annual ShakeOut Drill, and local businesses including the local microbrewery. While based on the same fundamental information, each presentation is modified for audience and setting. Assessment of the impact of these talks is primarily anecdotal and includes an increase in the number of venues requesting these talks, repeat invitations, and comments from audience members (sometimes months or years after a talk). I will present elements of these talks, the background information used, and examples of how they have effected change in the earthquake preparedness of audience members. Discussion and suggestions (particularly about effective means of conducting rigorous long-term assessment) are strongly encouraged.

  19. An investigation into the socioeconomic aspects of two major earthquakes in Iran.

    PubMed

    Amini Hosseini, Kambod; Hosseinioon, Solmaz; Pooyan, Zhila

    2013-07-01

    An evaluation of the socioeconomic consequences of earthquakes is an essential part of the development of risk reduction and disaster management plans. However, these variables are not normally addressed sufficiently after strong earthquakes; researchers and relevant stakeholders focus primarily on the physical damage and casualties. The importance of the socioeconomic consequences of seismic events became clearer in Iran after the Bam earthquake on 26 December 2003, as demonstrated by the formulation and approval of various laws and ordinances. This paper reviews the country's regulatory framework in the light of the socioeconomic aspects of two major and destructive earthquakes: in Manjil-Rudbar in 1990, and in Bam in 2003. The results take the form of recommendations and practical strategies for incorporating the socioeconomic dimensions of earthquakes in disaster risk management planning. The results presented here can be applied in other countries with similar conditions to those of Iran in order to improve public preparedness and risk reduction. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  20. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  1. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions

    PubMed Central

    Burro, Roberto; Hall, Rob

    2017-01-01

    A major earthquake has a potentially highly traumatic impact on children's psychological functioning. However, while many studies on children describe negative consequences in terms of mental health and psychiatric disorders, little is known regarding how the developmental processes of emotions can be affected following exposure to disasters. Objectives: We explored whether and how exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. Method: The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. Results: We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Conclusions: Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and

  2. Intermediate-depth earthquakes facilitated by eclogitization-related stresses

    USGS Publications Warehouse

    Nakajima, Junichi; Uchida, Naoki; Shiina, Takahiro; Hasegawa, Akira; Hacker, Bradley R.; Kirby, Stephen H.

    2013-01-01

    Eclogitization of the basaltic and gabbroic layer in the oceanic crust involves a volume reduction of 10%–15%. One consequence of the negative volume change is the formation of a paired stress field as a result of strain compatibility across the reaction front. Here we use waveform analysis of a tiny seismic cluster in the lower crust of the downgoing Pacific plate and reveal new evidence in favor of this mechanism: tensional earthquakes lying 1 km above compressional earthquakes, and earthquakes with highly similar waveforms lying on well-defined planes with complementary rupture areas. The tensional stress is interpreted to be caused by the dimensional mismatch between crust transformed to eclogite and underlying untransformed crust, and the earthquakes are probably facilitated by reactivation of fossil faults extant in the subducting plate. These observations provide seismic evidence for the role of volume change–related stresses and, possibly, fluid-related embrittlement as viable processes for nucleating earthquakes in downgoing oceanic lithosphere.

  3. Comparison of hypocentre parameters of earthquakes in the Aegean region

    NASA Astrophysics Data System (ADS)

    Özel, Nurcan M.; Shapira, Avi; Harris, James

    2007-06-01

    The Aegean Sea is one of the more seismically active areas in the Euro-Mediterranean region. The seismic activity in the Aegean Sea is monitored by a number of local agencies that contribute their data to the International Seismological Centre (ISC). Consequently, the ISC Bulletin may serve as a reliable reference for assessing the capabilities of local agencies to monitor moderate and low magnitude earthquakes. We have compared bulletins of the Kandilli Observatory and Earthquake Research Institute (KOERI) and the ISC, for the period 1976-2003, which comprises the most complete data sets for both KOERI and ISC. The selected study area is the East Aegean Sea and West Turkey, bounded by latitude 35-41°N and by longitude 24-29°E. The total number of events known to have occurred in this area during 1976-2003 is about 41,638. Seventy-two percent of those earthquakes were located by ISC and 75% were located by KOERI. As expected, epicentre location discrepancies between ISC and KOERI solutions become larger as we move away from the KOERI seismic network. Out of the 22,066 earthquakes located by both ISC and KOERI, only 4% show a difference of 50 km or more. About 140 earthquakes show a discrepancy of more than 100 km. Focal depth determinations differ mainly in the subduction zone along the Hellenic arc. Less than 2% of the events differ in their focal depth by more than 25 km. Yet, the location solutions of about 30 events differ by more than 100 km. Almost a quarter of the events listed in the ISC Bulletin are missed by KOERI, most of them occurring off the coast of Turkey, in the East Aegean. Based on the frequency-magnitude distributions, the KOERI Bulletin is complete for earthquakes with duration magnitudes Md > 2.7 (both located and assigned magnitudes), whereas the threshold magnitude for events with location and magnitude determinations by ISC is mb > 4.0. KOERI magnitudes seem to be poorly correlated with ISC magnitudes, suggesting relatively high uncertainty in the

  4. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    NASA Astrophysics Data System (ADS)

    Yao, Y. B.; Chen, P.; Zhang, S.; Chen, J. J.; Yan, F.; Peng, W. F.

    2012-03-01

    The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0-2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions of deviation from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugate region. However, the probability of appearance and the extent of the anomalies in the magnetically conjugate region are smaller than for the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.

  5. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  6. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  7. Induced earthquake magnitudes are as large as (statistically) expected

    USGS Publications Warehouse

    Van Der Elst, Nicholas; Page, Morgan T.; Weiser, Deborah A.; Goebel, Thomas; Hosseini, S. Mehran

    2016-01-01

    A major question for the hazard posed by injection-induced seismicity is how large induced earthquakes can be. Are their maximum magnitudes determined by injection parameters or by tectonics? Deterministic limits on induced earthquake magnitudes have been proposed based on the size of the reservoir or the volume of fluid injected. However, if induced earthquakes occur on tectonic faults oriented favorably with respect to the tectonic stress field, then they may be limited only by the regional tectonics and connectivity of the fault network. In this study, we show that the largest magnitudes observed at fluid injection sites are consistent with the sampling statistics of the Gutenberg-Richter distribution for tectonic earthquakes, assuming no upper magnitude bound. The data pass three specific tests: (1) the largest observed earthquake at each site scales with the log of the total number of induced earthquakes, (2) the order of occurrence of the largest event is random within the induced sequence, and (3) the injected volume controls the total number of earthquakes rather than the total seismic moment. All three tests point to an injection control on earthquake nucleation but a tectonic control on earthquake magnitude. Given that the largest observed earthquakes are exactly as large as expected from the sampling statistics, we should not conclude that these are the largest earthquakes possible. Instead, the results imply that induced earthquake magnitudes should be treated with the same maximum magnitude bound that is currently used to treat seismic hazard from tectonic earthquakes.
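
    The first of the three tests, that the largest observed induced magnitude grows with the log of the total number of induced events, follows from the sampling statistics of the Gutenberg-Richter distribution: with b-value b and completeness magnitude Mc, the expected maximum of N samples is roughly Mc + log10(N)/b. The short simulation below is a generic illustration of that statistical point, not a reproduction of the study's analysis; the b-value, Mc and catalog sizes are assumed for demonstration.

        import numpy as np

        rng = np.random.default_rng(3)

        def sample_gr_magnitudes(n, b=1.0, m_c=1.0):
            """Draw n magnitudes from an unbounded Gutenberg-Richter distribution:
            M = Mc - log10(U)/b with U uniform on (0, 1]."""
            u = 1.0 - rng.random(n)              # values in (0, 1]
            return m_c - np.log10(u) / b

        for n in (10, 100, 1000, 10000):
            m_max = [sample_gr_magnitudes(n).max() for _ in range(500)]
            # The mean maximum grows roughly as Mc + log10(n)/b.
            print(n, round(float(np.mean(m_max)), 2), round(1.0 + np.log10(n), 2))

    The logarithmic growth of the simulated maxima with catalog size is the behavior the first test checks against the observed injection-site catalogs.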

  8. Strike-slip earthquakes can also be detected in the ionosphere

    NASA Astrophysics Data System (ADS)

    Astafyeva, Elvira; Rolland, Lucie M.; Sladen, Anthony

    2014-11-01

    It is generally assumed that co-seismic ionospheric disturbances are generated by large vertical static displacements of the ground during an earthquake. Consequently, it is expected that co-seismic ionospheric disturbances are only observable after earthquakes with a significant dip-slip component. Therefore, earthquakes dominated by strike-slip motion, i.e. with very little vertical co-seismic component, are not expected to generate ionospheric perturbations. In this work, we use total electron content (TEC) measurements from ground-based GNSS receivers to study the ionospheric response to the six largest recent strike-slip earthquakes: the Mw7.8 Kunlun earthquake of 14 November 2001, the Mw8.1 Macquarie earthquake of 23 December 2004, the Sumatra earthquake doublet, Mw8.6 and Mw8.2, of 11 April 2012, the Mw7.7 Balochistan earthquake of 24 September 2013 and the Mw7.7 Scotia Sea earthquake of 17 November 2013. We show that large strike-slip earthquakes generate large ionospheric perturbations of amplitude comparable with those induced by dip-slip earthquakes of equivalent magnitude. We consider that in the absence of significant vertical static co-seismic displacements of the ground, other seismological parameters (primarily the magnitude of co-seismic horizontal displacements, seismic fault dimensions, seismic slip) may contribute to the generation of large-amplitude ionospheric perturbations.

  9. Thermal Radiation Anomalies Associated with Major Earthquakes

    NASA Technical Reports Server (NTRS)

    Ouzounov, Dimitar; Pulinets, Sergey; Kafatos, Menas C.; Taylor, Patrick

    2017-01-01

    Recent developments of remote sensing methods for Earth satellite data analysis contribute to our understanding of earthquake-related thermal anomalies. It was realized that the thermal heat fluxes over areas of earthquake preparation are a result of air ionization by radon (and other gases) and consequent water vapor condensation on the newly formed ions. Latent heat (LH) is released as a result of this process and leads to the formation of local thermal radiation anomalies (TRA) known as OLR (Outgoing Longwave Radiation; Ouzounov et al., 2007). We compare the LH energy, obtained by integrating the surface latent heat flux (SLHF) over area and time, with the released energies associated with these events. Extended studies of the TRA using the data from the most recent major earthquakes allowed the main morphological features to be established. It was also established that the TRA are part of a more complex chain of short-term pre-earthquake processes, which is explained within the framework of lithosphere-atmosphere coupling processes.

  10. Has El Salvador Fault Zone produced M ≥ 7.0 earthquakes? The 1719 El Salvador earthquake

    NASA Astrophysics Data System (ADS)

    Canora, C.; Martínez-Díaz, J.; Álvarez-Gómez, J.; Villamor, P.; Ínsua-Arévalo, J.; Alonso-Henar, J.; Capote, R.

    2013-05-01

    Historically, large earthquakes, Mw ≥ 7.0, in the El Salvador area have been attributed to activity in the Cocos-Caribbean subduction zone. This is correct for most of the earthquakes of magnitude greater than 6.5. However, recent paleoseismic evidence points to the existence of large earthquakes associated with rupture of the El Salvador Fault Zone, an E-W oriented strike-slip fault system that extends for 150 km through central El Salvador. To calibrate our results from paleoseismic studies, we have analyzed the historical seismicity of the area. In particular, we suggest that the 1719 earthquake can be associated with paleoseismic activity evidenced in the El Salvador Fault Zone. A reinterpreted isoseismal map for this event suggests that the damage reported could have been a consequence of rupture of the El Salvador Fault Zone, rather than rupture of the subduction zone. The isoseismal pattern is not different from that of other upper-crustal earthquakes in similar tectonovolcanic environments. We thus challenge the traditional assumption that only the subduction zone is capable of generating earthquakes of magnitude greater than 7.0 in this region. This result has broad implications for future risk management in the region. The potential occurrence of strong ground motion, significantly higher and closer to the Salvadorian populations than assumed to date, must be considered in seismic hazard assessment studies in this area.

  11. Update earthquake risk assessment in Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) is still, even after 25 years, one of the most painful events etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead; 10,000 injured; and 3000 families who lost their homes). Nowadays, the most frequent and important question that should arise is "what if this earthquake were repeated today." In this study, we simulate the ground motion shaking of an earthquake of the same size (12 October 1992) and the consequent socioeconomic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Generally, the earthquake risk assessment clearly indicates that "the losses and damages may be increased twice or three times" in Cairo compared to the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimations reflect that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  12. Global earthquake fatalities and population

    USGS Publications Warehouse

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
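
    The prediction above rests on a Poisson process whose rate is proportional to world population. The sketch below is a generic illustration of that construction, not the authors' calculation: the population trajectory is a simple linear ramp and the rate constant k is an arbitrary placeholder.

        import numpy as np

        # Illustrative assumptions only (not the authors' calibrated values):
        # world population ramping linearly from 6.1 billion (2000) to 10.1 billion (2100).
        years = np.arange(2000, 2101)
        population_billions = np.linspace(6.1, 10.1, years.size)

        # Rate of catastrophic (>100,000-fatality) earthquakes assumed proportional to
        # population; k is a placeholder in events per year per billion people.
        k = 0.01
        rates = k * population_billions

        # Expected number of such events over the century (sum of yearly rates),
        # plus one Monte Carlo realization of the nonstationary Poisson process.
        expected_events = rates.sum()
        rng = np.random.default_rng(0)
        simulated_events = rng.poisson(rates).sum()
        print(expected_events, simulated_events)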

  13. The October 12, 1992, Dahshur, Egypt, Earthquake

    USGS Publications Warehouse

    Thenhaus, P.C.; Celebi, M.; Sharp, R.V.

    1993-01-01

    We were part of an international reconnaissance team that investigated the Dahshur earthquake. This article summarizes our findings and points out how even a relatively moderate-sized earthquake can cause widespread damage and a large number of casualties.

  14. Evidence for Ancient Mesoamerican Earthquakes

    NASA Astrophysics Data System (ADS)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34%g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage at the above sites. As a consequence this earthquake may have accelerated the

  15. Earthquake damage orientation to infer seismic parameters in archaeological sites and historical earthquakes

    NASA Astrophysics Data System (ADS)

    Martín-González, Fidel

    2018-01-01

    Studies that provide information concerning seismic parameters and seismic sources of historical and archaeological seismic events are used to better evaluate the seismic hazard of a region. This is of special interest when no surface rupture is recorded or the seismogenic fault cannot be identified. The orientation pattern of the earthquake damage (ED) (e.g., fallen columns, dropped keystones) that affected architectonic elements of cities after earthquakes has traditionally been used in historical and archaeoseismological studies to infer seismic parameters. However, the parameters that can be obtained are contradictory in the literature depending on the author (the epicenter location, the orientation of the P waves, the orientation of the compressional strain and the fault kinematics have all been proposed), and some authors even question any relation with the earthquake damage. The earthquakes of Lorca in 2011, Christchurch in 2011 and Emilia Romagna in 2012 present an opportunity to measure systematically a large number and wide variety of earthquake damage in historical buildings (the same structures that are used in historical and archaeological studies). The damage pattern orientation has been compared with modern instrumental data, which is not possible in historical and archaeoseismological studies. From measurements and quantification of the orientation patterns in the studied earthquakes, it is observed that there is a systematic pattern of the earthquake damage orientation (EDO) in the proximity of the seismic source (fault trace) (<10 km). The EDO in these earthquakes is normal to the fault trend (±15°). This orientation can be generated by a pulse of motion that, in the near-fault region, has a distinguishable acceleration normal to the fault due to the polarization of the S-waves. Therefore, the earthquake damage orientation could be used to estimate the seismogenic fault trend in studies of historical earthquakes where no instrumental data are available.
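
    As an illustration of the kind of quantitative comparison described above, the sketch below averages axial (180°-periodic) damage orientations and measures their offset from the normal to an assumed fault trend. The measurements and fault strike are hypothetical, not data from the cited earthquakes.

        import numpy as np

        def mean_axial_orientation(azimuths_deg):
            """Circular mean of axial data (orientations defined modulo 180 degrees)."""
            doubled = np.radians(2.0 * np.asarray(azimuths_deg, dtype=float))
            mean_angle = np.arctan2(np.sin(doubled).mean(), np.cos(doubled).mean())
            return (np.degrees(mean_angle) / 2.0) % 180.0

        # Hypothetical earthquake damage orientation (EDO) measurements, in degrees.
        edo_measurements = [84, 92, 88, 95, 80, 90, 87]
        fault_strike = 0.0  # assumed fault trend, degrees

        edo_mean = mean_axial_orientation(edo_measurements)
        normal_to_fault = (fault_strike + 90.0) % 180.0
        offset = abs(edo_mean - normal_to_fault)
        offset = min(offset, 180.0 - offset)  # fold to the range [0, 90]
        print(f"mean EDO = {edo_mean:.1f} deg, offset from fault-normal = {offset:.1f} deg")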

  16. Hypothesis testing and earthquake prediction.

    PubMed

    Jackson, D D

    1996-04-30

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
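
    A minimal sketch of test (i) above, the comparison of the observed earthquake count with the predicted number, assuming (for this sketch only) that the forecast count follows a Poisson distribution; the forecast rate and observed count are invented values.

        from scipy.stats import poisson

        predicted_rate = 12.0   # hypothetical expected number of events in the test window
        observed_count = 19     # hypothetical observed number in the same window

        # Two one-sided tail probabilities under the forecast rate:
        # too many events, and too few events.
        p_too_many = 1.0 - poisson.cdf(observed_count - 1, predicted_rate)
        p_too_few = poisson.cdf(observed_count, predicted_rate)

        # A forecast failing either tail at a small significance level is inconsistent
        # with the observed number of earthquakes.
        print(p_too_many, p_too_few)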

  17. Hypothesis testing and earthquake prediction.

    PubMed Central

    Jackson, D D

    1996-01-01

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663

  18. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many felt earthquakes as possible, regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking that they have just felt. Altogether, we estimate that the number of detected felt earthquakes is around 1,000 per year, compared with the 35,000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service @LastQuake. We will present the identification process of the earthquakes that matter, the smartphone application itself (to be released in May) and its future evolutions.

  19. Did the Zipingpu Reservoir trigger the 2008 Wenchuan earthquake?

    USGS Publications Warehouse

    Ge, S.; Liu, M.; Lu, N.; Godt, J.W.; Luo, G.

    2009-01-01

    The devastating May 2008 Wenchuan earthquake (Mw 7.9) resulted from thrust of the Tibet Plateau on the Longmen Shan fault zone, a consequence of the Indo-Asian continental collision. Many have speculated on the role played by the Zipingpu Reservoir, impounded in 2005 near the epicenter, in triggering the earthquake. This study evaluates the stress changes in response to the impoundment of the Zipingpu Reservoir and assesses their impact on the Wenchuan earthquake. We show that the impoundment could have changed the Coulomb stress by -0.01 to 0.05 MPa at locations and depth consistent with reported hypocenter positions. This level of stress change has been shown to be significant in triggering earthquakes on critically stressed faults. Because the loading rate on the Longmen Shan fault is <0.005 MPa/yr, we thus suggest that the Zipingpu Reservoir potentially hastened the occurrence of the Wenchuan earthquake by tens to hundreds of years. Copyright 2009 by the American Geophysical Union.
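
    A back-of-the-envelope check of the numbers above: a static Coulomb stress increase Δτ on a fault loaded tectonically at rate τ̇ advances the earthquake "clock" by roughly Δτ/τ̇. The lower loading rate used below (0.0005 MPa/yr) is an assumed illustrative value; the abstract states only that the rate is below 0.005 MPa/yr.

        \Delta t \approx \frac{\Delta\tau}{\dot{\tau}}, \qquad
        \frac{0.05\ \mathrm{MPa}}{0.005\ \mathrm{MPa\,yr^{-1}}} = 10\ \mathrm{yr}
        \quad\text{to}\quad
        \frac{0.05\ \mathrm{MPa}}{0.0005\ \mathrm{MPa\,yr^{-1}}} = 100\ \mathrm{yr},

    which is consistent with the stated hastening by tens to hundreds of years.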

  20. Comparison of aftershock sequences between 1975 Haicheng earthquake and 1976 Tangshan earthquake

    NASA Astrophysics Data System (ADS)

    Liu, B.

    2017-12-01

    The 1975 ML 7.3 Haicheng earthquake and the 1976 ML 7.8 Tangshan earthquake occurred in the same tectonic unit. There are significant differences in the spatial-temporal distribution, number of aftershocks and time duration of the aftershock sequences that followed these two main shocks. As is well known, aftershocks can be triggered by the change in regional seismicity caused by the main shock through the Coulomb stress perturbation. Based on the rate- and state-dependent friction law, we quantitatively estimated the possible aftershock time duration in combination with seismicity data, and compared the results from different approaches. The results indicate that the aftershock time duration for the Tangshan main shock is several times that for the Haicheng main shock. This can be explained by the strong dependence of aftershock time duration on earthquake nucleation history, normal stress and shear-stress loading rate on the fault. In fact, the obvious difference in the nucleation histories of these two main shocks lies in the foreshocks: the 1975 Haicheng earthquake had clear and prolonged foreshocks, while the 1976 Tangshan earthquake did not. In that case, abundant foreshocks may indicate a long and active nucleation process that may have changed (weakened) the rocks in the source region, so such events should have shorter aftershock sequences because stress in weak rocks decays faster.
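
    In the rate- and state-dependent friction framework invoked above, aftershock duration is commonly scaled as (Dieterich, 1994); this is quoted here as the standard relation, not necessarily the exact expression used by the author:

        t_a \approx \frac{A\,\sigma}{\dot{\tau}},

    where A is the rate-state constitutive parameter, σ the effective normal stress and τ̇ the shear-stress loading rate on the fault, so a slowly loaded fault sustains a longer aftershock sequence, consistent with the dependence on normal stress and loading rate noted above.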

  1. New ideas about the physics of earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, John B.; Klein, William

    1995-07-01

    It may be no exaggeration to claim that this most recent quadrennium has seen more controversy, and thus more progress, in understanding the physics of earthquakes than any in recent memory. The most interesting development has clearly been the emergence of a large community of condensed matter physicists around the world who have begun working on the problem of earthquake physics. These scientists bring to the study of earthquakes an entirely new viewpoint, grounded in the physics of nucleation and critical phenomena in thermal, magnetic, and other systems. Moreover, a surprising technology transfer from geophysics to other fields has been made possible by the realization that models originally proposed to explain self-organization in earthquakes can also be used to explain similar processes in problems as disparate as brain dynamics in neurobiology (Hopfield, 1994) and charge density waves in solids (Brown and Gruner, 1994). An entirely new sub-discipline is emerging that is focused on the development and analysis of large-scale numerical simulations of the dynamics of faults. At the same time, intriguing new laboratory and field data, together with insightful physical reasoning, have led to significant advances in our understanding of earthquake source physics. As a consequence, we can anticipate substantial improvement in our ability to understand the nature of earthquake occurrence. Moreover, while much research in the area of earthquake physics is fundamental in character, the results have many potential applications (Cornell et al., 1993) in the areas of earthquake risk and hazard analysis, and seismic zonation.

  2. Discussion of New Approaches to Medium-Short-Term Earthquake Forecast in Practice of The Earthquake Prediction in Yunnan

    NASA Astrophysics Data System (ADS)

    Hong, F.

    2017-12-01

    After retrospection on years of practice of earthquake prediction in the Yunnan area, it is widely considered that fixed-point earthquake precursory anomalies mainly reflect field information. An increase in the amplitude and number of precursory anomalies can help to determine the origin time of earthquakes; however, it is difficult to obtain the spatial relevance between earthquakes and precursory anomalies, thus we can hardly predict the spatial locations of earthquakes using precursory anomalies. Past practice has shown that seismic activity is superior to precursory anomalies in predicting earthquake locations, because increased seismicity was observed before 80% of M ≥ 6.0 earthquakes in the Yunnan area. Mobile geomagnetic anomalies have also turned out to be helpful in predicting earthquake locations in recent years; for instance, the occurrence time and area forecast from the 1-year-scale geomagnetic anomalies before the M6.5 Ludian earthquake in 2014 were shorter and smaller than those derived from the seismicity enhancement region. Based on this past work, the author believes that the medium-short-term earthquake forecast level, as well as objective understanding of the seismogenic mechanisms, could be substantially improved by densely deploying observation arrays and capturing the dynamic process of physical property changes in the enhancement region of medium to small earthquakes.

  3. Investigating landslides caused by earthquakes - A historical review

    USGS Publications Warehouse

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  4. Regional and Local Glacial-Earthquake Patterns in Greenland

    NASA Astrophysics Data System (ADS)

    Olsen, K.; Nettles, M.

    2016-12-01

    Icebergs calved from marine-terminating glaciers currently account for up to half of the 400 Gt of ice lost annually from the Greenland ice sheet (Enderlin et al., 2014). When large capsizing icebergs (~1 Gt of ice) calve, they produce elastic waves that propagate through the solid earth and are observed as teleseismically detectable MSW ~ 5 glacial earthquakes (e.g., Ekström et al., 2003; Nettles & Ekström, 2010; Tsai & Ekström, 2007; Veitch & Nettles, 2012). The annual number of these events has increased dramatically over the past two decades. We analyze glacial earthquakes from 2011-2013, which expands the glacial-earthquake catalog by 50%. The number of glacial-earthquake solutions now available allows us to investigate regional patterns across Greenland and link earthquake characteristics to changes in ice dynamics at individual glaciers. During the years of our study Greenland's west coast dominated glacial-earthquake production. Kong Oscar Glacier, Upernavik Isstrøm, and Jakobshavn Isbræ all produced more glacial earthquakes during this time than in preceding years. We link patterns in glacial-earthquake production and cessation to the presence or absence of floating ice tongues at glaciers on both coasts of Greenland. The calving model predicts glacial-earthquake force azimuths oriented perpendicular to the calving front, and comparisons between seismic data and satellite imagery confirm this in most instances. At two glaciers we document force azimuths that have recently changed orientation and confirm that similar changes have occurred in the calving-front geometry. We also document glacial earthquakes at one previously quiescent glacier. Consistent with previous work, we model the glacial-earthquake force-time function as a boxcar with horizontal and vertical force components that vary synchronously. We investigate limitations of this approach and explore improvements that could lead to a more accurate representation of the glacial earthquake source.

  5. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  6. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    NASA Astrophysics Data System (ADS)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    Recently our understanding of tectonic faulting has been shaken by the discoveries of seismic tremor, low-frequency earthquakes, slow slip events, and other modes of fault slip. These phenomena represent modes of failure that were thought to be non-existent and theoretically impossible only a few years ago. Slow earthquakes are seismic phenomena in which the rupture of geological faults in the earth's crust occurs gradually without creating strong tremors. Despite the growing number of observations of slow earthquakes, their origin remains unresolved. Studies show that the duration of slow earthquakes ranges from a few seconds to a few hundred seconds. Whereas the regular earthquakes with which most people are familiar release a burst of built-up stress in seconds, slow earthquakes release energy in ways that do little damage. This study focuses on the characteristics of the Mw5.6 earthquake that occurred in the Sofia seismic zone on May 22nd, 2012. The Sofia area is the most populated, industrial and cultural region of Bulgaria and faces considerable earthquake risk. The Sofia seismic zone is located in south-western Bulgaria, an area with pronounced tectonic activity and proven crustal movement. In the 19th century the city of Sofia (situated in the centre of the Sofia seismic zone) experienced two strong earthquakes with epicentral intensity of 10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK64). The 2012 quake occurred in an area characterized by a long quiescence (of 95 years) for moderate events. Moreover, a reduced number of small earthquakes have also been registered in the recent past. The Mw5.6 earthquake was largely felt on the territory of Bulgaria and neighbouring countries. No casualties or severe injuries have been reported. Mostly moderate damage was observed in the cities of Pernik and Sofia and their surroundings. These observations could be assumed indicative for a

  7. On the of neural modeling of some dynamic parameters of earthquakes and fire safety in high-rise construction

    NASA Astrophysics Data System (ADS)

    Haritonova, Larisa

    2018-03-01

    The recent change in the correlation between the numbers of man-made and natural catastrophes is presented in the paper. Some recommendations are proposed to increase firefighting efficiency in high-rise buildings. The article analyzes the methodology of modeling seismic effects. The promise of applying neural modeling and artificial neural networks to analyze such dynamic parameters of earthquake foci as the value of dislocation (or the average rupture slip) is shown. The following two input signals were used: the power class and the number of earthquakes. A regression analysis has been carried out for the predicted results and the target outputs. The equations of the regression for the outputs and targets are presented in the work, as well as the correlation coefficients for training, validation, testing, and the total (All) set for the 2-5-5-1 network structure for the average rupture slip. Applying the results obtained in the article to the seismic design of newly constructed buildings and structures, together with the given recommendations, will provide additional protection from fire and earthquake risks and reduce their negative economic and environmental consequences.
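
    A minimal sketch of the 2-5-5-1 architecture described above (two inputs, two hidden layers of five neurons, one output), built with scikit-learn; the synthetic training data and hyperparameters are placeholders, not the study's dataset.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        # Placeholder inputs: power class and number of earthquakes (not the paper's data).
        X = np.column_stack([rng.uniform(8, 15, 200),       # power class
                             rng.integers(1, 100, 200)])    # number of earthquakes
        # Placeholder target: average rupture slip, from an arbitrary synthetic relation.
        y = 0.05 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(0.0, 0.02, 200)

        # Two inputs -> 5 -> 5 -> 1 output, mirroring the "2-5-5-1" structure.
        model = MLPRegressor(hidden_layer_sizes=(5, 5), max_iter=5000, random_state=0)
        model.fit(X, y)

        # Correlation coefficient between predictions and targets, analogous to the
        # regression of outputs against targets reported in the paper.
        r = np.corrcoef(model.predict(X), y)[0, 1]
        print(f"R = {r:.3f}")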

  8. Earthquakes

    MedlinePlus

    An earthquake is the sudden, rapid shaking of the earth, caused by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

  9. Investigating Landslides Caused by Earthquakes A Historical Review

    NASA Astrophysics Data System (ADS)

    Keefer, David K.

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing ``retrospective'' analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  10. PAGER - Rapid Assessment of an Earthquake's Impact

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.

    2007-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system to rapidly assess the number of people and regions exposed to severe shaking by an earthquake and to inform emergency responders, government agencies, and the media of the scope of the potential disaster. PAGER monitors the U.S. Geological Survey's near real-time U.S. and global earthquake detections and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts.

  11. Earthquakes & Volcanoes, Volume 21, Number 1, 1989: Featuring the U.S. Geological Survey's National Earthquake Information Center in Golden, Colorado, USA

    USGS Publications Warehouse

    ,; Spall, Henry; Schnabel, Diane C.

    1989-01-01

    Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers. The Secretary of the Interior has determined that the publication of this periodical is necessary in the transaction of the public business required by law of this Department. Use of funds for printing this periodical has been approved by the Office of Management and Budget through June 30, 1989. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  12. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, for use within Google Maps, for earthquake research. We demonstrate this server-based online platform (developed with PHP, Javascript, MySQL) and the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data within Google Maps and to plot various seismicity graphs. The tool box has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shift between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the b-value, etc. What is novel in the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we here show for the first time the link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
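
    One of the statistics mentioned above, the Gutenberg-Richter b-value, is commonly estimated by maximum likelihood (Aki, 1965). The sketch below is a generic illustration of that calculation, not code from the platform; the sample magnitudes and completeness threshold are made up.

        import numpy as np

        def b_value_mle(magnitudes, completeness_mag, bin_width=0.1):
            """Maximum-likelihood b-value (Aki, 1965) with a half-bin correction."""
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= completeness_mag]
            mean_excess = m.mean() - (completeness_mag - bin_width / 2.0)
            return np.log10(np.e) / mean_excess

        # Made-up catalogue magnitudes and completeness threshold, for illustration only.
        mags = [2.1, 2.4, 2.2, 3.0, 2.7, 2.5, 2.3, 3.4, 2.8, 2.6, 4.1, 2.2]
        print(f"b = {b_value_mle(mags, completeness_mag=2.0):.2f}")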

  13. The HayWired Earthquake Scenario

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    interconnectedness of infrastructure, society, and our economy. How would this earthquake scenario, striking close to Silicon Valley, impact our interconnected world in ways and at a scale we have not experienced in any previous domestic earthquake? The area of present-day Contra Costa, Alameda, and Santa Clara Counties contended with a magnitude-6.8 earthquake in 1868 on the Hayward Fault. Although sparsely populated then, about 30 people were killed and extensive property damage resulted. The question of what an earthquake like that would do today has been examined before and is now revisited in the HayWired scenario. Scientists have documented a series of prehistoric earthquakes on the Hayward Fault and are confident that the threat of a future earthquake, like that modeled in the HayWired scenario, is real and could happen at any time. The team assembled to build this scenario has brought innovative new approaches to examining the natural hazards, impacts, and consequences of such an event. Such an earthquake would also be accompanied by widespread liquefaction and landslides, which are treated in greater detail than ever before. The team also considers how the now-prototype ShakeAlert earthquake early warning system could provide useful public alerts and automatic actions. Scientific Investigations Report 2017–5013 and accompanying data releases are the products of an effort led by the USGS, but this body of work was created through the combined efforts of a large team including partners who have come together to form the HayWired Coalition (see chapter A). Use of the HayWired scenario has already begun. More than a full year of intensive partner engagement, beginning in April 2017, is being directed toward producing the most in-depth look ever at the impacts and consequences of a large earthquake on the Hayward Fault. With the HayWired scenario, our hope is to encourage and support the active ongoing engagement of the entire community of the San Francisco Bay region by

  14. Earthquake Loss Scenarios: Warnings about the Extent of Disasters

    NASA Astrophysics Data System (ADS)

    Wyss, M.; Tolis, S.; Rosset, P.

    2016-12-01

    It is imperative that losses expected due to future earthquakes be estimated. Officials and the public need to be aware of what disaster is likely in store for them in order to reduce the fatalities and efficiently help the injured. Scenarios for earthquake parameters can be constructed to a reasonable accuracy in highly active earthquake belts, based on knowledge of seismotectonics and history. Because of the inherent uncertainties of loss estimates, however, it would be desirable that more than one group calculate an estimate for the same area. By discussing these estimates, one may find a consensus on the range of the potential disasters and persuade officials and residents of the reality of the earthquake threat. To model a scenario and estimate earthquake losses requires data sets that are sufficiently accurate concerning the number of people present, the built environment, and, if possible, the transmission of seismic waves. As examples we use loss estimates for possible repeats of historic earthquakes in Greece that occurred between 464 BC and AD 700. We model future large Greek earthquakes as having M6.8 and rupture lengths of 60 km. In four locations where historic earthquakes with serious losses have occurred, we estimate that 1,000 to 1,500 people might perish, with about four times as many injured. Defining the area of influence of these earthquakes as that with shaking intensities greater than or equal to V, we estimate that 1.0 to 2.2 million people in about 2,000 settlements may be affected. We calibrate the QLARM tool for calculating intensities and losses in Greece using the M6 1999 Athens earthquake and matching the isoseismal information for six earthquakes that occurred in Greece during the last 140 years. Comparing the fatality numbers that would theoretically occur today with the numbers reported, and correcting for the increase in population, we estimate that the improvement of the building stock has reduced the mortality and injury rate in Greek

  15. Using remote sensing to predict earthquake impacts

    NASA Astrophysics Data System (ADS)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel 1A and optical imagery from Landsat 8 to examine differences in topography with the aid of a multi-temporal monitoring technique. This technique is well suited to observing surface deformation. The database is a cluster of information regarding the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformation, amongst others, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of earthquakes of this type. In the future, we can enrich this database with more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves in particular geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  16. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  17. An evaluation of Health of the Nation Outcome Scales data to inform psychiatric morbidity following the Canterbury earthquakes.

    PubMed

    Beaglehole, Ben; Frampton, Chris M; Boden, Joseph M; Mulder, Roger T; Bell, Caroline J

    2017-11-01

    Following the onset of the Canterbury, New Zealand earthquakes, there were widespread concerns that mental health services were under severe strain as a result of adverse consequences on mental health. We therefore examined Health of the Nation Outcome Scales data to see whether this could inform our understanding of the impact of the Canterbury earthquakes on patients attending local specialist mental health services. Health of the Nation Outcome Scales admission data were analysed for Canterbury mental health services prior to and following the Canterbury earthquakes. These findings were compared to Health of the Nation Outcome Scales admission data from seven other large District Health Boards to delineate local from national trends. Percentage changes in admission numbers were also calculated before and after the earthquakes for Canterbury and the seven other large district health boards. Admission Health of the Nation Outcome Scales scores in Canterbury increased after the earthquakes for adult inpatient and community services, old age inpatient and community services, and Child and Adolescent inpatient services compared to the seven other large district health boards. Admission Health of the Nation Outcome Scales scores for Child and Adolescent community services did not change significantly, while admission Health of the Nation Outcome Scales scores for Alcohol and Drug services in Canterbury fell compared to other large district health boards. Subscale analysis showed that the majority of Health of the Nation Outcome Scales subscales contributed to the overall increases found. Percentage changes in admission numbers for the Canterbury District Health Board and the seven other large district health boards before and after the earthquakes were largely comparable with the exception of admissions to inpatient services for the group aged 4-17 years which showed a large increase. The Canterbury earthquakes were followed by an increase in Health of the Nation

  18. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    PubMed

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-05-10

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.

  19. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

    PubMed Central

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  20. Earthquakes in El Salvador: a descriptive study of health concerns in a rural community and the clinical implications, part I.

    PubMed

    Woersching, Joanna C; Snyder, Audrey E

    2003-01-01

    This is the first article in a series that evaluates the health concerns of people living in a Salvadoran rural community after major earthquakes. Part I reviews the background, methods, and results of post-earthquake conditions with regards to healthcare, access to healthcare, housing, food, water and sanitation. Part II reviews the implications of these results and recommendations for improvements within the community. Part III investigates the psychosocial and mental health consequences of the earthquakes and provides suggestions for improved mental health awareness, assessment, and intervention. El Salvador experienced 2 major earthquakes in January and February 2001. This study evaluates the effects of the earthquakes on the health practices in the rural town of San Sebastian. The research was conducted with use of a convenience sample survey of subjects affected by the earthquakes. The sample included 594 people within 100 households. The 32-question survey assessed post-earthquake conditions in the areas of health care and access to care, housing, food and water, and sanitation. Communicable diseases affected a number of family members. After the earthquakes, 38% of households reported new injuries, and 79% reported acute exacerbations of chronic illness. Rural inhabitants were 30% more likely to have an uninhabitable home than were urban inhabitants. Concerns included safe housing, water purification, and waste elimination. The findings indicate a need for greater public health awareness and community action to adapt living conditions after a disaster and prevent the spread of communicable disease.

  1. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  2. The Rotational and Gravitational Effect of Earthquakes

    NASA Technical Reports Server (NTRS)

    Gross, Richard

    2000-01-01

    The static displacement field generated by an earthquake has the effect of rearranging the Earth's mass distribution and will consequently cause the Earth's rotation and gravitational field to change. Although the coseismic effect of earthquakes on the Earth's rotation and gravitational field have been modeled in the past, no unambiguous observations of this effect have yet been made. However, the Gravity Recovery And Climate Experiment (GRACE) satellite, which is scheduled to be launched in 2001, will measure time variations of the Earth's gravitational field to high degree and order with unprecedented accuracy. In this presentation, the modeled coseismic effect of earthquakes upon the Earth's gravitational field to degree and order 100 will be computed and compared to the expected accuracy of the GRACE measurements. In addition, the modeled second degree changes, corresponding to changes in the Earth's rotation, will be compared to length-of-day and polar motion excitation observations.

  3. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

    PubMed

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

    A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a possible decrease in the expected number of casualties.
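
    A minimal sketch of the general idea of folding population-level effect measures into a baseline casualty probability on the logit scale; the baseline rate and odds ratios below are hypothetical placeholders, not the values estimated in the study.

        import math

        def adjusted_probability(baseline_p, odds_ratios):
            """Apply a set of odds ratios to a baseline probability on the logit scale."""
            log_odds = math.log(baseline_p / (1.0 - baseline_p))
            log_odds += sum(math.log(o) for o in odds_ratios)
            return 1.0 / (1.0 + math.exp(-log_odds))

        # Hypothetical baseline probability of severe injury for a given damage state, and
        # hypothetical effect measures for age, gender, disability and socioeconomic status.
        baseline = 0.002
        odds_ratios = [1.8, 1.1, 2.5, 1.4]
        print(f"adjusted probability = {adjusted_probability(baseline, odds_ratios):.4f}")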

  4. Spatial Analysis of Earthquake Fatalities in the Middle East, 1970-2008: First Results

    NASA Astrophysics Data System (ADS)

    Khaleghy Rad, M.; Evans, S. G.; Brenning, A.

    2010-12-01

    Earthquakes claim the lives of thousands of people each year, and the annual number of earthquake fatalities in the Middle East (21 countries) is 20% of the total yearly fatalities worldwide. There have been several attempts to estimate the number of fatalities in a given earthquake. We review the results of previous attempts and present an estimation of fatalities using a new conceptual model for life loss that includes hazard (earthquake magnitude and focal depth), vulnerability (GDP of countries and elapsed time since 1970 as proxy variables) and the population exposed in the affected area of a given earthquake. PAGER_CAT is a global catalog (http://earthquake.usgs.gov/research/data/pager/) that presents information on casualties of earthquakes since 1900. Although the catalog itself is almost a complete record of fatal earthquakes, the data on the number of deaths are not complete. We use PAGER_CAT to assemble a Middle East catalog (latitude 10°-42° N, longitude 24°-64° E) for the period 1970-2008 that includes 202 events with published numbers of fatalities, including events with zero casualties. We investigated the effect of the components of each event, e.g. exposed population, instrumental earthquake magnitude, focal depth, date (year of event) and GDP, on earthquake fatalities in the Middle East for the 202 events with detailed fatality estimates. To estimate the number of people exposed to each event, we used a fatality threshold for peak ground acceleration of 0.1g to calculate the radius of the affected area. The exposed population of each event is the population enclosed by each circle, calculated from gridded population data available on SEDAC (http://sedac.ciesin.columbia.edu/gpw/global.jsp) using ArcGIS. Results of our statistical model, using Poisson regression in the R statistical software, show that the number of fatalities due to earthquakes is in direct (positive) relation to the exposed population and the magnitude of the
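
    A minimal sketch of the kind of Poisson regression described above (the study used R; this generic illustration uses Python with statsmodels), with a small synthetic table standing in for the 202-event catalogue.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 50
        # Synthetic stand-in covariates (not the PAGER_CAT data).
        df = pd.DataFrame({
            "log_exposed_pop": rng.uniform(3.0, 7.0, n),  # log10 of exposed population
            "magnitude": rng.uniform(5.0, 7.5, n),
            "focal_depth": rng.uniform(5.0, 60.0, n),
            "log_gdp": rng.uniform(2.5, 4.5, n),
        })
        # Synthetic fatality counts drawn from an assumed Poisson model.
        mu = np.exp(-8.0 + 1.0 * df.log_exposed_pop + 0.8 * df.magnitude
                    - 0.01 * df.focal_depth - 0.5 * df.log_gdp)
        df["fatalities"] = rng.poisson(mu)

        X = sm.add_constant(df[["log_exposed_pop", "magnitude", "focal_depth", "log_gdp"]])
        fit = sm.GLM(df["fatalities"], X, family=sm.families.Poisson()).fit()
        print(fit.summary())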

  5. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.

  6. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    contribution of building stock, its relative vulnerability, and its distribution are vital components for determining the extent of casualties during an earthquake. It is evident from large deadly historical earthquakes that the distribution of vulnerable structures and their occupancy level during an earthquake control the severity of human losses. For example, though the number of strong earthquakes in California is comparable to that of Iran, the total earthquake-related casualties in California during the last 100 years are dramatically lower than the casualties from several individual Iranian earthquakes. The relatively low casualty count in California is attributed mainly to the fact that more than 90 percent of the building stock in California is made of wood and is designed to withstand moderate to large earthquakes (Kircher, Seligson and others, 2006). In contrast, the 80 percent adobe and/or non-engineered masonry building stock with poor lateral-load-resisting systems in Iran succumbs even at moderate levels of ground shaking. Consequently, the heavy death toll of the 2003 Bam, Iran earthquake, which claimed 31,828 lives (Ghafory-Ashtiany and Mousavi, 2005), is directly attributable to such poorly resistant construction, and future events will produce comparable losses unless practices change. Similarly, multistory, precast-concrete framed buildings caused heavy casualties in the 1988 Spitak, Armenia earthquake (Bertero, 1989); weaker masonry and reinforced-concrete framed construction designed for gravity loads with soft first stories dominated losses in the Bhuj, India earthquake of 2001 (Madabhushi and Haigh, 2005); and adobe and weak masonry dwellings in Peru controlled the death toll in the Peru earthquake of 2007 (Taucer, J. and others, 2007). Spence (2007), after conducting a brief survey of the most lethal earthquakes since 1960, found that building collapse remains a major cause of earthquake mortality and that unreinforced masonry buildings are one of the mos

  7. The Parkfield earthquake prediction of October 1992; the emergency services response

    USGS Publications Warehouse

    Andrews, R.

    1992-01-01

    The science of earthquake prediction is interesting and worthy of support. In many respects the ultimate payoff of earthquake prediction or earthquake forecasting is how the information can be used to enhance public safety and public preparedness. This is a particularly important issue here in California where we have such a high level of seismic risk historically, and currently, as a consequence of activity in 1989 in the San Francisco Bay Area, in Humboldt County in April of this year (1992), and in southern California in the Landers-Big Bear area in late June of this year (1992). We are currently very concerned about the possibility of a major earthquake, one or more, happening close to one of our metropolitan areas. Within that context, the Parkfield experiment becomes very important. 

  8. Review of Injuries from Terrorist Bombings and Earthquakes

    DTIC Science & Technology

    2016-08-31

    Review of Injuries from Terrorist Bombings and Earthquakes, DTRA-TR-16-064, Technical Report, August 2016 (contract HDTRA1-14-D-0003); distribution is unlimited. Terrorist bombings and earthquakes provide valuable insight on the types of injuries that may occur in an improvised nuclear device (IND) scenario

  9. Living on an Active Earth: Perspectives on Earthquake Science

    NASA Astrophysics Data System (ADS)

    Lay, Thorne

    2004-02-01

    The annualized long-term loss due to earthquakes in the United States is now estimated at $4.4 billion per year. A repeat of the 1923 Kanto earthquake, near Tokyo, could cause direct losses of $2-3 trillion. With such grim numbers, which are guaranteed to make you take its work seriously, the NRC Committee on the Science of Earthquakes begins its overview of the emerging multidisciplinary field of earthquake science. An up-to-date and forward-looking survey of scientific investigation of earthquake phenomena and engineering response to associated hazards is presented at a suitable level for a general educated audience. Perspectives from the fields of seismology, geodesy, neo-tectonics, paleo-seismology, rock mechanics, earthquake engineering, and computer modeling of complex dynamic systems are integrated into a balanced definition of earthquake science that has never before been adequately articulated.

  10. Urban Earthquakes - Reducing Building Collapse Through Education

    NASA Astrophysics Data System (ADS)

    Bilham, R.

    2004-12-01

    Fatalities from earthquakes rose from about 6000 to about 9000 per year in the past decade, yet the ratio of the number of earthquake fatalities to the instantaneous population continues to fall. Since 1950 the ratio has declined worldwide by a factor of three, but in some countries it has changed little. For example, in Iran, 1 in 3000 people can expect to die in an earthquake, a proportion that has not changed significantly since 1890. Fatalities from earthquakes remain high in those countries that have traditionally suffered from frequent large earthquakes (Turkey, Iran, Japan, and China), suggesting that the exposure time of recently increased urban populations in other countries may be too short to have interacted with earthquakes with long recurrence intervals. This in turn suggests that disasters of unprecedented size (more than 1 million fatalities) will occur when future large earthquakes strike close to megacities. However, population growth is most rapid in cities of fewer than 1 million people in the developing nations, where the financial ability to implement earthquake-resistant construction methods is limited. Given that structural collapse can often be traced to ignorance about the forces at work in an earthquake, the future collapse of buildings presently under construction could be much reduced were contractors, builders and occupants educated in the principles of earthquake-resistant assembly. Education of builders who are tempted to cut assembly costs is likely to be more cost effective than material aid.

  11. Moment Magnitudes and Local Magnitudes for Small Earthquakes: Implications for Ground-Motion Prediction and b-values

    NASA Astrophysics Data System (ADS)

    Baltay, A.; Hanks, T. C.; Vernon, F.

    2016-12-01

    We illustrate two essential consequences of the systematic difference between moment magnitude and local magnitude for small earthquakes, illuminating the underlying earthquake physics. Moment magnitude, M ∝ 2/3 log M0, is uniformly valid for all earthquake sizes [Hanks and Kanamori, 1979]. However, the relationship between local magnitude ML and moment is itself magnitude dependent. For moderate events, 3 < M < 7, M and ML are coincident; for earthquakes smaller than M3, ML ∝ log M0 [Hanks and Boore, 1984]. This is a consequence of the saturation of the apparent corner frequency fc as it becomes greater than the largest observable frequency, fmax; in this regime, stress drop no longer controls ground motion. This implies that ML and M differ by a factor of 1.5 for these small events. While this idea is not new, its implications are important as more small-magnitude data are incorporated into earthquake hazard research. With a large dataset of M < 3 earthquakes recorded on the ANZA network, we demonstrate striking consequences of the difference between M and ML. ML scales as the log of peak ground motion (e.g., PGA or PGV) for these small earthquakes, which yields log PGA ∝ log M0 [Boore, 1986]. We plot nearly 15,000 records of PGA and PGV at close stations, adjusted for site conditions and for geometrical spreading to 10 km. The slope of the log of ground motion is 1.0*ML, or 1.5*M, confirming the relationship and that fc >> fmax. Just as importantly, if this relation is overlooked, prediction of large-magnitude ground motion from small earthquakes will be misguided. We also consider the effect of this magnitude scale difference on b-value. The oft-cited b-value of 1 should hold for small magnitudes, given M. Use of ML necessitates b = 2/3 for the same data set; use of mixed, or unknown, magnitudes complicates the matter further. This is of particular import when estimating the rate of large earthquakes when one has limited data on their recurrence, as is the case for
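
    As a rough numerical companion to the scaling argument above, the sketch below converts seismic moment to moment magnitude with the Hanks and Kanamori (1979) relation and to a hypothetical small-event local magnitude that scales one-to-one with log M0; the crossover at M 3 and the anchoring constant are illustrative assumptions, not values fitted in the study.

```python
import numpy as np

# Hanks & Kanamori (1979): moment magnitude from seismic moment M0 (N·m).
def moment_magnitude(M0):
    return (2.0 / 3.0) * np.log10(M0) - 6.07

# Assumed small-event behaviour: ML scales one-to-one with log10(M0),
# anchored (arbitrarily) so that ML = M at the M ~ 3 crossover noted above.
def local_magnitude_small(M0, crossover=3.0):
    M0_cross = 10 ** (1.5 * (crossover + 6.07))
    return crossover + np.log10(M0 / M0_cross)

M0 = np.logspace(11, 14, 7)            # small-earthquake moments, N·m
M, ML = moment_magnitude(M0), local_magnitude_small(M0)
for m0, m, ml in zip(M0, M, ML):
    print(f"M0 = {m0:.1e} N·m   M = {m:.2f}   ML = {ml:.2f}")

# Slopes w.r.t. log10(M0): 2/3 for M, 1 for ML, so ML grows 1.5x faster for
# these small events, and a b-value of 1 in M maps to b = 2/3 in ML.
slope_ratio = np.polyfit(np.log10(M0), ML, 1)[0] / np.polyfit(np.log10(M0), M, 1)[0]
print("slope ratio (ML vs M):", round(slope_ratio, 2))
```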

  12. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
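
    For instructors who want to reproduce the interevent-time comparison, the sketch below builds a histogram of synthetic interevent times and compares it with the exponential density implied by a Poisson process; the synthetic data and bin choices are placeholders rather than the catalogue or blockquake data used in the course.

```python
import numpy as np

# Synthetic stand-in for the class data: interevent times (in days) drawn from
# an exponential distribution, as a Poisson process would produce.
rng = np.random.default_rng(0)
interevent = rng.exponential(scale=30.0, size=500)

# For a Poisson process, the interevent-time density is rate * exp(-rate * t)
# with rate = 1 / (mean interevent time).
rate = 1.0 / interevent.mean()

counts, edges = np.histogram(interevent, bins=20, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
for c, observed in zip(centers, counts):
    predicted = rate * np.exp(-rate * c)
    print(f"t = {c:6.1f} d   observed density = {observed:.4f}   exponential = {predicted:.4f}")
```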

  13. Oklahoma’s recent earthquakes and saltwater disposal

    PubMed Central

    Walsh, F. Rall; Zoback, Mark D.

    2015-01-01

    Over the past 5 years, parts of Oklahoma have experienced marked increases in the number of small- to moderate-sized earthquakes. In three study areas that encompass the vast majority of the recent seismicity, we show that the increases in seismicity follow 5- to 10-fold increases in the rates of saltwater disposal. Adjacent areas where there has been relatively little saltwater disposal have had comparatively few recent earthquakes. In the areas of seismic activity, the saltwater disposal principally comes from “produced” water, saline pore water that is coproduced with oil and then injected into deeper sedimentary formations. These formations appear to be in hydraulic communication with potentially active faults in crystalline basement, where nearly all the earthquakes are occurring. Although most of the recent earthquakes have posed little danger to the public, the possibility of triggering damaging earthquakes on potentially active basement faults cannot be discounted. PMID:26601200

  14. Perceptions of distress and positive consequences following exposure to a major disaster amongst a well-studied cohort.

    PubMed

    Fergusson, David M; Boden, Joseph M; Horwood, L John; Mulder, Roger T

    2015-04-01

    Research on the impact of natural disasters on health and well-being faces several methodological challenges, including sampling issues, exposure assessment, and outcome measurement. The present study used a comprehensive measure of disaster exposure to assess relationships between exposure to the Canterbury (New Zealand) Earthquakes of 2010-2011 and both (a) self-reported distress and (b) positive outcomes, and also investigated gender differences in these reports. Data were gathered from the Christchurch Health and Development Study, a 35-year longitudinal study. The study examined data from 495 individuals exposed to the Canterbury Earthquakes for whom complete data on exposure and reactions to the earthquakes at age 35 were available. Participants with higher levels of exposure to the earthquakes reported significantly (p<0.0001) higher levels of distress due to fear, death and injury, and disruption caused by the earthquakes. Higher levels of exposure to the earthquakes were also associated with significantly (p<0.0001) higher levels of reporting positive consequences following the earthquakes. Women reported significantly (p<0.0001) greater distress than men and significantly (p<0.001) greater positive consequences. Higher levels of exposure to disaster were associated with higher levels of distress, but also with higher levels of self-reported positive outcomes, with females reporting higher levels of both positive and negative outcomes. The findings highlight the need for comprehensive assessment of disaster exposure, the need to consider gender and other group differences in reactions to disaster exposure, and the need for studies of disasters to examine both positive and negative consequences. © The Royal Australian and New Zealand College of Psychiatrists 2014.

  15. 2016 update on induced earthquakes in the United States

    USGS Publications Warehouse

    Petersen, Mark D.

    2016-01-01

    During the past decade, people living in numerous locations across the central U.S. experienced many more small to moderate sized earthquakes than ever before. This earthquake activity began increasing about 2009 and peaked during 2015 and into early 2016. For example, prior to 2009 Oklahoma typically experienced 1 or 2 small earthquakes per year with magnitude greater than 3.0, but by 2015 this number had risen to over 900 earthquakes per year of that size, along with over 30 earthquakes greater than 4.0. These earthquakes can cause damage. In 2011 a magnitude 5.6 earthquake struck near the town of Prague, Oklahoma on a preexisting fault and caused severe damage to several houses and school buildings. During the past 6 years more than 1500 reports of damaging shaking levels were received in areas of induced seismicity. This rapid increase and the potential for damaging ground shaking from induced earthquakes alarmed the roughly 8 million people living nearby, as well as officials responsible for public safety. They wanted to understand why earthquakes were increasing and the potential threats to society and buildings located nearby.

  16. How dynamic number of evacuee affects the multi-objective optimum allocation for earthquake emergency shelters: A case study in the central area of Beijing, China

    NASA Astrophysics Data System (ADS)

    Ma, Y.; Xu, W.; Zhao, X.; Qin, L.

    2016-12-01

    Accurate location and allocation of earthquake emergency shelters is a key component of effective urban planning and emergency management. A number of models have been developed to solve this complex location-allocation problem with diverse and strict constraints, but there still remains a big gap between the models and the actual situation because the uncertainty of earthquake occurrence, the damage rate of buildings, and evacuee behaviour have been neglected or excessively simplified in the existing models. An innovative model was first developed to estimate the hourly dynamic changes in the number of evacuees under two earthquake damage scenarios by considering these factors at the community level based on location-based service data, followed by a multi-objective model for the allocation of residents to earthquake shelters, using the central area of Beijing, China as a case study. The two objectives of this shelter allocation model were to minimize the total evacuation distance from communities to a specified shelter and to minimize the total area of all the shelters, with the constraints of shelter capacity and service radius. The modified particle swarm optimization algorithm was used to solve this model. The results show that increasing the shelter area results in a large decrease of the total evacuation distance in all of the schemes of the four scenarios (i.e., Scenarios A and B, in daytime and nighttime respectively). According to the schemes of minimum distance, parts of the communities in the downtown area needed to be reallocated due to the insufficient capacity of the nearest shelters, and the number of these communities sequentially decreased in scenarios Ad, An, Bd and Bn due to the decreasing population. According to the schemes of minimum area in each scenario, 27 or 28 shelters, covering a total area of approximately 37 km², were selected, and the communities evacuated along almost the same routes in the different scenarios. The results can be used as a
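
    A minimal sketch of the two objectives described above: it evaluates total weighted evacuation distance and total opened shelter area for a candidate community-to-shelter assignment under capacity and service-radius constraints. All numbers (distances, evacuee counts, shelter areas, the 2 m² per person capacity rule) are hypothetical, and the paper's modified particle swarm optimizer is not reproduced.

```python
import numpy as np

# Hypothetical inputs: 3 communities, 3 candidate shelters.
dist = np.array([[1.2, 3.5, 2.0],          # distance (km), community i -> shelter j
                 [2.8, 0.9, 4.1],
                 [3.0, 2.2, 1.1]])
evacuees = np.array([800, 1200, 500])      # (dynamic) evacuee count per community
area = np.array([20000., 35000., 15000.])  # shelter area, m^2
capacity = area / 2.0                      # assumed 2 m^2 of shelter area per person
service_radius = 3.0                       # km

def objectives(assign):
    """assign[i] = shelter index serving community i.
    Returns (total weighted distance, total area of opened shelters),
    or None if a capacity or service-radius constraint is violated."""
    load = np.zeros(len(area))
    total_dist = 0.0
    for i, j in enumerate(assign):
        if dist[i, j] > service_radius:
            return None
        load[j] += evacuees[i]
        total_dist += evacuees[i] * dist[i, j]
    if np.any(load > capacity):
        return None
    return total_dist, area[np.unique(assign)].sum()

print(objectives([0, 1, 2]))   # each community sent to its nearest shelter
print(objectives([0, 1, 1]))   # alternative assignment that opens one fewer shelter
```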

  17. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  18. Glacial Earthquakes: Monitoring Greenland's Glaciers Using Broadband Seismic Data

    NASA Astrophysics Data System (ADS)

    Olsen, K.; Nettles, M.

    2017-12-01

    The Greenland ice sheet currently loses 400 Gt of ice per year, and up to half of that mass loss comes from icebergs calving from marine-terminating glaciers (Enderlin et al., 2014). Some of the largest icebergs produced by Greenland's glaciers generate magnitude 5 seismic signals when they calve. These glacial earthquakes are recorded by seismic stations around the world. Full-waveform inversion and analysis of glacial earthquakes provides a low-cost tool to identify where and when gigaton-sized icebergs calve, and to track this important mass-loss mechanism in near-real-time. Fifteen glaciers in Greenland are known to have produced glacial earthquakes, and the annual number of these events has increased by a factor of six over the past two decades (e.g., Ekström et al., 2006; Olsen and Nettles, 2017). Since 2000, the number of glacial earthquakes on Greenland's west coast has increased dramatically. Our analysis of three recent years of data shows that more glacial earthquakes occurred on Greenland's west coast from 2011 - 2013 than ever before. In some cases, glacial-earthquake force orientations allow us to identify which section of a glacier terminus produced the iceberg associated with a particular event. We are able to track the timing of major changes in calving-front orientation at several glaciers around Greenland, as well as progressive failure along a single calving front over the course of hours to days. Additionally, the presence of glacial earthquakes resolves a glacier's grounded state, as glacial earthquakes occur only when a glacier terminates close to its grounding line.

  19. Automatic Earthquake Detection by Active Learning

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
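
    The sketch below illustrates one common active-learning strategy, uncertainty sampling, on synthetic stand-in features; it is not the authors' detector, and in a real workflow the queried labels would come from a human analyst rather than from the simulated ground truth used here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for waveform-derived features; the rare positive class
# plays the role of "earthquake" windows, the rest is "noise".
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 1.5).astype(int)

# Small initial training set containing both classes (as a catalogue might provide).
labeled = list(np.where(y == 1)[0][:5]) + list(np.where(y == 0)[0][:15])
pool = [i for i in range(len(X)) if i not in set(labeled)]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
for _ in range(5):                             # a few query rounds
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])[:, 1]
    uncertainty = np.abs(proba - 0.5)          # near 0.5 -> model is least certain
    query = [pool[k] for k in np.argsort(uncertainty)[:10]]
    labeled += query                           # in practice, an analyst labels these
    pool = [i for i in pool if i not in set(query)]

print("labeled examples after querying:", len(labeled))
```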

  20. An empirical model for global earthquake fatality estimation

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David

    2010-01-01

    We analyzed mortality rates of earthquakes worldwide and developed a country/region-specific empirical model for earthquake fatality estimation within the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is defined as the total number killed divided by the total population exposed at a specific shaking intensity level. The total fatalities for a given earthquake are estimated by multiplying the number of people exposed at each shaking intensity level by the fatality rate for that level and then summing over all relevant shaking intensities. The fatality rate is expressed in terms of a two-parameter lognormal cumulative distribution function of shaking intensity. The parameters are obtained for each country or region by minimizing the residual error in hindcasting the total shaking-related deaths from earthquakes recorded between 1973 and 2007. A new global regionalization scheme is used to combine the fatality data across different countries with similar vulnerability traits.
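
    A minimal sketch of the fatality computation described above: the rate at each shaking intensity is a two-parameter lognormal CDF, and total deaths are the exposure-weighted sum over intensity levels. The parameter values and exposure counts below are hypothetical placeholders, not published PAGER coefficients.

```python
import numpy as np
from scipy.stats import norm

# Fatality rate at intensity S as a lognormal CDF with parameters (theta, beta).
def fatality_rate(intensity, theta, beta):
    return norm.cdf(np.log(intensity / theta) / beta)

theta, beta = 12.0, 0.25                            # hypothetical country parameters
intensities = np.array([5.0, 6.0, 7.0, 8.0, 9.0])   # shaking intensity levels (MMI)
exposure = np.array([2e6, 8e5, 3e5, 6e4, 5e3])      # people exposed at each level (assumed)

rates = fatality_rate(intensities, theta, beta)
estimated_fatalities = np.sum(exposure * rates)
print("rates:", rates.round(5))
print("estimated fatalities:", int(round(estimated_fatalities)))
```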

  1. Earthquake hazard and risk assessment based on Unified Scaling Law for Earthquakes: Greater Caucasus and Crimea

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

    2018-05-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship making use of naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for an empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use parameters A, B, and C of USLE to estimate, first, the expected maximum magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After a rigorous verification against the available seismic evidences in the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on census of population, buildings inventory). The methodology of seismic hazard and risk assessment is illustrated by application to the territory of Greater Caucasus and Crimea.
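
    The USLE relation quoted above can be evaluated directly; the sketch below does so with hypothetical coefficients A, B, and C (not the fitted values for the Greater Caucasus or Crimea) to show how the expected annual number of events falls with magnitude and grows with the linear dimension of the area.

```python
import numpy as np

# USLE: log10 N(M, L) = A + B*(5 - M) + C*log10(L), with L in km.
# Coefficients below are hypothetical placeholders.
def usle_annual_number(M, L, A=-1.0, B=0.9, C=1.2):
    return 10 ** (A + B * (5.0 - M) + C * np.log10(L))

for M in (5.0, 6.0, 7.0):
    print(f"M = {M:.1f}: expected {usle_annual_number(M, L=100.0):.3f} events/yr in a 100-km cell")
```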

  2. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes

    PubMed Central

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

    Background: A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and of the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. Methods: An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. Results: The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. Conclusion: The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a possible decrease in the expected number of casualties. PMID:26959647
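
    A minimal sketch of how odds ratios for human-related risk factors can be folded into a baseline casualty probability, in the spirit of the logistic-regression integration described above; the baseline rate and odds-ratio values are hypothetical placeholders, not the meta-analysis estimates used in the study.

```python
# Fold odds ratios for applicable human-related risk factors into a baseline
# casualty probability (the additive-in-log-odds assumption of logistic regression).
def adjusted_probability(p_baseline, odds_ratios):
    odds = p_baseline / (1.0 - p_baseline)
    for odds_ratio in odds_ratios:
        odds *= odds_ratio
    return odds / (1.0 + odds)

p_baseline = 0.02              # assumed structural (HAZUS-style) injury probability
factors = [1.8, 1.3]           # assumed odds ratios, e.g. older age, low socioeconomic status
print(round(adjusted_probability(p_baseline, factors), 4))   # ~0.0456
```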

  3. Earthquake Clustering in Noisy Viscoelastic Systems

    NASA Astrophysics Data System (ADS)

    Dicaprio, C. J.; Simons, M.; Williams, C. A.; Kenner, S. J.

    2006-12-01

    Geologic studies show evidence for temporal clustering of earthquakes on certain fault systems. Since post-seismic deformation may result in a variable loading rate on a fault throughout the inter-seismic period, it is reasonable to expect that the rheology of the non-seismogenic lower crust and mantle lithosphere may play a role in controlling earthquake recurrence times. Previously, the role of the rheology of the lithosphere in the seismic cycle had been studied with a one-dimensional spring-dashpot-slider model (Kenner and Simons [2005]). In this study we use the finite element code PyLith to construct a two-dimensional continuum model of a strike-slip fault in an elastic medium overlying one or more linear Maxwell viscoelastic layers, loaded in the far field by a constant velocity boundary condition. Taking advantage of the linear properties of the model, we use the finite element solution to one earthquake as a spatio-temporal Green's function. Multiple Green's function solutions, scaled by the size of each earthquake, are then summed to form an earthquake sequence. When the shear stress on the fault reaches a predefined yield stress it is allowed to slip, relieving all accumulated shear stress. Random variation in the fault yield stress from one earthquake to the next results in a temporally clustered earthquake sequence. The amount of clustering depends on a non-dimensional number, W, called the Wallace number. For models with one viscoelastic layer, W is equal to the standard deviation of the earthquake stress drop divided by the viscosity times the tectonic loading rate. This definition of W is modified from the original one used in Kenner and Simons [2005] by using the standard deviation of the stress drop instead of the mean stress drop. We also use a new, more appropriate, metric to measure the amount of temporal clustering of the system. W is the ratio of the viscoelastic relaxation rate of the system to the tectonic loading rate of the system. For values of
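
    The sketch below is a toy, zero-dimensional version of the mechanism described above: tectonic loading plus delayed, relaxed re-loading from past stress drops, with a randomly perturbed yield stress per cycle. The parameters, the use of a relaxation time in place of a viscosity, and the 50% re-loaded fraction are illustrative assumptions, not values from the finite element study.

```python
import numpy as np

rng = np.random.default_rng(2)
load_rate = 1.0                    # tectonic stressing rate (arbitrary units per yr)
tau = 50.0                         # relaxation time (yr), used here in place of a viscosity
mean_drop, std_drop = 100.0, 30.0  # mean and std of the stress drop / yield stress
W = std_drop / (tau * load_rate)   # non-dimensional number analogous to the abstract's W

dt, t, stress = 0.25, 0.0, 0.0
yield_stress = rng.normal(mean_drop, std_drop)
pending, times = [], []            # past drops still being viscoelastically re-loaded
while t < 5000.0:
    t += dt
    stress += load_rate * dt
    # assumed: half of each past drop is returned to the fault on time scale tau
    stress += sum(0.5 * d / tau * np.exp(-(t - t0) / tau) * dt for d, t0 in pending)
    if stress >= yield_stress:
        pending.append((stress, t))
        times.append(t)
        stress = 0.0
        yield_stress = rng.normal(mean_drop, std_drop)

inter = np.diff(times)
# A coefficient of variation well above 1 indicates temporal clustering.
print(f"W ~ {W:.2f}, {len(times)} events, CV of recurrence = {inter.std() / inter.mean():.2f}")
```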

  4. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    NASA Astrophysics Data System (ADS)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
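
    A minimal sketch of a tweet-rate spike detector in the spirit of the system described above: a short-term count of "earthquake" tweets is compared against a long-term baseline and a detection is flagged when the ratio exceeds a threshold. The window lengths and threshold are illustrative, not the USGS TED settings.

```python
from collections import deque

def detect_spikes(counts_per_minute, short=2, long=60, threshold=5.0):
    """Flag minutes where the short-term tweet rate exceeds `threshold` times
    the long-term baseline rate."""
    window = deque(maxlen=long + short)
    detections = []
    for minute, n in enumerate(counts_per_minute):
        window.append(n)
        if len(window) == long + short:
            recent = sum(list(window)[-short:]) / short
            baseline = max(sum(list(window)[:-short]) / long, 0.1)   # avoid divide-by-zero
            if recent / baseline >= threshold:
                detections.append((minute, recent, baseline))
    return detections

# Synthetic example: quiet background of ~2 "earthquake" tweets/min with a burst
# starting at minute 100, as a widely felt event might produce.
counts = [2] * 200
counts[100:103] = [40, 60, 25]
for minute, recent, baseline in detect_spikes(counts):
    print(f"detection at minute {minute}: {recent:.0f} tweets/min vs baseline {baseline:.1f}")
```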

  5. The Christchurch earthquake stroke incidence study.

    PubMed

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: Altai-Sayan Region

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2017-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes, USLE, which generalizes the Gutenberg-Richter relationship making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for an empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use parameters A, B, and C of USLE to estimate, first, the expected maximum credible magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After rigorous testing against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on a census of population or a buildings inventory). This USLE-based methodology of seismic hazard and risk assessment is applied to the territory of the Altai-Sayan Region of Russia. The study is supported by the Russian Science Foundation, Grant No. 15-17-30020.

  7. Mass wasting triggered by the 5 March 1987 Ecuador earthquakes

    USGS Publications Warehouse

    Schuster, R.L.; Nieto, A.S.; O'Rourke, T. D.; Crespo, E.; Plaza-Nieto, G.

    1996-01-01

    On 5 March 1987, two earthquakes (Ms=6.1 and Ms=6.9) occurred about 25 km north of Reventador Volcano, along the eastern slopes of the Andes Mountains in northeastern Ecuador. Although the shaking damaged structures in towns and villages near the epicentral area, the economic and social losses directly due to earthquake shaking were small compared to the effects of catastrophic earthquake-triggered mass wasting and flooding. About 600 mm of rain fell in the region in the month preceding the earthquakes; thus, the surficial soils had high moisture contents. Slope failures commonly started as thin slides, which rapidly turned into fluid debris avalanches and debris flows. The surficial soils and thick vegetation covering them flowed down the slopes into minor tributaries and then were carried into major rivers. Rock and earth slides, debris avalanches, debris and mud flows, and resulting floods destroyed about 40 km of the Trans-Ecuadorian oil pipeline and the only highway from Quito to Ecuador's northeastern rain forests and oil fields. Estimates of total volume of earthquake-induced mass wastage ranged from 75-110 million m3. Economic losses were about US$ 1 billion. Nearly all of the approximately 1000 deaths from the earthquakes were a consequence of mass wasting and/ or flooding.

  8. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10⁵ m³), produced major damage to buildings and infrastructures and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km², mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10⁵ m³) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km³ and 12 km³ produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with volumes of less than 1×10³ m³. The present work aims to define the relationship between the earthquake intensity described above and the size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provide useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  9. Time-decreasing hazard and increasing time until the next earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corral, Alvaro

    2005-01-01

    The existence of a slowly but persistently decreasing probability density for the recurrence times of earthquakes in the stationary case implies that the occurrence of an event at a given instant becomes less likely as the time since the previous event increases. Consequently, the expected waiting time to the next earthquake increases with the elapsed time; that is, the expected event recedes rapidly into the future. We have found direct empirical evidence of this counterintuitive behavior in two worldwide catalogs as well as in diverse regional catalogs. Universal scaling functions describe the phenomenon well.
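
    The effect can be checked numerically: for a broad distribution of recurrence times (here a gamma law with shape < 1, chosen only for illustration), the expected remaining wait grows with the time already elapsed, unlike the memoryless exponential case.

```python
import numpy as np

# Recurrence times drawn from a gamma distribution with shape < 1 (slowly
# decaying density); the mean residual waiting time then grows with elapsed time.
rng = np.random.default_rng(3)
samples = rng.gamma(shape=0.3, scale=1.0, size=2_000_000)
print("mean recurrence time:", round(samples.mean(), 3))

for elapsed in (0.0, 0.5, 1.0, 2.0, 5.0):
    remaining = samples[samples > elapsed] - elapsed
    print(f"elapsed = {elapsed:3.1f} -> expected remaining wait = {remaining.mean():.2f}")
```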

  10. Simulating Earthquake Early Warning Systems in the Classroom as a New Approach to Teaching Earthquakes

    NASA Astrophysics Data System (ADS)

    D'Alessio, M. A.

    2010-12-01

    A discussion of P- and S-waves seems a ubiquitous part of studying earthquakes in the classroom. Textbooks from middle school through university level typically define the differences between the waves and illustrate the sense of motion. While many students successfully memorize the differences between wave types (often utilizing the first letter as a memory aide), textbooks rarely give tangible examples of how the two waves would "feel" to a person sitting on the ground. One reason for introducing the wave types is to explain how to calculate earthquake epicenters using seismograms and travel time charts -- very abstract representations of earthquakes. Even when the skill is mastered using paper-and-pencil activities or one of the excellent online interactive versions, locating an epicenter simply does not excite many of our students because it evokes little emotional impact, even in students located in earthquake-prone areas. Despite these limitations, huge numbers of students are mandated to complete the task. At the K-12 level, California requires that all students be able to locate earthquake epicenters in Grade 6; in New York, the skill is a required part of the Regents Examination. Recent innovations in earthquake early warning systems around the globe give us the opportunity to address the same content standard, but with substantially more emotional impact on students. I outline a lesson about earthquakes focused on earthquake early warning systems. The introductory activities include video clips of actual earthquakes and emphasize the differences between the way P- and S-waves feel when they arrive (P arrives first, but is weaker). I include an introduction to the principle behind earthquake early warning (including a summary of possible uses of a few seconds of warning about strong shaking) and show examples from Japan. Students go outdoors to simulate P-waves, S-waves, and occupants of two different cities who are talking to one another on cell phones
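
    A back-of-the-envelope version of the early-warning principle mentioned above: warning time at a site is roughly the S-minus-P travel-time difference minus the time needed to detect and issue the alert. The wave speeds and processing delay below are generic rough values, not tied to any particular network.

```python
# Rough warning-time estimate: S minus P travel time at distance d, minus the
# delay needed to detect the event and broadcast the alert.
VP, VS = 6.5, 3.7           # approximate crustal P and S speeds, km/s
PROCESSING_DELAY = 5.0      # assumed seconds to detect, locate and issue the alert

def warning_time(distance_km):
    return distance_km / VS - distance_km / VP - PROCESSING_DELAY

for d in (20, 50, 100, 200):
    print(f"{d:3d} km from the epicenter: ~{warning_time(d):5.1f} s of warning")
# Negative values close to the epicenter correspond to the "blind zone" where
# strong shaking arrives before any alert can.
```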

  11. Earthquake precursors from InSAR geodesy: insights from the L'Aquila (Central Italy) April 6, 2009 earthquake

    NASA Astrophysics Data System (ADS)

    Bignami, C.; Moro, M.; Saroli, M.; Stramondo, S.; Albano, M.; Falcucci, E.; Gori, S.; Doglioni, C.; Polcari, M.; Tallini, M.; Macerola, L.; Novali, F.; Costantini, M.; Malvarosa, F.; Wegmüller, U.

    2017-12-01

    In modern seismology, the identification of earthquake precursors is one of the most important issues to investigate. Precursor indicators based on the most advanced satellite geodetic techniques, such as GPS and SAR interferometry, have not been conclusively identified so far. However, the latest progress in terms of new satellite missions and processing algorithms may bring this goal closer. Here we present evidence of ground deformation signals preceding the 2009 L'Aquila earthquake, observed using multi-temporal InSAR techniques. We exploited a wide dataset from the RADARSAT-2, ENVISAT and COSMO-SkyMed missions to derive mean velocity and ground acceleration maps of the epicentral area, spanning approximately 6 years before the earthquake and about one year after it. The maps of ground acceleration before the mainshock allowed the identification of two peculiar displacement patterns, well localized in two Quaternary basins close to the focal volume of the seismic event (Mw 6.3) that hit the city of L'Aquila on 6 April 2009. In these two regions, significant subsidence began approximately three years before the earthquake, reaching a value of about 1.5 cm, and persisted until the earthquake. Conversely, in the post-seismic phase, the two basins showed uplift, with velocities of approximately 5 to 18 mm/yr. Deep knowledge of the geological, hydrogeological and geotechnical setting of the area provides a plausible explanation of the observed phenomenon. The two Quaternary basins are filled with sediments that host multi-layer aquifers, hydrologically connected with the neighbouring carbonatic hydrostructures. Before the earthquake, the rocks at depth dilated and fractures opened. Consequently, fluids migrated into the dilated volume, lowering the groundwater table in the carbonate hydrostructures and in the hydrologically connected multi-layer aquifers within the

  12. Influence of very-long-distance earthquakes on the ionosphere?

    NASA Astrophysics Data System (ADS)

    Liperovskaya, E. V.; Meister, C.-V.; Biagi, P.-F.; Liperovsky, V. A.; Rodkin, M. V.

    2009-04-01

    In the present work, variations of the critical frequency foF2 obtained every hour by the ionospheric sounding station Tashkent (41.3°N, 69.6°E) in the years 1964-1996 are considered. Mean values of the data obtained in the daytime between 11 LT and 16 LT are investigated. Disturbances of foF2 related to earthquakes are studied against the background of seasonal, geomagnetic, 11-year and 27-day solar variations. Normalized values F are used in the analysis, obtained by removing the seasonal trend: the mean value of foF2 over a 14-day interval, from 7 days before until 7 days after the event, is subtracted, and the result is divided by its standard deviation. Days with high solar (Wolf number > 200) and geomagnetic (ΣKp > 25) disturbances are excluded from the analysis. Using the superposed-epoch method it is concluded that on the day of the earthquake the foF2 value decreases (a) for earthquakes with magnitudes M > 6.5 at any place on the Earth, if the depth h of the epicentre satisfies h < 200 km, (b) for earthquakes with magnitudes 6.5 > M > 6.0 occurring in the Middle Asia region, if h < 70 km, and (c) for earthquakes with magnitudes 6.0 > M > 5.5 occurring at a distance from Tashkent smaller than 1000 km, if h < 70 km. In all investigated cases the reliability of the effect is larger than 95%. The ratio of the number of earthquakes with a decrease of the foF2 value to the number of earthquakes for which foF2 grows is about 2. A decrease of the foF2 value is also observed some hours before and some hours to a day after the event. Thus, one may assume that before an earthquake happening at a long distance, seismo-gravity waves with periods between half an hour and a few hours propagate through the earth's core to the vicinity of the sounding station. After long-distance earthquakes, seismic waves propagate in the vicinity of the sounding station. But in both cases, the
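
    The normalization described above can be written compactly as F = (foF2 minus the mean over ±7 days) divided by the standard deviation over the same window; the sketch below applies it to a synthetic foF2 series standing in for the Tashkent soundings.

```python
import numpy as np

# Synthetic daytime foF2 series (MHz) with a 27-day modulation plus noise,
# standing in for the Tashkent soundings.
rng = np.random.default_rng(4)
days = np.arange(60)
foF2 = 7.0 + 1.5 * np.sin(2 * np.pi * days / 27.0) + rng.normal(0.0, 0.3, days.size)

def normalized_F(series, day, half_window=7):
    """Subtract the mean over day-7..day+7 and divide by the std of that window."""
    window = series[max(0, day - half_window): day + half_window + 1]
    return (series[day] - window.mean()) / window.std()

print([round(normalized_F(foF2, d), 2) for d in range(10, 15)])
```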

  13. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension and renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different

  14. Business closure and relocation: a comparative analysis of the Loma Prieta earthquake and Hurricane Andrew.

    PubMed

    Wasileski, Gabriela; Rodríguez, Havidán; Diaz, Walter

    2011-01-01

    The occurrence of a number of large-scale disasters or catastrophes in recent years, including the Indian Ocean tsunami (2004), the Kashmir earthquake (2005), Hurricane Katrina (2005) and Hurricane Ike (2008), has raised our awareness of the devastating effects of disasters on human populations and the importance of developing mitigation and preparedness strategies to limit the consequences of such events. However, there is still a dearth of social science research focusing on the socio-economic impact of disasters on businesses in the United States. This paper contributes to this research literature by focusing on the impact of disasters on business closure and relocation through the use of multivariate logistic regression models, specifically focusing on the Loma Prieta earthquake (1989) and Hurricane Andrew (1992). Using a multivariate model, we examine how physical damage to the infrastructure, lifeline disruption and business characteristics, among others, affect business closure and relocation following major disasters. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

  15. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near-mean-field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
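
    A minimal sketch of the counting-plus-Weibull idea described above: the number of small events since the last large one is converted into a conditional probability of the next large event within a further increment of small events. The Weibull shape and scale below are hypothetical, not the values fitted for California-Nevada.

```python
import math

def prob_large_event(n_small, shape=1.4, scale=300.0, horizon=30.0):
    """P(next large event within `horizon` more small events, given that n_small
    small events have occurred since the last large one), using a Weibull law
    for the small-event count between large events."""
    survival = lambda x: math.exp(-((x / scale) ** shape))
    return 1.0 - survival(n_small + horizon) / survival(n_small)

for n in (0, 100, 300, 600):
    print(f"{n:4d} small events since the last large one -> "
          f"P(large event within the next 30) = {prob_large_event(n):.3f}")
```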

  16. Development of Earthquake Emergency Response Plan for Tribhuvan International Airport, Kathmandu, Nepal

    DTIC Science & Technology

    2013-02-01

    Tribhuvan International Airport, Kathmandu, Nepal. Contract number: W911NF-12-1-0282. ...In the past, big earthquakes in Nepal (see Figure 1.1) have caused a huge number of casualties and damage to structures. The Great Nepal-Bihar... UBC Earthquake Engineering Research Facility, 2235 East Mall, Vancouver, BC, Canada V6T 1Z4; phone: 604 822-6203; fax: 604 822-6901.

  17. Linking giant earthquakes with the subduction of oceanic fracture zones

    NASA Astrophysics Data System (ADS)

    Landgrebe, T. C.; Müller, R. D.; EarthByte Group

    2011-12-01

    Giant subduction earthquakes are known to occur in areas not previously identified as prone to high seismic risk. This highlights the need to better identify subduction zone segments potentially dominated by relatively long (up to 1000 years and more) recurrence times of giant earthquakes. Global digital data sets represent a promising source of information for a multi-dimensional earthquake hazard analysis. We combine the NGDC global Significant Earthquakes database with a global strain rate map, gridded ages of the ocean floor, and a recently produced digital data set for oceanic fracture zones, major aseismic ridges and volcanic chains to investigate the association of earthquakes as a function of magnitude with age of the downgoing slab and convergence rates. We use a so-called Top-N recommendation method, a technology originally developed to search, sort, classify, and filter very large and often statistically skewed data sets on the internet, to analyse the association of subduction earthquakes sorted by magnitude with key parameters. The Top-N analysis is used to progressively assess how strongly particular "tectonic niche" locations (e.g. locations along subduction zones intersected with aseismic ridges or volcanic chains) are associated with sets of earthquakes in sorted order in a given magnitude range. As the total number N of sorted earthquakes is increased, by progressively including smaller-magnitude events, the so-called recall is computed, defined as the number of Top-N earthquakes associated with particular target areas divided by N. The resultant statistical measure represents an intuitive description of the effectiveness of a given set of parameters to account for the location of significant earthquakes on record. We use this method to show that the occurrence of great (magnitude ≥ 8) earthquakes on overriding plate segments is strongly biased towards intersections of oceanic fracture zones with subduction zones. These intersection regions are
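
    The recall measure described above is simple to compute once events are sorted by magnitude; the sketch below does so on a synthetic catalogue with a toy association flag, whereas the study uses the NGDC Significant Earthquakes database and digitized fracture-zone intersections.

```python
import numpy as np

# Synthetic catalogue: magnitudes plus a flag marking events whose epicentres
# fall in the target "tectonic niche" (e.g., fracture-zone/trench intersections).
rng = np.random.default_rng(5)
magnitudes = rng.uniform(7.0, 9.2, size=200)
in_target_area = rng.random(200) < 0.3 + 0.4 * (magnitudes - 7.0) / 2.2   # toy bias

order = np.argsort(magnitudes)[::-1]           # largest events first
for N in (10, 25, 50, 100, 200):
    recall = in_target_area[order[:N]].mean()  # fraction of the Top-N in target areas
    print(f"N = {N:3d}   recall = {recall:.2f}")
```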

  18. Weather Satellite Thermal IR Responses Prior to Earthquakes

    NASA Technical Reports Server (NTRS)

    O'Connor, Daniel P.

    2005-01-01

    A number of observers claim to have seen thermal anomalies prior to earthquakes, but subsequent analysis by others has failed to produce similar findings. What exactly are these anomalies? Might they be useful for earthquake prediction? It is the purpose of this study to determine if thermal anomalies can be found in association with known earthquakes by systematically co-registering weather satellite images at the sub-pixel level and then determining if statistically significant responses occurred prior to the earthquake event. A new set of automatic co-registration procedures was developed for this task to accommodate all properties particular to weather satellite observations taken at night, and it relies on the general condition that the ground cools after sunset. Using these procedures, we can produce a set of temperature-sensitive satellite images for each of five selected earthquakes (Algeria 2003; Bhuj, India 2001; Izmit, Turkey 2001; Kunlun Shan, Tibet 2001; Turkmenistan 2000) and thus more effectively investigate heating trends close to the epicenters a few hours prior to the earthquake events. This study will lay tracks for further work in earthquake prediction and provoke the question of the exact nature of the thermal anomalies.

  19. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

    USGS Publications Warehouse

    Coordinated by Bakun, William H.; Prescott, William H.

    1993-01-01

    Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were: * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, indicating that a potential for another damaging earthquake in the Santa Cruz Mountains may still exist in the near future. * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults. * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake. * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it. * Geological and geophysical mapping indicate that, although the San Andreas Fault can be mapped as a through-going fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

  20. Review of variations in Mw < 7 earthquake motions on position and TEC (Mw = 6.5 Aegean Sea earthquake sample)

    NASA Astrophysics Data System (ADS)

    Yildirim, O.; Inyurt, S.; Mekik, C.

    2015-10-01

    Turkey is located in the mid-latitude zone and is tectonically very active. Most recently, an earthquake of magnitude Mw 6.5 occurred offshore in the Aegean Sea on 24 May 2014 at 12:25 UTC and lasted approximately 40 s. The earthquake was felt not only in Turkey but also in Greece, Romania and Bulgaria. In recent years, studies of seismically induced ionospheric anomalies have been carried out using TEC (Total Electron Content) derived from GNSS (Global Navigation Satellite System) signals, and the findings have been reported. In this study, the TEC and positional variations associated with the Aegean Sea earthquake are examined separately, and the correlation between the ionospheric variation and the positional variation is then investigated. For this purpose, a total of fifteen stations were used: four CORS-TR stations in the seismic zone (AYVL, CANA, IPSA, YENC) together with IGS and EUREF stations. The ionospheric and positional variations of the AYVL, CANA, IPSA and YENC stations were examined with the Bernese v5.0 software. When the PPP-TEC values produced by the analysis are examined, it is seen that at the four stations located in Turkey, three days before the earthquake at 08:00 and 10:00 UTC, the TEC values were approximately 4 TECU above the upper-limit TEC value. At the same stations, one day before the earthquake at 06:00, 08:00 and 10:00 UTC, the TEC values were approximately 5 TECU below the lower-limit TEC value. The GIM-TEC values published by the CODE analysis centre were also examined. At all stations, three days before the earthquake the TEC values at 08:00 and 10:00 UTC were approximately 2 TECU above the upper limit, and one day before the earthquake at 06:00, 08:00 and 10:00 UTC they were approximately 4 TECU below the lower-limit TEC value. Again, by using the same

  1. PAGER--Rapid assessment of an earthquake?s impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  2. Investigating Lushan Earthquake Victims' Individual Behavior Response and Rescue Organization.

    PubMed

    Kang, Peng; Lv, Yipeng; Deng, Qiangyu; Liu, Yuan; Zhang, Yi; Liu, Xu; Zhang, Lulu

    2017-12-11

    Research concerning the impact of earthquake victims' individual behavior and its association with earthquake-related injuries is lacking. This study examined this relationship along with the effectiveness of earthquake rescue measures. The six most severely destroyed townships during the Lushan earthquake were examined; 28 villages and three earthquake victims' settlement camp areas were selected as research areas. Inclusion criteria comprised living in Lushan county for a long time, living in Lushan county during the 2013 Lushan earthquake, and having one's home destroyed. Earthquake victims with an intellectual disability or communication problems were excluded. The earthquake victims (N = 5165, of whom 2396 were male) completed a questionnaire (response rate: 94.7%). Among them, 209 were injured (5.61%). Teachers (p < 0.0001, OR (odds ratio) = 3.33) and medical staff (p = 0.001, OR = 4.35) were more vulnerable to the earthquake than were farmers. Individual behavior was directly related to injuries, for example the first reaction after the earthquake and fear. There is an obvious connection between earthquake-related injury and individual behavioral characteristics. It is strongly suggested that victims receive mental health support from medical practitioners and the government to minimize negative effects. The initial reaction after an earthquake also played a vital role in victims' trauma; therefore, earthquake-related experience and education may prevent injuries. Self-aid and mutual help played key roles in emergency medical rescue efforts.

  3. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded with a total of 140 portable seismic stations to deploy in order to record aftershocks. Combined with the Chilean permanent seismic network in the area, this results in 180 stations now in operation, recording continuously at 100 cps. The seismic equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone that will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit their data as soon as possible in standard (miniSEED) format with accompanying metadata to the IRIS DMC, where the data will be merged into a combined data set and made available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake as well as a model for future aftershock deployments around the world.

  4. Landslides Triggered by the 2015 Gorkha, Nepal Earthquake

    NASA Astrophysics Data System (ADS)

    Xu, C.

    2018-04-01

    The 25 April 2015 Gorkha Mw 7.8 earthquake in central Nepal caused a large number of casualties and serious property losses, and also induced numerous landslides. Based on visual interpretation of high-resolution optical satellite images acquired before and after the earthquake, together with field reconnaissance, we delineated 47,200 coseismic landslides with a total distribution extent of more than 35,000 km^2, occupying a total area of about 110 km^2. On the basis of a scaling relationship between landslide area (A) and volume (V), V = 1.3147 × A^1.2085, the total volume of the coseismic landslides is estimated to be about 9.64 × 10^8 m^3. Calculation yields landslide number, area, and volume densities of 1.32 km^-2, 0.31 %, and 0.027 m, respectively. The spatial distribution of these landslides is consistent with that of the mainshock and aftershocks and the inferred causative fault, indicating the effect of the earthquake energy release on the pattern of coseismic landslides. This study provides a new, more detailed and objective inventory of the landslides triggered by the Gorkha earthquake, which should be valuable for further study of the genesis of coseismic landslides, hazard assessment and the long-term impact of slope failure on the geological environment in the earthquake-scarred region.
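
    Applied landslide by landslide, the stated area-volume scaling gives the total volume directly; the sketch below (Python) shows the arithmetic with hypothetical individual landslide areas, not the actual inventory.

      # V = 1.3147 * A**1.2085 with A in m^2 and V in m^3; areas are placeholders.
      areas_m2 = [1.2e3, 5.0e4, 3.1e5]
      total_volume_m3 = sum(1.3147 * a ** 1.2085 for a in areas_m2)
      print(f"Total coseismic landslide volume: {total_volume_m3:.3e} m^3")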

  5. A global outer-rise/outer-trench-slope (OR/OTS) earthquake study

    NASA Astrophysics Data System (ADS)

    Wartman, J. M.; Kita, S.; Kirby, S. H.; Choy, G. L.

    2009-12-01

    Using improved seismic, bathymetric, satellite gravity and other geophysical data, we investigated the seismicity patterns and focal mechanisms of earthquakes in oceanic lithosphere off the trenches of the world that are large enough to be well recorded at teleseismic distances. A number of prominent trends are apparent, some of which have been previously recognized based on more limited data [1], and some of which are largely new [2-5]: (1) The largest events and the highest seismicity rates tend to occur where Mesozoic incoming plates are subducting at high rates (e.g., those in the western Pacific and the Banda segment of Indonesia). The largest events are predominantly shallow normal faulting (SNF) earthquakes. Less common are reverse-faulting (RF) events that tend to be deeper and to be present along with SNF events where nearby seamounts, seamount chains and other volcanic features are subducting [Seno and Yamanaka, 1996]. Blooms of SNF OR/OTS events usually occur just after and seaward of great interplate thrust (IPT) earthquakes but are far less common after smaller IPT events. (2) Plates subducting at slow rates (<20 mm/a) often show sparse OR/OTS seismicity. It is unclear if such low activity is a long-term feature of these systems or is a consequence of the long return times of great IPT earthquakes (e.g., the sparse OR/OTS seismicity before the 26 December 2004 M9.2 Sumatra earthquake and many subsequent OR/OTS events). (3) OR/OTS shocks are generally sparse or absent where incoming plates are very young (<20 Ma) (e.g., Cascadia, southern Mexico, Nankai, and South Shetlands). (4) Subducting plates of intermediate age (20 to about 65 Ma) display a diversity of focal mechanisms and seismicity patterns. In the Philippines, NE Indonesia, and Melanesia, bands of reverse faulting events occur at or near the trench and SNF earthquakes are restricted to OR/OTS sites further from the trench. (5) Clustering of OR/OTS events of all types commonly occurs where

  6. Classification of Earthquake-triggered Landslide Events - Review of Classical and Particular Cases

    NASA Astrophysics Data System (ADS)

    Braun, A.; Havenith, H. B.; Schlögel, R.

    2016-12-01

    Seismically induced landslides often contribute to a significant degree to the losses related to earthquakes. The identification of the possible extent of landslide-affected areas can help to target emergency measures when an earthquake occurs or improve the resilience of inhabited areas and critical infrastructure in zones of high seismic hazard. Moreover, landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes in paleoseismic studies, allowing us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. Inspired by classical reviews of earthquake-induced landslides, e.g. by Keefer or Jibson, we present here a review of factors contributing to earthquake-triggered slope failures based on an `event-by-event' classification approach. The objective of this analysis is to enable the short-term prediction of earthquake-triggered landslide event sizes, in terms of numbers and size of the affected area, right after an earthquake has occurred. Five main factors, `Intensity', `Fault', `Topographic energy', `Climatic conditions' and `Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970) the combination and relative weight of the factors was calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be crosschecked. We present cases where our prediction model performs well and discuss particular cases
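
    A schematic sketch (Python) of how five such factors can be combined into a single event-size score is given below; the weights and the 0-1 factor ratings are illustrative assumptions, not the calibrated values of this study.

      # Placeholder weights for the five factors (illustrative only).
      weights = {"intensity": 0.35, "fault": 0.15, "topographic_energy": 0.20,
                 "climate": 0.10, "surface_geology": 0.20}

      def landslide_event_score(ratings):
          """Weighted combination of normalised (0-1) factor ratings."""
          return sum(weights[name] * value for name, value in ratings.items())

      score = landslide_event_score({"intensity": 0.9, "fault": 0.6, "topographic_energy": 0.8,
                                     "climate": 0.4, "surface_geology": 0.7})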

  7. Simulation of rockfalls triggered by earthquakes

    USGS Publications Warehouse

    Kobayashi, Y.; Harp, E.L.; Kagawa, T.

    1990-01-01

    A computer program to simulate the downslope movement of boulders in rolling or bouncing modes has been developed and applied to actual rockfalls triggered by the Mammoth Lakes, California, earthquake sequence in 1980 and the Central Idaho earthquake in 1983. In order to reproduce a movement mode where bouncing predominated, we introduced an artificial unevenness to the slope surface by adding a small random number to the interpolated value of the mid-points between the adjacent surveyed points. Three hundred simulations were computed for each site by changing the random number series, which determined distances and bouncing intervals. The movement of the boulders was, in general, rather erratic depending on the random numbers employed, and the results could not be seen as deterministic but stochastic. The closest agreement between calculated and actual movements was obtained at the site with the most detailed and accurate topographic measurements. © 1990 Springer-Verlag.
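
    A minimal sketch (Python/NumPy) of the surface-perturbation step is given below: mid-points between surveyed profile points are interpolated and offset by a small random amount, and re-running the simulation with different random sequences yields a distribution of outcomes rather than a single deterministic path. The 0.2 m perturbation amplitude is an assumed value.

      import numpy as np

      rng = np.random.default_rng(seed=1)

      def perturbed_profile(x, z, amplitude=0.2):
          """Insert randomly offset mid-points between surveyed points (x, z in metres)."""
          x, z = np.asarray(x, float), np.asarray(z, float)
          xm = 0.5 * (x[:-1] + x[1:])
          zm = 0.5 * (z[:-1] + z[1:]) + rng.uniform(-amplitude, amplitude, xm.size)
          xp = np.empty(x.size + xm.size)
          zp = np.empty_like(xp)
          xp[0::2], xp[1::2] = x, xm
          zp[0::2], zp[1::2] = z, zm
          return xp, zp

      # e.g. 300 realisations of the same slope, each with a different random surface.
      profiles = [perturbed_profile([0, 10, 20, 30], [30, 22, 12, 0]) for _ in range(300)]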

  8. Distant, delayed and ancient earthquake-induced landslides

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Torgoev, Almaz; Braun, Anika; Schlögel, Romy; Micu, Mihai

    2016-04-01

    On the basis of a new classification of seismically induced landslides we outline particular effects related to the delayed and distant triggering of landslides, which cannot be predicted by state-of-the-art methods. First, for about a dozen events the 'predicted' extent of the affected area is clearly underestimated. The most problematic cases are those for which far-distant triggering of landslides has been reported, such as for the 1988 Saguenay earthquake. In Central Asia such cases are known from areas marked by a thick cover of loess. One possible contributing effect could be a low-frequency resonance of the thick soils induced by distant earthquakes, especially those in the Pamir - Hindu Kush seismic region. Such deep-focal and high-magnitude (>>7) earthquakes are also found in Europe, first of all in the Vrancea region (Romania). For this area and others in Central Asia we computed landslide event sizes related to scenario earthquakes with M>7.5. The second particular and challenging type of triggering is the one delayed with respect to the main earthquake event: case histories have been reported for the Racha earthquake in 1991, when several larger landslides only started moving 2 or 3 days after the main shock. Similar observations were also made after other earthquake events in the U.S., such as after the 1906 San Francisco, the 1949 Tacoma, the 1959 Hebgen Lake and the 1983 Borah Peak earthquakes. Here, we will present a series of detailed examples of (partly monitored) mass movements in Central Asia that mainly developed after earthquakes, some even several weeks after the main shock: e.g. the Tektonik and Kainama landslides triggered in 1992 and 2004, respectively. We believe that the development of the massive failures is a consequence of the opening of tension cracks during the seismic shaking and their filling with water during the precipitation that followed the earthquakes. The third particular aspect analysed here is the use of large

  9. NRIAG's Effort to Mitigate Earthquake Disasters in Egypt Using GPS and Seismic Data

    NASA Astrophysics Data System (ADS)

    Mahmoud, Salah

    It has been estimated that, during historical times, more than 50 million people have lost their lives in earthquakes, either during the ground shaking and its effects, such as soil amplification and/or liquefaction, landslides and tsunamis, or in its immediate aftereffects, such as fires. The distribution of population generally takes no account of earthquake risk, at least on a large scale. An earthquake may be large but not destructive; on the other hand, an earthquake may be destructive but not large. The absence of correlation is due to the fact that a great number of other factors enter into consideration: first of all the location of the earthquake in relation to populated areas, but also soil conditions and building construction. Soil liquefaction has been identified as the underlying phenomenon for many ground failures, settlements and lateral spreads, which are a major cause of damage to soil structures and building foundations in many events. Egypt has suffered numerous destructive earthquakes, such as the Kalabsha earthquake (1981, Mag 5.4) near Aswan city and the High Dam, the Dahshour earthquake (1992, Mag 5.9) near Cairo city and the Aqaba earthquake (1995, Mag 7.2). As the category of earthquake damage includes all phenomena related to direct and indirect damage, the Egyptian authorities make a great effort to mitigate earthquake disasters. Seismicity, especially in the zones of high activity, is investigated in detail in order to delineate the active source zones, not only with the Egyptian National Seismic Network (ENSN) but also with the local seismic networks at Aswan, Hurghada, Aqaba, Abu Dabbab and Dabbaa. On the other hand, studies of soil conditions, soil amplification, soil-structure interaction, liquefaction and seismic hazard are carried out, in particular for the urbanized areas and the regions near the source zones. All these parameters are integrated to obtain the Egyptian building code, which makes it possible to construct buildings that resist damage and consequently mitigate the earthquake

  10. Space geodetic tools provide early warnings for earthquakes and volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Aoki, Yosuke

    2017-04-01

    Development of space geodetic techniques such as the Global Navigation Satellite System and Synthetic Aperture Radar in the last few decades allows us to monitor deformation of Earth's surface with unprecedented spatial and temporal resolution. These observations, combined with fast data transmission and quick data processing, enable us to quickly detect and locate earthquakes and volcanic eruptions and assess potential hazards such as strong earthquake shaking, tsunamis, and volcanic eruptions. These techniques are thus key parts of early warning systems, help identify some hazards before a cataclysmic event, and improve the response to the consequent damage.

  11. Rescaled earthquake recurrence time statistics: application to microrepeaters

    NASA Astrophysics Data System (ADS)

    Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru

    2009-01-01

    Slip on major faults primarily occurs during `characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the short sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur at the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered to be sufficiently stationary, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
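
    One plausible reading of the rescaling step, sketched below (Python/SciPy): each sequence is standardised by its own mean and standard deviation, the sequences are superimposed, and a Weibull distribution is fitted to the combined sample with a free location parameter (standardised values can be negative). The recurrence times are hypothetical and the exact rescaling recipe is an assumption, not taken verbatim from the paper.

      import numpy as np
      from scipy import stats

      def standardise(seq):
          """Rescale one recurrence-time sequence by its mean and standard deviation."""
          seq = np.asarray(seq, dtype=float)
          return (seq - seq.mean()) / seq.std(ddof=1)

      # Hypothetical recurrence times (years) for two microrepeater sequences.
      seq_a = [2.1, 1.8, 2.5, 2.0, 1.7, 2.3]
      seq_b = [0.9, 1.1, 1.3, 0.8, 1.2]
      combined = np.concatenate([standardise(seq_a), standardise(seq_b)])

      # Shape, location and scale of the fitted Weibull; a log-normal could be
      # fitted to the same sample in the same way for comparison.
      c, loc, scale = stats.weibull_min.fit(combined)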

  12. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    NASA Astrophysics Data System (ADS)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, the Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a regularly discussed reality. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels from individuals and families, to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

  13. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  14. Application of Earthquake Subspace Detectors at Kilauea and Mauna Loa Volcanoes, Hawai`i

    NASA Astrophysics Data System (ADS)

    Okubo, P.; Benz, H.; Yeck, W.

    2016-12-01

    Recent studies have demonstrated the capabilities of earthquake subspace detectors for detailed cataloging and tracking of seismicity in a number of regions and settings. We are exploring the application of subspace detectors at the United States Geological Survey's Hawaiian Volcano Observatory (HVO) to analyze seismicity at Kilauea and Mauna Loa volcanoes. Elevated levels of microseismicity and occasional swarms of earthquakes associated with active volcanism here present cataloging challenges due to the sheer numbers of earthquakes and an intrinsically low signal-to-noise environment featuring oceanic microseism and volcanic tremor in the ambient seismic background. With high-quality continuous recording of seismic data at HVO, we apply subspace detectors (Harris and Dodge, 2011, Bull. Seismol. Soc. Am., doi: 10.1785/0120100103) during intervals of noteworthy seismicity. Waveform templates are drawn from magnitude 2 and larger earthquakes within clusters of earthquakes cataloged in the HVO seismic database. At Kilauea, we focus on seismic swarms in the summit caldera region where, despite continuing eruptions from vents in the summit region and in the east rift zone, geodetic measurements reflect a relatively inflated volcanic state. We also focus on seismicity beneath and adjacent to Mauna Loa's summit caldera that appears to be associated with geodetic expressions of gradual volcanic inflation, and where precursory seismicity clustered prior to both of Mauna Loa's most recent eruptions in 1975 and 1984. We recover several times more earthquakes with the subspace detectors - down to roughly 2 magnitude units below the templates, based on relative amplitudes - compared to the numbers of cataloged earthquakes. The increased numbers of detected earthquakes in these clusters, and the ability to associate and locate them, allow us to infer details of the spatial and temporal distributions and possible variations in stresses within these key regions of the volcanoes.
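
    For orientation, a single-template correlation detector is sketched below (Python/NumPy); it is a simplification of the multidimensional subspace detectors cited above, and the 0.8 correlation threshold is an arbitrary illustrative choice.

      import numpy as np

      def matched_filter(trace, template, threshold=0.8):
          """Return sample indices where the normalised cross-correlation between
          the template and the continuous trace exceeds the threshold."""
          trace, template = np.asarray(trace, float), np.asarray(template, float)
          n = len(template)
          t = (template - template.mean()) / (template.std() * n)
          detections = []
          for i in range(len(trace) - n + 1):
              win = trace[i:i + n]
              s = win.std()
              if s == 0:
                  continue
              cc = np.dot(t, (win - win.mean()) / s)
              if cc > threshold:
                  detections.append(i)
          return detections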

  15. Pre-Earthquake Unipolar Electromagnetic Pulses

    NASA Astrophysics Data System (ADS)

    Scoville, J.; Freund, F.

    2013-12-01

    Transient ultralow frequency (ULF) electromagnetic (EM) emissions have been reported to occur before earthquakes [1,2]. They suggest powerful transient electric currents flowing deep in the crust [3,4]. Prior to the M=5.4 Alum Rock earthquake of Oct. 21, 2007 in California a QuakeFinder triaxial search-coil magnetometer located about 2 km from the epicenter recorded unusual unipolar pulses with the approximate shape of a half-cycle of a sine wave, reaching amplitudes up to 30 nT. The number of these unipolar pulses increased as the day of the earthquake approached. These pulses clearly originated around the hypocenter. The same pulses have since been recorded prior to several medium to moderate earthquakes in Peru, where they have been used to triangulate the location of the impending earthquakes [5]. To understand the mechanism of the unipolar pulses, we first have to address the question how single current pulses can be generated deep in the Earth's crust. Key to this question appears to be the break-up of peroxy defects in the rocks in the hypocenter as a result of the increase in tectonic stresses prior to an earthquake. We investigate the mechanism of the unipolar pulses by coupling the drift-diffusion model of semiconductor theory to Maxwell's equations, thereby producing a model describing the rock volume that generates the pulses in terms of electromagnetism and semiconductor physics. The system of equations is then solved numerically to explore the electromagnetic radiation associated with drift-diffusion currents of electron-hole pairs. [1] Sharma, A. K., P. A. V., and R. N. Haridas (2011), Investigation of ULF magnetic anomaly before moderate earthquakes, Exploration Geophysics 43, 36-46. [2] Hayakawa, M., Y. Hobara, K. Ohta, and K. Hattori (2011), The ultra-low-frequency magnetic disturbances associated with earthquakes, Earthquake Science, 24, 523-534. [3] Bortnik, J., T. E. Bleier, C. Dunson, and F. Freund (2010), Estimating the seismotelluric current

  16. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just the likelihood of ground shaking, but also gaging the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit or relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong.

  17. Earth's rotation variations and earthquakes 2010-2011

    NASA Astrophysics Data System (ADS)

    Ostřihanský, L.

    2012-01-01

    19 years earlier, differing by only one day from the 27 December 1985 earthquake, proving that not only the sidereal 13.66-day variations but also the 19-year Metonic cycle is a period of earthquake occurrence. Histograms show the regular change of earthquake positions on the branches of the LOD graph, and the shape of the histograms and the number of earthquakes on the LOD branches from the mid-ocean ridge can indicate which side of the ridge moves more quickly.

  18. New insights into earthquake precursors from InSAR.

    PubMed

    Moro, Marco; Saroli, Michele; Stramondo, Salvatore; Bignami, Christian; Albano, Matteo; Falcucci, Emanuela; Gori, Stefano; Doglioni, Carlo; Polcari, Marco; Tallini, Marco; Macerola, Luca; Novali, Fabrizio; Costantini, Mario; Malvarosa, Fabio; Wegmüller, Urs

    2017-09-20

    We measured ground displacements before and after the 2009 L'Aquila earthquake using multi-temporal InSAR techniques to identify seismic precursor signals. We estimated the ground deformation and its temporal evolution by exploiting a large dataset of SAR imagery that spans seventy-two months before and sixteen months after the mainshock. These satellite data show that up to 15 mm of subsidence occurred beginning three years before the mainshock. This deformation occurred within two Quaternary basins that are located close to the epicentral area and are filled with sediments hosting multi-layer aquifers. After the earthquake, the same basins experienced up to 12 mm of uplift over approximately nine months. Before the earthquake, the rocks at depth dilated, and fractures opened. Consequently, fluids migrated into the dilated volume, thereby lowering the groundwater table in the carbonate hydrostructures and in the hydrologically connected multi-layer aquifers within the basins. This process caused the elastic consolidation of the fine-grained sediments within the basins, resulting in the detected subsidence. After the earthquake, the fractures closed, and the deep fluids were squeezed out. The pre-seismic ground displacements were then recovered because the groundwater table rose and natural recharge of the shallow multi-layer aquifers occurred, which caused the observed uplift.

  19. Sensitivity analysis of earthquake-induced static stress changes on volcanoes: the 2010 Mw 8.8 Chile earthquake

    NASA Astrophysics Data System (ADS)

    Bonali, F. L.; Tibaldi, A.; Corazzato, C.

    2015-06-01

    In this work, we analyse in detail how a large earthquake could cause stress changes on volcano plumbing systems and produce possible positive feedbacks promoting new eruptions. We develop a sensitivity analysis that considers several possible parameters, providing also new constraints on the methodological approach. The work focuses on the Mw 8.8 2010 earthquake that occurred along the Chile subduction zone near 24 historic/Holocene volcanoes located in the Southern Volcanic Zone. We use six different finite fault-slip models to calculate the static stress change, induced by the coseismic slip, in the direction normal to several theoretical feeder dykes with various orientations. Results indicate different magnitudes of stress change due to the heterogeneity of magma pathway geometry and orientation. In particular, the N-S and NE-SW-striking magma pathways experience a decrease in stress normal to the feeder dyke (unclamping, up to 0.85 MPa) in comparison to those striking NW-SE and E-W, and in some cases there is even a clamping effect, depending on the magma path strike. The diverse fault-slip models also have an effect (up to 0.4 MPa) on the results. As a consequence, we reconstruct the geometry and orientation of the most reliable magma pathways below the 24 volcanoes by studying structural and morphometric data, and we resolve the stress changes on each of them. Results indicate that: (i) volcanoes where post-earthquake eruptions took place experienced earthquake-induced unclamping or very small clamping effects, and (ii) several volcanoes that have not yet erupted are more prone to experience future unrest, from the point of view of the host-rock stress state, because of earthquake-induced unclamping. Our findings also suggest that pathway orientation plays a more relevant role in inducing stress changes, whereas the depth of calculation (e.g. 2, 5 or 10 km) used in the analysis is not a key parameter. Earthquake-induced magma-pathway unclamping might contribute to
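
    A minimal sketch (Python/NumPy) of resolving a coseismic stress-change tensor onto theoretical feeder dykes of different strikes, to obtain the dyke-normal stress change (unclamping versus clamping), is given below. The tensor values, the east-north-up frame and the tension-positive sign convention are assumptions for illustration, not the fault-slip models used in the study.

      import numpy as np

      def normal_stress_change(d_sigma, strike_deg, dip_deg=90.0):
          """Stress change normal to a plane of given strike/dip, for a stress-change
          tensor d_sigma (Pa) in an east-north-up frame with tension positive."""
          s, d = np.radians(strike_deg), np.radians(dip_deg)
          n = np.array([np.cos(s) * np.sin(d), -np.sin(s) * np.sin(d), np.cos(d)])
          return n @ d_sigma @ n

      # Hypothetical coseismic stress-change tensor (Pa) and a sweep over dyke strikes.
      d_sigma = np.array([[ 2.0e5, -0.5e5, 0.0],
                          [-0.5e5, -1.0e5, 0.0],
                          [ 0.0,    0.0,   0.5e5]])
      for strike in (0, 45, 90, 135):
          print(strike, normal_stress_change(d_sigma, strike))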

  20. Nuclear explosions and distant earthquakes: A search for correlations

    USGS Publications Warehouse

    Healy, J.H.; Marshall, P.A.

    1970-01-01

    An apparent correlation between nuclear explosions and earthquakes has been reported for the events between September 1961 and September 1966. When data from the events between September 1966 and December 1968 are examined, this correlation disappears. No relationship between the size of the nuclear explosions and the number of distant earthquakes is apparent in the data.

  1. What is the earthquake fracture energy?

    NASA Astrophysics Data System (ADS)

    Di Toro, G.; Nielsen, S. B.; Passelegue, F. X.; Spagnuolo, E.; Bistacchi, A.; Fondriest, M.; Murphy, S.; Aretusini, S.; Demurtas, M.

    2016-12-01

    The energy budget of an earthquake is one of the main open questions in earthquake physics. During seismic rupture propagation, the elastic strain energy stored in the rock volume that bounds the fault is converted into (1) gravitational work (relative movement of the wall rocks bounding the fault), (2) in- and off-fault damage of the fault zone rocks (due to rupture propagation and frictional sliding), (3) frictional heating and, of course, (4) seismic radiated energy. The difficulty in determining the budget arises from the measurement of some parameters (e.g., the temperature increase in the slipping zone, which constrains the frictional heat), from the poorly constrained size of the energy sinks (e.g., how large is the rock volume involved in off-fault damage?) and from the continuous exchange of energy between different sinks (for instance, fragmentation and grain size reduction may result from both the passage of the rupture front and frictional heating). Field geology studies, microstructural investigations, experiments and modelling may yield some hints. Here we discuss (1) the discrepancies arising from the comparison of the fracture energy measured in experiments reproducing seismic slip with the one estimated from seismic inversion for natural earthquakes and (2) the off-fault damage induced by the diffusion of frictional heat during simulated seismic slip in the laboratory. Our analysis suggests, for instance, that the so-called earthquake fracture energy (1) is mainly frictional heat for small slips and (2), with increasing slip, is controlled by the geometrical complexity and other plastic processes occurring in the damage zone. As a consequence, because faults are rapidly and efficiently lubricated upon fast slip initiation, the dominant dissipation mechanism in large earthquakes may not be friction but rather the off-fault damage due to fault segmentation and stress concentrations in a growing region around the fracture tip.

  2. Earthquakes for Kids

    MedlinePlus


  3. Return to work for severely injured survivors of the Christchurch earthquake: influences in the first 2 years.

    PubMed

    Nunnerley, Joanne; Dunn, Jennifer; McPherson, Kathryn; Hooper, Gary; Woodfield, Tim

    2016-01-01

    This study looked at the influences on return to work (RTW) in the first 2 years for people severely injured in the 22 February 2011 Christchurch earthquake. We used a constructivist grounded theory approach with semi-structured interviews to collect data from 14 people injured in the earthquake. Analysis elicited three themes that appeared to influence the process of RTW following the Christchurch earthquake: living the earthquake experience, the individual's experience of the earthquake and how their injury framed their expectations; rebuilding normality, the desire of the participants to return to life as it was; and dealing with the secondary effects of the earthquake, the earthquake-specific effects that were both barriers and facilitators to returning to work. The consequences of the earthquake impacted the experience, process and outcome of RTW for those injured in the Christchurch earthquake. Work and RTW appeared to be key tools to enhance recovery after serious injury following the earthquake. The altered physical, social and economic environment must be considered when working on the return to work of individuals with earthquake injuries. Providing tangible emotional and social support so that injured earthquake survivors feel safe in their workplace may facilitate RTW. Engaging early with employers may also assist the RTW of injured earthquake survivors.

  4. Understanding intraplate earthquakes in Sweden: the where and why

    NASA Astrophysics Data System (ADS)

    Lund, Björn; Tryggvason, Ari; Chan, NeXun; Högdahl, Karin; Buhcheva, Darina; Bödvarsson, Reynir

    2016-04-01

    The Swedish National Seismic Network (SNSN) underwent a rapid expansion and modernization between the years 2000 - 2010. The number of stations increased from 6 to 65, all broadband or semi-broadband with higher than standard sensitivity and all transmitting data in real time. This has led to a significant increase in the number of detected earthquakes, with the magnitude of completeness being approximately ML 0.5 within the network. During the last 15 years some 7,300 earthquakes have been detected and located, which can be compared to the approximately 1,800 earthquakes in the Swedish catalog from 1375 to 1999. We have used the recent earthquake catalog and various anthropogenic sources (e.g. mine blasts, quarry blasts and infrastructure construction blasts) to derive low-resolution 3D P- and S-wave velocity models for the whole of Sweden. Including the blasts provides a more even geographical distribution of sources as well as good constraints on the locations. The resolution of the derived velocity models is in the 20 km range in the well-resolved areas. A fairly robust feature observed in the Vp/Vs ratio of the derived models is a difference between the Paleoproterozoic rocks belonging to the TIB (Transscandinavian Igneous Belt) and the Svecofennian rocks east and north of this region (a Vp/Vs ratio of about 1.72 prevails in the former compared to a value below 1.70 in the latter) at depths down to 15 km. All earthquakes occurring since 2000 have been relocated in the 3D velocity model. The results show very clear differences in how earthquakes occur in different parts of Sweden. In the north, north of approximately 64 degrees latitude, most earthquakes occur on or in the vicinity of the Holocene postglacial faults. From 64N to approximately 60N earthquake activity is concentrated along the northeast coastline, with some relation to the offset in the bedrock from the onshore area to the offshore Bay of Bothnia. In southern Sweden earthquake activity is more widely

  5. Assessing Earthquake-Induced Tree Mortality in Temperate Forest Ecosystems: A Case Study from Wenchuan, China

    DOE PAGES

    Zeng, Hongcheng; Lu, Tao; Jenkins, Hillary; ...

    2016-03-17

    Earthquakes can produce significant tree mortality, and consequently affect regional carbon dynamics. Unfortunately, detailed studies quantifying the influence of earthquakes on forest mortality are currently rare. The committed forest biomass carbon loss associated with the 2008 Wenchuan earthquake in China is assessed in this study by a synthetic approach that integrates field investigation, remote sensing analysis, empirical models and Monte Carlo simulation. The newly developed approach significantly improved the forest disturbance evaluation by quantitatively defining the earthquake impact boundary and using a detailed field survey to validate the mortality models. Based on our approach, a total biomass carbon of 10.9 Tg·C was lost in the Wenchuan earthquake, which offset 0.23% of the living biomass carbon stock in Chinese forests. Tree mortality was highly clustered at the epicenter and declined rapidly with distance away from the fault zone. It is suggested that earthquakes represent a significant driver of forest carbon dynamics, and that earthquake-induced biomass carbon loss should be included in estimates of forest carbon budgets.

  6. Assessing Earthquake-Induced Tree Mortality in Temperate Forest Ecosystems: A Case Study from Wenchuan, China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeng, Hongcheng; Lu, Tao; Jenkins, Hillary

    Earthquakes can produce significant tree mortality, and consequently affect regional carbon dynamics. Unfortunately, detailed studies quantifying the influence of earthquakes on forest mortality are currently rare. The committed forest biomass carbon loss associated with the 2008 Wenchuan earthquake in China is assessed in this study by a synthetic approach that integrates field investigation, remote sensing analysis, empirical models and Monte Carlo simulation. The newly developed approach significantly improved the forest disturbance evaluation by quantitatively defining the earthquake impact boundary and using a detailed field survey to validate the mortality models. Based on our approach, a total biomass carbon of 10.9 Tg·C was lost in the Wenchuan earthquake, which offset 0.23% of the living biomass carbon stock in Chinese forests. Tree mortality was highly clustered at the epicenter and declined rapidly with distance away from the fault zone. It is suggested that earthquakes represent a significant driver of forest carbon dynamics, and that earthquake-induced biomass carbon loss should be included in estimates of forest carbon budgets.

  7. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    NASA Astrophysics Data System (ADS)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

    Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, called CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions covering an area of Japan including the sea area, the Japanese mainland and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been carried out in total. These results provide new knowledge concerning statistical forecasting models. We have started a study to construct a 3-dimensional earthquake forecasting model for the Kanto district in Japan based on CSEP experiments under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity of the area ranges from shallow depths down to 80 km due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop models for forecasting based on the results of 2-D modeling. We defined the 3D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0 as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HISTETAS models (Ogata, 2011) to see whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). We use CSEP
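
    The number (N-)test in the CSEP suite compares the observed count of target earthquakes with the forecast rate; under the Poisson assumption commonly used in these tests, the two one-sided quantile scores can be sketched as below (Python/SciPy), with illustrative numbers.

      from scipy import stats

      def poisson_n_test(n_observed, n_forecast):
          """delta1 = P(N >= n_obs) and delta2 = P(N <= n_obs) for a Poisson
          forecast; very small values of either indicate inconsistency."""
          delta1 = 1.0 - stats.poisson.cdf(n_observed - 1, n_forecast)
          delta2 = stats.poisson.cdf(n_observed, n_forecast)
          return delta1, delta2

      # Example: 12 target earthquakes observed against a forecast rate of 7.5.
      print(poisson_n_test(12, 7.5))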

  8. Significance of stress transfer in time-dependent earthquake probability calculations

    USGS Publications Warehouse

    Parsons, T.

    2005-01-01

    A sudden change in stress is seen to modify earthquake rates, but should it also revise earthquake probability? The data used to derive input parameters permit an array of forecasts; so how large a static stress change is required to cause a statistically significant earthquake probability change? To answer that question, the effects of parameter and philosophical choices are examined through all phases of sample calculations. Drawing at random from distributions of recurrence-aperiodicity pairs identifies many that recreate long paleoseismic and historic earthquake catalogs. Probability density functions built from the recurrence-aperiodicity pairs give the range of possible earthquake forecasts under a point-process renewal model. Consequences of choices made in stress transfer calculations, such as different slip models, fault rake, dip, and friction, are tracked. For interactions among large faults, calculated peak stress changes may be localized, with most of the receiving fault area changed less than the mean. Thus, to avoid overstating probability change on segments, stress change values should be drawn from a distribution reflecting the spatial pattern rather than using the segment mean. Disparity resulting from interaction probability methodology is also examined. For a fault with a well-understood earthquake history, a minimum stress change to stressing rate ratio of 10:1 to 20:1 is required to significantly skew probabilities with >80-85% confidence. That ratio must be closer to 50:1 to exceed 90-95% confidence levels. Thus revision to earthquake probability is achievable when a perturbing event is very close to the fault in question or the tectonic stressing rate is low.
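
    One common simplified recipe for folding a static stress change into a renewal-model probability is sketched below (Python/SciPy): the stress step divided by the tectonic stressing rate is treated as a permanent clock advance added to the elapsed time of a Brownian Passage Time (inverse-Gaussian) model. This is not the full methodology of the paper, and all numbers are hypothetical.

      from scipy import stats

      def conditional_probability(t_elapsed, mean_recurrence, aperiodicity, horizon,
                                  stress_step=0.0, stressing_rate=1.0):
          """P(event within `horizon` | no event by t_elapsed) for a BPT renewal model,
          with a clock advance of stress_step/stressing_rate (same time units)."""
          shape = mean_recurrence / aperiodicity ** 2          # inverse-Gaussian shape
          dist = stats.invgauss(mu=aperiodicity ** 2, scale=shape)
          t_eff = t_elapsed + stress_step / stressing_rate     # clock-advanced elapsed time
          survival = dist.sf(t_eff)
          if survival == 0.0:
              return 1.0
          return (dist.cdf(t_eff + horizon) - dist.cdf(t_eff)) / survival

      # 150-yr mean recurrence, aperiodicity 0.5, 100 yr elapsed, 30-yr horizon,
      # 0.2 MPa stress step on a fault stressed at 0.01 MPa/yr.
      p_without = conditional_probability(100, 150, 0.5, 30)
      p_with = conditional_probability(100, 150, 0.5, 30, stress_step=0.2, stressing_rate=0.01)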

  9. Populating the Advanced National Seismic System Comprehensive Earthquake Catalog

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Perry, M. R.; Andrews, J. R.; Withers, M. M.; Hellweg, M.; Kim, W. Y.; Shiro, B.; West, M. E.; Storchak, D. A.; Pankow, K. L.; Huerfano Moreno, V. A.; Gee, L. S.; Wolfe, C. J.

    2016-12-01

    The U.S. Geological Survey maintains a repository of earthquake information produced by networks in the Advanced National Seismic System with additional data from the ISC-GEM catalog and many non-U.S. networks through their contributions to the National Earthquake Information Center PDE bulletin. This Comprehensive Catalog (ComCat) provides a unified earthquake product while preserving attribution and contributor information. ComCat contains hypocenter and magnitude information with supporting phase arrival-time and amplitude measurements (when available). Higher-level products such as focal mechanisms, earthquake slip models, "Did You Feel It?" reports, ShakeMaps, PAGER impact estimates, earthquake summary posters, and tectonic summaries are also included. ComCat is updated as new events are processed and the catalog can be accessed at http://earthquake.usgs.gov/earthquakes/search/. Throughout the past few years, a concentrated effort has been underway to expand ComCat by integrating global and regional historic catalogs. The number of earthquakes in ComCat has more than doubled in the past year and it presently contains over 1.6 million earthquake hypocenters. We will provide an overview of catalog contents and a detailed description of numerous tools and semi-automated quality-control procedures developed to uncover errors, including systematic magnitude biases, missing time periods, duplicate postings for the same events, and incorrectly associated events.
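
    ComCat can also be queried programmatically through the USGS FDSN event web service; a minimal sketch (Python, with illustrative parameter values) is given below.

      import requests

      params = {
          "format": "geojson",
          "starttime": "2015-01-01",
          "endtime": "2015-12-31",
          "minmagnitude": 6.5,
      }
      resp = requests.get("https://earthquake.usgs.gov/fdsnws/event/1/query",
                          params=params, timeout=30)
      resp.raise_for_status()
      events = resp.json()["features"]
      print(len(events), "events returned")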

  10. The Great East-Japan Earthquake and devastating tsunami: an update and lessons from the past Great Earthquakes in Japan since 1923.

    PubMed

    Ishigaki, Akemi; Higashi, Hikari; Sakamoto, Takako; Shibahara, Shigeki

    2013-04-01

    Japan has a long history of fighting against great earthquakes that cause structural damage/collapses, fires and/or tsunami. On March 11, 2011 at 14:46 (Friday), the Great East-Japan Earthquake (magnitude 9.0) struck the Tohoku region (northeastern Japan), which includes Sendai City. The earthquake generated a devastating tsunami, leading to unprecedented disasters (~18,500 victims) in coastal areas of Iwate, Miyagi and Fukushima prefectures, despite the fact that people living in the Tohoku region are well trained in tsunami-evacuation procedures, with the mindset of "Tsunami, ten-den-ko." This code means that each person should evacuate individually when an earthquake strikes. Sharing this rule, children and parents can escape separately from schools, houses or workplaces, without worrying about each other. The concept of ten-den-ko (individual evacuation) is helpful for people living in coastal areas of earthquake-prone zones around the world. It is also important to construct safe evacuation centers, because the March 11th tsunami killed people who had evacuated to designated evacuation sites. We summarize the current conditions of people living in the disaster-stricken areas, including the consequences of the Fukushima nuclear accident. We also describe the disaster responses as the publisher of the Tohoku Journal of Experimental Medicine (TJEM), located in Sendai, with online support from Tokyo. In 1923, the Great Kanto Earthquake (magnitude 7.9) evoked a massive fire that destroyed large areas of Tokyo (~105,000 victims), including the print company for TJEM, but the Wistar Institute printed three TJEM issues in 1923 in Philadelphia. Mutual aid relationships should be established between distant cities to survive future disasters.

  11. Some comparisons between mining-induced and laboratory earthquakes

    USGS Publications Warehouse

    McGarr, A.

    1994-01-01

    Although laboratory stick-slip friction experiments have long been regarded as analogs to natural crustal earthquakes, the potential use of laboratory results for understanding the earthquake source mechanism has not been fully exploited because of essential difficulties in relating seismographic data to measurements made in the controlled laboratory environment. Mining-induced earthquakes, however, provide a means of calibrating the seismic data in terms of laboratory results because, in contrast to natural earthquakes, the causative forces as well as the hypocentral conditions are known. A comparison of stick-slip friction events in a large granite sample with mining-induced earthquakes in South Africa and Canada indicates both similarities and differences between the two phenomena. The physics of unstable fault slip appears to be largely the same for both types of events. For example, both laboratory and mining-induced earthquakes have very low seismic efficiencies σa/σ̄, where σa is the apparent stress and σ̄ is the average stress acting on the fault plane to cause slip; nearly all of the energy released by faulting is consumed in overcoming friction. In more detail, the mining-induced earthquakes differ from the laboratory events in the behavior of σa/σ̄ as a function of seismic moment M0. Whereas for the laboratory events σa/σ̄ ≈ 0.06 independent of M0, σa/σ̄ depends quite strongly on M0 for each set of induced earthquakes, with 0.06 serving, apparently, as an upper bound. It seems most likely that this observed scaling difference is due to variations in slip distribution over the fault plane. In the laboratory, a stick-slip event entails homogeneous slip over a fault of fixed area. For each set of induced earthquakes, the fault area appears to be approximately fixed but the slip is inhomogeneous due presumably to barriers (zones of no slip) distributed over the fault plane; at constant σ̄, larger
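
    For reference, the apparent stress discussed above is commonly computed as σa = μ Es/M0 from the radiated energy Es and the seismic moment M0; the sketch below (Python) uses hypothetical values, including an assumed average stress σ̄, which is not directly observable.

      mu = 3.0e10          # shear modulus, Pa
      Es = 1.0e9           # radiated seismic energy, J
      M0 = 5.0e14          # seismic moment, N*m
      sigma_bar = 2.0e6    # assumed average stress driving slip, Pa

      sigma_a = mu * Es / M0                 # apparent stress, Pa
      ratio = sigma_a / sigma_bar            # the efficiency-like ratio discussed above
      print(f"sigma_a = {sigma_a / 1e6:.2f} MPa, sigma_a/sigma_bar = {ratio:.3f}")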

  12. Nurse willingness to report for work in the event of an earthquake in Israel.

    PubMed

    Ben Natan, Merav; Nigel, Simon; Yevdayev, Innush; Qadan, Mohamad; Dudkiewicz, Mickey

    2014-10-01

    To examine variables affecting nurse willingness to report for work in the event of an earthquake in Israel and whether this can be predicted through the Theory of Self-Efficacy. The nursing profession has a major role in preparing for earthquakes. Nurse willingness to report to work in the event of an earthquake has never before been examined. Self-administered questionnaires were distributed among a convenience sample of 400 nurses and nursing students in Israel during January-April 2012. High willingness to report to work in the event of an earthquake was declared by 57% of respondents. High perceived self-efficacy, level of knowledge and experience predict willingness to report to work in the event of an earthquake. Multidisciplinary collaboration and support was also cited as a meaningful factor. Perceived self-efficacy, level of knowledge, experience and the support of a multidisciplinary staff affect nurse willingness to report to work in the event of an earthquake. Nurse managers can identify factors that increase nurse willingness to report to work in the event of an earthquake and consequently develop strategies for more efficient management of their nursing workforce. © 2013 John Wiley & Sons Ltd.

  13. Global Instrumental Seismic Catalog: earthquake relocations for 1900-present

    NASA Astrophysics Data System (ADS)

    Villasenor, A.; Engdahl, E.; Storchak, D. A.; Bondar, I.

    2010-12-01

    We present the current status of our efforts to produce a set of homogeneous earthquake locations and improved focal depths towards the compilation of a Global Catalog of instrumentally recorded earthquakes that will be complete down to the lowest magnitude threshold possible on a global scale and for the time period considered. This project is currently being carried out under the auspices of GEM (Global Earthquake Model). The resulting earthquake catalog will be a fundamental dataset not only for earthquake risk modeling and assessment on a global scale, but also for a large number of studies such as global and regional seismotectonics; the rupture zones and return time of large, damaging earthquakes; the spatial-temporal pattern of moment release along seismic zones and faults etc. Our current goal is to re-locate all earthquakes with available station arrival data using the following magnitude thresholds: M5.5 for 1964-present, M6.25 for 1918-1963, M7.5 (complemented with significant events in continental regions) for 1900-1917. Phase arrival time data for earthquakes after 1963 are available in digital form from the International Seismological Centre (ISC). For earthquakes in the time period 1918-1963, phase data is obtained by scanning the printed International Seismological Summary (ISS) bulletins and applying optical character recognition routines. For earlier earthquakes we will collect phase data from individual station bulletins. We will illustrate some of the most significant results of this relocation effort, including aftershock distributions for large earthquakes, systematic differences in epicenter and depth with respect to previous location, examples of grossly mislocated events, etc.

  14. Earthquake-induced burial of archaeological sites along the southern Washington coast about A.D. 1700

    USGS Publications Warehouse

    Cole, S.C.; Atwater, B.F.; McCutcheon, P.T.; Stein, J.K.; Hemphill-Haley, E.

    1996-01-01

    Although inhabited by thousands of people when first reached by Europeans, the Pacific coast of southern Washington has little recognized evidence of prehistoric human occupation. This apparent contradiction may be explained partly by geologic evidence for coastal submergence during prehistoric earthquakes on the Cascadia subduction zone. Recently discovered archaeological sites, exposed in the banks of two tidal streams, show evidence for earthquake-induced submergence and consequent burial by intertidal mud about A.D. 1700. We surmise that, because of prehistoric earthquakes, other archaeological sites may now lie hidden beneath the surfaces of modern tidelands. Such burial of archaeological sites raises questions about the estimation of prehistoric human population densities along coasts subject to earthquake-induced submergence. © 1996 John Wiley & Sons, Inc.

  15. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  16. Global assessment of human losses due to earthquakes

    USGS Publications Warehouse

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Current studies have demonstrated a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses, and location of the most affected regions. Recent developments in global and uniform datasets such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.
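
    A schematic sketch (Python) of the hazard-times-vulnerability-times-exposure arithmetic behind an average-annual-loss figure is given below; the hazard curve, fatality ratios and population are placeholders, not values from the study.

      # Annual rates of exceeding each intensity level (hazard curve), fatality ratio
      # per exposed person at each level, and the exposed population (all invented).
      annual_exceedance = [1e-1, 3e-2, 6e-3, 8e-4]
      fatality_ratio = [1e-6, 1e-5, 4e-4, 3e-3]
      population = 5_000_000

      aal = 0.0
      for i, ratio in enumerate(fatality_ratio):
          next_rate = annual_exceedance[i + 1] if i + 1 < len(annual_exceedance) else 0.0
          rate_in_bin = annual_exceedance[i] - next_rate   # rate of shaking in this bin
          aal += rate_in_bin * ratio * population
      print(f"Average annual fatalities: {aal:.1f}")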

  17. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  18. Are you prepared for the next big earthquake in Alaska?

    USGS Publications Warehouse

    2006-01-01

    Scientists have long recognized that Alaska has more earthquakes than any other region of the United States and is, in fact, one of the most seismically active areas of the world. The second-largest earthquake ever recorded shook the heart of southern Alaska on March 27th, 1964. The largest strike-slip earthquake in North America in almost 150 years occurred on the Denali Fault in central Alaska on November 3rd, 2002. “Great” earthquakes (larger than magnitude 8) have rocked the state on average once every 13 years since 1900. It is only a matter of time before another major earthquake will impact a large number of Alaskans. Alaska has changed significantly since the damaging 1964 earthquake, and the population has more than doubled. Many new buildings are designed to withstand intense shaking, some older buildings have been reinforced, and development has been discouraged in some particularly hazardous areas. Despite these precautions, future earthquakes may still cause damage to buildings, displace items within buildings, and disrupt the basic utilities that we take for granted. We must take every reasonable action to prepare for damaging earthquakes in order to lower these risks.

  19. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.

  20. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    NASA Astrophysics Data System (ADS)

    Wyss, Max

    2013-04-01

    being incorrect for scientific reasons and here I argue that it is also ineffective for psychological reasons. Instead of being calmed, or being given hazard estimates that the GSHAP approach underestimates in strongly active areas, people should be told quantitatively the consequences of a reasonable worst case and be motivated to prepare for it, whether or not it may hit the present or the next generation. In a worst case scenario for L'Aquila, the number of expected fatalities and injuries should have been calculated for an event in the range of M6.5 to M7, as I did for a civil defense exercise in Umbria, Italy. With the prospect that approximately 500 people may die in an earthquake in the immediate or distant future, some residents might have built themselves an earthquake closet (similar to a simple tornado shelter) in a corner of their apartment, into which they might have dashed to safety at the onset of the P-wave before the destructive S-wave arrived. I conclude that in earthquake-prone areas quantitative loss estimates for a reasonable worst case earthquake should replace probabilistic hazard and risk estimates. This is a service that experts owe the community. Insurance companies and academics may still find use for probabilistic estimates of losses, especially in areas of low seismic hazard, where the worst case scenario approach is less appropriate.

  1. PERFORMANCE OF AN EARTHQUAKE EXCITED ROOF DIAPHRAGM.

    USGS Publications Warehouse

    Celebi, M.; Brady, G.; Safak, E.; Converse, A.; ,

    1986-01-01

    The objective of this paper is to study the earthquake performance of the roof diaphragm of the West Valley College gymnasium in Saratoga, California through a complete set of acceleration records obtained during the 24 April 1984 Morgan Hill Earthquake (M = 6.1). The roof diaphragm of the 112 ft × 144 ft rectangular, symmetric gymnasium consists of 3/8 in. plywood over tongue-and-groove sheathing attached to steel trusses supported by reinforced concrete columns and walls. Three sensors placed in the direction of each of the axes of the diaphragm facilitate the evaluation of in-plane deformation of the diaphragm. Other sensors placed at ground level measure vertical and horizontal motion of the building floor, and consequently allow the calculation of the relative motion of the diaphragm with respect to the ground level.

  2. U.S. Geological Survey (USGS) Earthquake Web Applications

    NASA Astrophysics Data System (ADS)

    Fee, J.; Martinez, E.

    2015-12-01

    USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, either near-real time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open source projects on github, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
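
    As a small illustration of the web service mentioned above, the sketch below queries the public FDSN event endpoint for the comprehensive catalog and prints a few events; the time window and magnitude cut-off are arbitrary example values.

        import json
        import urllib.parse
        import urllib.request

        # Query the comprehensive catalog through the public event web service.
        # The parameter values (time window, magnitude threshold) are examples.
        params = urllib.parse.urlencode({
            "format": "geojson",
            "starttime": "2015-01-01",
            "endtime": "2015-01-08",
            "minmagnitude": 5.0,
        })
        url = "https://earthquake.usgs.gov/fdsnws/event/1/query?" + params

        with urllib.request.urlopen(url) as response:
            catalog = json.load(response)

        # Each GeoJSON feature carries the preferred magnitude, place and time.
        for feature in catalog["features"][:5]:
            props = feature["properties"]
            print(props["time"], props["mag"], props["place"])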

  3. Earthquakes and building design: a primer for the laboratory animal professional.

    PubMed

    Vogelweid, Catherine M; Hill, James B; Shea, Robert A; Johnson, Daniel B

    2005-01-01

    Earthquakes can occur in most regions of the United States, so it might be necessary to reinforce vulnerable animal facilities to better protect research animals during these unpredictable events. A risk analysis should include an evaluation of the seismic hazard risk at the proposed building site balanced against the estimated consequences of losses. Risk analysis can help in better justifying and recommending to building owners the costs of incorporating additional seismic reinforcements. The planning team needs to specify the level of post-earthquake building function that is desired in the facility, and then design the facility to achieve that level.

  4. A prospective earthquake forecast experiment for Japan

    NASA Astrophysics Data System (ADS)

    Yokoi, Sayoko; Nanjo, Kazuyoshi; Tsuruoka, Hiroshi; Hirata, Naoshi

    2013-04-01

    One major focus of the current Japanese earthquake prediction research program (2009-2013) is to move toward creating testable earthquake forecast models. For this purpose we started an experiment of forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan, and to conduct verifiable prospective tests of their model performance. On 1 November 2009, we started the first earthquake forecast testing experiment for the Japan area. We use the unified catalogue compiled by the Japan Meteorological Agency (JMA) as the authorized catalogue. The experiment consists of 12 categories, combining 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called All Japan, Mainland, and Kanto. A total of 91 models were submitted to CSEP-Japan and are evaluated with the official CSEP suite of tests of forecast performance. In this presentation, we show the results of five rounds of the 3-month testing class. The HIST-ETAS7pa, MARFS and RI10K models showed the best scores, based on total log-likelihood, for the All Japan, Mainland and Kanto regions, respectively. It also became clear that time dependency of model parameters is not an effective factor for passing the CSEP consistency tests in the 3-month testing class in any region. In particular, the spatial consistency test was difficult to pass in the All Japan region because multiple events fell in a single bin. The number of target events per round in the Mainland region tended to be smaller than the models' expectations in all rounds, which resulted in rejections in the consistency test because of overestimation. In the Kanto region, the pass ratio of the consistency tests for each model exceeded 80%, which was associated with well-balanced forecasting of event
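
    As a rough illustration of the kind of scoring used in such experiments, the sketch below computes the joint Poisson log-likelihood of an observed count map given a gridded forecast, together with a simple number (N) test. The forecast and observation arrays are invented, and the tests are simplified relative to the official CSEP suite.

        import numpy as np
        from scipy.stats import poisson

        def joint_log_likelihood(forecast, observed):
            """Joint log-likelihood of observed bin counts under a Poisson forecast."""
            return np.sum(poisson.logpmf(observed, forecast))

        def n_test(forecast, observed):
            """Simple number test: probabilities of a total count at least as low
            and at least as high as observed, given the forecast's total rate."""
            n_fore, n_obs = forecast.sum(), observed.sum()
            return poisson.cdf(n_obs, n_fore), poisson.sf(n_obs - 1, n_fore)

        # Hypothetical 3-month forecast (expected counts per cell) and observation.
        forecast = np.array([0.2, 0.5, 1.1, 0.05, 0.3])
        observed = np.array([0, 1, 2, 0, 0])

        print("joint log-likelihood:", joint_log_likelihood(forecast, observed))
        print("N-test (P(N<=obs), P(N>=obs)):", n_test(forecast, observed))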

  5. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and about one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model losses would not be indemnified directly; instead, payouts would be calculated on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  6. Fractal dynamics of earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bak, P.; Chen, K.

    1995-05-01

    Many objects in nature, from mountain landscapes to electrical breakdown and turbulence, have a self-similar fractal spatial structure. It seems obvious that to understand the origin of self-similar structures, one must understand the nature of the dynamical processes that created them: temporal and spatial properties must necessarily be completely interwoven. This is particularly true for earthquakes, which have a variety of fractal aspects. The distribution of energy released during earthquakes is given by the Gutenberg-Richter power law. The distribution of epicenters appears to be fractal with dimension D ≈ 1-1.3. The number of aftershocks decays as a function of time according to the Omori power law. There have been several attempts to explain the Gutenberg-Richter law by starting from a fractal distribution of faults or stresses. But this is a chicken-and-egg approach: to explain the Gutenberg-Richter law, one assumes the existence of another power law, the fractal distribution. The authors present results of a simple stick-slip model of earthquakes, which evolves to a self-organized critical state. Emphasis is on demonstrating that empirical power laws for earthquakes indicate that the Earth's crust is at the critical state, with no typical time, space, or energy scale. Of course the model is tremendously oversimplified; however, in analogy with equilibrium phenomena, they do not expect criticality to depend on details of the model (universality).

  7. Land subsidence of clay deposits after the Tohoku-Pacific Ocean Earthquake

    NASA Astrophysics Data System (ADS)

    Yasuhara, K.; Kazama, M.

    2015-11-01

    Extensive infrastructure collapse resulted from the cataclysmic earthquake that struck off the eastern coast of Japan on 11 March 2011 and from its consequent gigantic tsunami, affecting not only the Tohoku region but also the Kanto region. Among the geological and geotechnical processes observed, land subsidence occurring in both coastal and inland areas and from Tohoku to Kanto is an extremely important issue that must be examined carefully. This land subsidence is classifiable into three categories: (i) land sinking along the coastal areas because of tectonic movements, (ii) settlement of sandy deposits following liquefaction, and (iii) long-term post-earthquake recompression settlement in soft clay caused by dissipation of excess pore pressure. This paper describes two case histories of post-earthquake settlement of clay deposits from among the three categories of ground sinking and land subsidence because such settlement has been frequently overlooked in numerous earlier earthquakes. Particularly, an attempt is made to propose a methodology for predicting such settlement and for formulating remedial or responsive measures to mitigate damage from such settlement.

  8. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    NASA Astrophysics Data System (ADS)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing exponentially with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatran earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This relationship is reasonable, considering the well-known dependence of the b-value on stress. This suggests that the probability of a tiny rock failure expanding into a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
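
    The b-value dependence described here can be estimated from binned catalogue magnitudes with the standard Aki-Utsu maximum-likelihood estimator; the sketch below uses a synthetic Gutenberg-Richter sample and is a generic illustration, not the authors' processing chain.

        import numpy as np

        def b_value_aki_utsu(mags, m_min, dm=0.1):
            """Aki-Utsu maximum-likelihood b-value for magnitudes >= m_min,
            with Utsu's correction for magnitude binning of width dm."""
            m = np.asarray(mags)
            m = m[m >= m_min]
            mean_excess = m.mean() - (m_min - dm / 2.0)
            return np.log10(np.e) / mean_excess

        # Synthetic Gutenberg-Richter sample with a true b-value of 1.0,
        # complete above the bin edge 3.95 and binned to 0.1 magnitude units.
        rng = np.random.default_rng(0)
        mags = np.round(3.95 + rng.exponential(scale=1.0 / np.log(10), size=5000), 1)

        print("estimated b-value:", round(b_value_aki_utsu(mags, m_min=4.0), 2))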

  9. Robust method to detect and locate local earthquakes by means of amplitude measurements.

    NASA Astrophysics Data System (ADS)

    del Puy Papí Isaba, María; Brückl, Ewald

    2016-04-01

    In this study we present a robust new method to detect and locate medium and low magnitude local earthquakes. This method is based on an empirical model of the ground motion obtained from amplitude data of earthquakes in the area of interest, which were located using traditional methods. The first step of our method is the computation of maximum resultant ground velocities in sliding time windows covering the whole period of interest. In the second step, these maximum resultant ground velocities are back-projected to every point of a grid covering the whole area of interest while applying the empirical amplitude-distance relations. We refer to these back-projected ground velocities as pseudo-magnitudes. The number of operating seismic stations in the local network equals the number of pseudo-magnitudes at each grid-point. Our method introduces the new idea of selecting the minimum pseudo-magnitude at each grid-point for further analysis instead of searching for a minimum of the L2 or L1 norm. In cases where no detectable earthquake occurred, the spatial distribution of the minimum pseudo-magnitudes constrains the magnitude of weak earthquakes hidden in the ambient noise. In the case of a detectable local earthquake, the spatial distribution of the minimum pseudo-magnitudes shows a significant maximum at the grid-point nearest to the actual epicenter. The application of our method is restricted to the area confined by the convex hull of the seismic station network. Additionally, one must ensure that there are no dead traces involved in the processing. Compared to methods based on L2 and even L1 norms, our new method is almost wholly insensitive to outliers (data from locally disturbed seismic stations). A further advantage is the fast determination of the epicenter and magnitude of a seismic event located within a seismic network. This is possible due to the method of obtaining and storing a back-projected matrix, independent of the registered amplitude, for each seismic
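
    A stripped-down version of the back-projection idea reads as follows: each station's peak resultant velocity is converted, through an assumed amplitude-distance relation, into a pseudo-magnitude at every grid point, the minimum over stations is kept at each point, and the epicentre estimate is the grid point where that minimum is largest. The attenuation coefficients, station geometry and amplitudes below are invented placeholders, not the empirical model calibrated by the authors.

        import numpy as np

        # Assumed empirical relation log10(v_max) = C0 + C1*M - C2*log10(dist),
        # inverted for M. Coefficients are placeholders for illustration only.
        C0, C1, C2 = -4.0, 1.0, 1.5

        def pseudo_magnitude(v_max, dist_km):
            """Magnitude that would produce peak velocity v_max (m/s) at dist_km."""
            return (np.log10(v_max) - C0 + C2 * np.log10(dist_km)) / C1

        # Station coordinates (km) and peak resultant velocities in one time window.
        stations = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0], [45.0, 35.0]])
        v_max = np.array([2.0e-4, 8.0e-5, 5.0e-5, 1.2e-4])

        # Grid covering the network; keep the MINIMUM pseudo-magnitude per node.
        xs, ys = np.meshgrid(np.arange(0.0, 50.0, 1.0), np.arange(0.0, 50.0, 1.0))
        grid = np.stack([xs.ravel(), ys.ravel()], axis=1)

        dists = np.linalg.norm(grid[:, None, :] - stations[None, :, :], axis=2)
        dists = np.maximum(dists, 1.0)            # avoid log10(0) at a station
        min_pm = pseudo_magnitude(v_max[None, :], dists).min(axis=1)

        best = grid[np.argmax(min_pm)]
        print("epicentre estimate (km):", best, "magnitude ~", round(min_pm.max(), 2))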

  10. Long-term predictability of regions and dates of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    Results on the long-term predictability of strong earthquakes are discussed. It is shown that dates of earthquakes with M>5.5 could be determined several months in advance of the event. The magnitude and the region of an approaching earthquake could be specified within a month before the event. Determination of the number of M6+ earthquakes expected to occur during the analyzed year is performed using a special sequence diagram of seismic activity for the century time frame. This analysis could be performed with a lead time of 15-20 years. The data are verified by a monthly sequence diagram of seismic activity. The number of strong earthquakes expected to occur in the analyzed month is determined by several methods having different prediction horizons. Determination of days of potential earthquakes with M5.5+ is performed using astronomical data. Earthquakes occur on days of oppositions of Solar System planets (arranged in a single line). Moreover, the strongest earthquakes occur when the vector "Sun-Solar System barycenter" lies in the ecliptic plane. Details of this astronomical multivariate indicator still require further research, but its practical significance is confirmed by practice. Another empirical indicator of an approaching M6+ earthquake is a synchronous variation of meteorological parameters: an abrupt decrease of the minimum daily temperature, an increase of relative humidity, and an abrupt change of atmospheric pressure (the RAMES method). The time difference between the predicted and actual date is no more than one day. This indicator is registered 104 days before the earthquake, so it was called Harmonic 104 or H-104. This fact looks paradoxical, but the works of A. Sytinskiy and V. Bokov on the correlation of global atmospheric circulation and seismic events give a physical basis for this empirical fact. Also, 104 days is a quarter of a Chandler period, so this fact gives insight into the correlation between the anomalies of Earth orientation

  11. A prospective earthquake forecast experiment in the western Pacific

    NASA Astrophysics Data System (ADS)

    Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan

    2012-09-01

    Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
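
    A schematic version of the forecast comparison is to compute, for each target earthquake or testing period, the log-likelihood difference (information gain) between two models and then apply a paired Student's t-test and a Wilcoxon signed-rank test to those differences. The gains below are synthetic, and the procedure is simplified relative to the published evaluation.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Synthetic per-event log-likelihood gains of model A over model B
        # (positive values favour model A). Purely illustrative numbers.
        info_gain = rng.normal(loc=0.15, scale=0.6, size=40)

        t_stat, t_p = stats.ttest_1samp(info_gain, popmean=0.0)
        w_stat, w_p = stats.wilcoxon(info_gain)

        print(f"mean information gain per event: {info_gain.mean():.3f}")
        print(f"paired t-test:        t = {t_stat:.2f}, p = {t_p:.3f}")
        print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.3f}")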

  12. Understanding earthquake from the granular physics point of view — Causes of earthquake, earthquake precursors and predictions

    NASA Astrophysics Data System (ADS)

    Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    We treat the earth's crust and mantle as large-scale discrete matter based on the principles of granular physics and existing experimental observations. The main outcomes are: A granular model of the structure and movement of the earth's crust and mantle is established. The formation mechanism of the tectonic forces, which cause earthquakes, and a model of propagation for precursory information are proposed. Properties of the seismic precursory information and its relevance to earthquake occurrence are illustrated, and the principle of ways to detect effective seismic precursors is elaborated. The mechanism of deep-focus earthquakes is also explained by the jamming-unjamming transition of granular flow. Some earthquake phenomena which were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the earth's crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors and predictions, and a new understanding, different from the traditional seismological viewpoint, is obtained.

  13. Large Earthquakes Disrupt Groundwater System by Breaching Aquitards

    NASA Astrophysics Data System (ADS)

    Wang, C. Y.; Manga, M.; Liao, X.; Wang, L. P.

    2016-12-01

    Changes to groundwater systems caused by large earthquakes are widely recognized. Some changes have been attributed to increases in the vertical permeability, but basic questions remain: How do increases in the vertical permeability occur? How frequently do they occur? How fast does the vertical permeability recover after the earthquake? Is there a quantitative measure for detecting the occurrence of aquitard breaching? Here we attempt to answer these questions by examining data accumulated in the past 15 years. Analyses of increased stream discharges and their geochemistry after large earthquakes show evidence that the excess water originates from groundwater released from high elevations by a large increase of the vertical permeability. Water-level data from a dense network of clustered wells in a sedimentary basin near the epicenter of the 1999 M7.6 Chi-Chi earthquake in western Taiwan show that, while most confined aquifers remained confined after the earthquake, about 10% of the clustered wells show evidence of coseismic breaching of aquitards and a great increase of the vertical permeability. Water levels in wells without evidence of coseismic breaching of aquitards show similar tidal responses before and after the earthquake; wells with evidence of coseismic breaching of aquitards, on the other hand, show distinctly different tidal responses before and after the earthquake, and the aquifers became hydraulically connected for many months thereafter. Breaching of aquitards by large earthquakes has significant implications for a number of societal issues such as the safety of water resources, the security of underground waste repositories, and the production of oil and gas. The method demonstrated here may be used for detecting the occurrence of aquitard breaching by large earthquakes in other seismically active areas.

  14. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  15. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed Central

    Grant, Rachel A.; Conlan, Hilary

    2013-01-01

    juvenile animals migrating away from their breeding pond after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre), and were probably coincidental. Statistical analysis of the data indicated that frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised. PMID:26479746

  16. SLAMMER: Seismic LAndslide Movement Modeled using Earthquake Records

    USGS Publications Warehouse

    Jibson, Randall W.; Rathje, Ellen M.; Jibson, Matthew W.; Lee, Yong W.

    2013-01-01

    This program is designed to facilitate conducting sliding-block analysis (also called permanent-deformation analysis) of slopes in order to estimate slope behavior during earthquakes. The program allows selection from among more than 2,100 strong-motion records from 28 earthquakes and allows users to add their own records to the collection. Any number of earthquake records can be selected using a search interface that selects records based on desired properties. Sliding-block analyses, using any combination of rigid-block (Newmark), decoupled, and fully coupled methods, are then conducted on the selected group of records, and results are compiled in both graphical and tabular form. Simplified methods for conducting each type of analysis are also included.
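
    The rigid-block (Newmark) analysis offered by the program can be sketched in a few lines: whenever the ground acceleration exceeds the critical (yield) acceleration of the slope, the excess acceleration is integrated to a sliding velocity and then to a permanent displacement. The synthetic acceleration trace and yield acceleration below are placeholders, and the decoupled and fully coupled analyses are not shown.

        import numpy as np

        def newmark_displacement(acc, dt, a_crit):
            """Rigid-block Newmark analysis: integrate ground acceleration in excess
            of the critical acceleration while the block slides (one direction only).
            acc and a_crit are in m/s^2, dt in seconds; returns displacement in m."""
            vel, disp = 0.0, 0.0
            for a in acc:
                excess = a - a_crit
                if vel > 0.0 or excess > 0.0:          # block is sliding or starts to slide
                    vel = max(vel + excess * dt, 0.0)  # sliding stops at zero velocity
                    disp += vel * dt
            return disp

        # Synthetic "ground motion": a few seconds of decaying sinusoidal shaking.
        dt = 0.01
        t = np.arange(0.0, 10.0, dt)
        acc = 2.5 * np.sin(2.0 * np.pi * 1.5 * t) * np.exp(-0.2 * t)

        print(f"permanent displacement: {newmark_displacement(acc, dt, a_crit=1.0):.3f} m")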

  17. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example of the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a 10,000-year set of events consisting of some 84,000 earthquakes. The optimization model is then performed multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return period error-weight is also assessed. The methodology could shorten the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
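
    The catalogue-simulation step can be illustrated with a minimal Monte-Carlo sketch: event times are drawn from a Poisson process, magnitudes from a truncated Gutenberg-Richter distribution by inverse-transform sampling, and epicentres uniformly within a source zone. The rate, b-value and bounds are invented placeholders, and the mixed-integer scenario-reduction step is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical source-zone parameters (placeholders only).
        YEARS = 10_000                    # catalogue duration
        RATE_M4 = 0.8                     # mean annual number of M >= 4 events
        B_VALUE = 1.0
        M_MIN, M_MAX = 4.0, 7.5
        LON_BOUNDS, LAT_BOUNDS = (50.0, 53.0), (34.0, 37.0)

        n_events = rng.poisson(RATE_M4 * YEARS)

        # Truncated Gutenberg-Richter magnitudes via inverse-transform sampling.
        beta = B_VALUE * np.log(10.0)
        u = rng.random(n_events)
        mags = M_MIN - np.log(1.0 - u * (1.0 - np.exp(-beta * (M_MAX - M_MIN)))) / beta

        catalogue = {
            "time_yr": np.sort(rng.uniform(0.0, YEARS, n_events)),
            "mag": mags,
            "lon": rng.uniform(*LON_BOUNDS, n_events),
            "lat": rng.uniform(*LAT_BOUNDS, n_events),
        }
        print(f"{n_events} synthetic events, largest M = {mags.max():.2f}")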

  18. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Strong Ground Motion

    USGS Publications Warehouse

    Borcherdt, Roger D.

    1994-01-01

    Strong ground motion generated by the Loma Prieta, Calif., earthquake (MS~7.1) of October 17, 1989, resulted in at least 63 deaths, more than 3,757 injuries, and damage estimated to exceed $5.9 billion. Strong ground motion severely damaged critical lifelines (freeway overpasses, bridges, and pipelines), caused severe damage to poorly constructed buildings, and induced a significant number of ground failures associated with liquefaction and landsliding. It also caused a significant proportion of the damage and loss of life at distances as far as 100 km from the epicenter. Consequently, understanding the characteristics of the strong ground motion associated with the earthquake is fundamental to understanding the earthquake's devastating impact on society. The papers assembled in this chapter address this problem. Damage to vulnerable structures from the earthquake varied substantially with the distance from the causative fault and the type of underlying geologic deposits. Most of the damage and loss of life occurred in areas underlain by 'soft soil'. Quantifying these effects is important for understanding the tragic concentrations of damage in such areas as Santa Cruz and the Marina and Embarcadero Districts of San Francisco, and the failures of the San Francisco-Oakland Bay Bridge and the Interstate Highway 880 overpass. Most importantly, understanding these effects is a necessary prerequisite for improving mitigation measures for larger earthquakes likely to occur much closer to densely urbanized areas in the San Francisco Bay region. The earthquake generated an especially important data set for understanding variations in the severity of strong ground motion. Instrumental strong-motion recordings were obtained at 131 sites located from about 6 to 175 km from the rupture zone. This set of recordings, the largest yet collected for an event of this size, was obtained from sites on various geologic deposits, including a unique set on 'soft soil' deposits

  19. Earthquake warning system for Japan Railways’ bullet train; implications for disaster prevention in California

    USGS Publications Warehouse

    Nakamura, Y.; Tucker, B. E.

    1988-01-01

    Today, Japanese society is well aware of the prediction of the Tokai earthquake. It is estimated by the Tokyo municipal government that this predicted earthquake could kill 30,000 people. (This estimate is viewed by many as conservative; other Japanese government agencies have made estimates but they have not been published.) The reduction in the number of deaths from 120,000 to 30,000 between the Kanto earthquake and the predicted Tokai earthquake is due in large part to the reduction in the proportion of wooden construction (houses).

  20. Earthquake history of the Republic of Ragusa (today Dubrovnik, Croatia) (Invited)

    NASA Astrophysics Data System (ADS)

    Albini, P.; Rovida, A.; Locati, M.

    2009-12-01

    Among the towns constellating the Dalmatian coast, Ragusa (today Dubrovnik, Croatia) stands out, both because of its location in the middle of the Eastern Adriatic coast and because of its long-lasting history as an independent Modern Age town with a small coastal territory. An important intelligence crossroads, squeezed as it was between powerful and influential neighbours such as the Ottoman Empire and the Republic of Venice, the Republic of Ragusa did experience heavily damaging earthquakes during its history (1358-1808). We narrate the story of these earthquakes, which were recorded in the historical documentation of the Republic (today stored at the State Archives of Dubrovnik - Drzavni arhiv u Dubrovniku) as well as in documents from officers of other Mediterranean countries and letters of individuals. Of special note is the 6 April 1667 earthquake, which inflicted a permanent scar on the Republic. The earthquake's direct effects and their consequences caused a serious financial crisis, so critical that it took over 50 years for Ragusa to recover. This large earthquake is reappraised on the basis of newly investigated sources, and effects of the damage within the city walls are detailed. A seismic history of Ragusa is finally proposed, supported by full-text coeval records.

  1. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
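
    A toy version of the detection idea is a simple rate threshold: count keyword tweets in a trailing one-minute window and declare a detection when the count exceeds an assumed threshold well above the background chatter. The timestamps and threshold below are synthetic; an operational system must also handle geolocation, multiple languages and non-earthquake noise.

        from collections import deque

        def detect_spike(tweet_times, window_s=60.0, threshold=20):
            """Return the first time at which the number of keyword tweets in the
            trailing window reaches `threshold` (an assumed, tunable count)."""
            window = deque()
            for t in tweet_times:                 # timestamps in seconds, ascending
                window.append(t)
                while window and window[0] < t - window_s:
                    window.popleft()
                if len(window) >= threshold:
                    return t
            return None

        # Synthetic stream: sparse background chatter, then a burst after an event.
        background = [10.0, 900.0, 2500.0]
        burst = [3000.0 + 0.4 * i for i in range(200)]   # roughly 150 tweets/minute
        print("detection at t =", detect_spike(background + burst), "s")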

  2. Lessons learned from the 12 May 2008 Wenchuan earthquake: Impact on industry

    NASA Astrophysics Data System (ADS)

    Krausmann, E.; Cruz, A. M.; Affeltranger, B.

    2009-04-01

    The earthquake that shook Wenchuan County in China's Sichuan Province on 12 May 2008 was a major event with a moment magnitude of MW = 7.9 and a depth of only 19 km. It caused a fault rupture 270 km long and affected a total area of about 500,000 km2. With the intensity reaching XI in the region near the epicentre and peak ground acceleration values as high as 0.63g, the earthquake killed almost 70,000 people, injured over 374,000 and rendered 5,000,000 homeless. Over 18,000 are still listed as missing. Prior to the earthquake the area was considered a region of moderate seismicity with a design intensity of 7. Sichuan Province is home to a significant proportion of China's chemical and nuclear industry and consequently has a very strong economy. The direct economic loss due to the earthquake amounts to over 1.1 billion Euros. In addition to economic damage there is also concern about earthquake-triggered damage to and destruction of industrial facilities housing or processing hazardous substances and the potential consequences of their release to people or the environment. In order to understand how well the chemical industry fared in the earthquake-affected areas, a reconnaissance field trip was organised from 15-21 November 2008, which included visits to industry in Deyang, Shifang, Mianzhu, Mianyang, Anxian and Dujiangyan. In total we collected information on earthquake effects at 18 industrial facilities. Lessons learned from this reconnaissance field trip confirm the devastating consequences that natural disasters can have on industrial facilities. In addition to casualties and environmental harm, the economic losses due to damage, prolonged shut-down periods and business interruption are often ruinous and may result in layoffs of workers. In the case of the visited facilities the shut-down time was up to 6 months. Two facilities were damaged beyond repair and suffered significant releases of ammonia, sulphuric acid and other substances that, in addition to

  3. Limits on great earthquake size at subduction zones

    NASA Astrophysics Data System (ADS)

    McCaffrey, R.

    2012-12-01

    Subduction zones are where the world's greatest earthquakes occur, owing to the large fault area available to slip. Yet some subduction zones are thought to be immune from these massive events, with quake size limited by some physical processes or properties. Accordingly, the size of the 2011 Tohoku-oki Mw 9.0 earthquake caught some in the earthquake research community by surprise. Expectations about these massive quakes have in the past been driven by reliance on our short, incomplete history of earthquakes and causal relationships derived from it. The logic applied is that if a great earthquake has not happened in the past, that we know of, one cannot happen in the future. Using the ~100-year global seismological history of earthquakes, in some cases extended with geologic observations, relationships between maximum earthquake sizes and other properties of subduction zones have been suggested, leading to the notion that some subduction zones, like the Japan Trench, would never produce a magnitude ~9 event. Empirical correlations of earthquake behavior with other subduction parameters can give false-positive results when the data are incomplete or incorrect, when sample sizes are small, and when numerous attributes are examined. Given the multi-century return times of the greatest earthquakes, our ignorance of those return times and our relatively limited temporal observation span (in most places), I suggest that we cannot yet rule out great earthquakes at any subduction zone. Alternatively, using the length of a subduction zone that is available for slip as the predominant factor in determining maximum earthquake size, we cannot rule out that any subduction zone of a few hundred kilometers or more in length may be capable of producing a magnitude 9 or larger earthquake. Based on this method, the expected maximum size for the Japan Trench was 9.0 (McCaffrey, Geology, p. 263, 2008). The same approach indicates that a M > 9 off Java, with twice the population density as Honshu and much lower

  4. Geomorphic legacy of medieval Himalayan earthquakes in the Pokhara Valley

    NASA Astrophysics Data System (ADS)

    Schwanghart, Wolfgang; Bernhardt, Anne; Stolle, Amelie; Hoelzmann, Philipp; Adhikari, Basanta R.; Andermann, Christoff; Tofelde, Stefanie; Merchel, Silke; Rugel, Georg; Fort, Monique; Korup, Oliver

    2016-04-01

    The Himalayas and their foreland belong to the world's most earthquake-prone regions. With millions of people at risk from severe ground shaking and associated damages, reliable data on the spatial and temporal occurrence of past major earthquakes is urgently needed to inform seismic risk analysis. Beyond the instrumental record such information has been largely based on historical accounts and trench studies. Written records provide evidence for damages and fatalities, yet are difficult to interpret when derived from the far-field. Trench studies, in turn, offer information on rupture histories, lengths and displacements along faults but involve high chronological uncertainties and fail to record earthquakes that do not rupture the surface. Thus, additional and independent information is required for developing reliable earthquake histories. Here, we present exceptionally well-dated evidence of catastrophic valley infill in the Pokhara Valley, Nepal. Bayesian calibration of radiocarbon dates from peat beds, plant macrofossils, and humic silts in fine-grained tributary sediments yields a robust age distribution that matches the timing of nearby M>8 earthquakes in ~1100, 1255, and 1344 AD. The upstream dip of tributary valley fills and X-ray fluorescence spectrometry of their provenance rule out local sediment sources. Instead, geomorphic and sedimentary evidence is consistent with catastrophic fluvial aggradation and debris flows that had plugged several tributaries with tens of meters of calcareous sediment from the Annapurna Massif >60 km away. The landscape-changing consequences of past large Himalayan earthquakes have so far been elusive. Catastrophic aggradation in the wake of two historically documented medieval earthquakes and one inferred from trench studies underscores that Himalayan valley fills should be considered as potential archives of past earthquakes. Such valley fills are pervasive in the Lesser Himalaya though high erosion rates reduce

  5. The August 2011 Virginia and Colorado Earthquake Sequences: Does Stress Drop Depend on Strain Rate?

    NASA Astrophysics Data System (ADS)

    Abercrombie, R. E.; Viegas, G.

    2011-12-01

    Our preliminary analysis of the August 2011 Virginia earthquake sequence finds the earthquakes to have high stress drops, similar to those of recent earthquakes in the northeastern USA, while those of the August 2011 Trinidad, Colorado, earthquakes are moderate - in between those typical of interplate earthquakes (California) and those of the east coast. These earthquakes provide an unprecedented opportunity to study such source differences in detail, and hence improve our estimates of seismic hazard. Previously, the lack of well-recorded earthquakes in the eastern USA severely limited our resolution of the source processes and hence the expected ground accelerations. Our preliminary findings are consistent with the idea that earthquake faults strengthen during longer recurrence times and that intraplate faults fail at higher stress (and produce higher ground accelerations) than their interplate counterparts. We use the empirical Green's function (EGF) method to calculate source parameters for the Virginia mainshock and three larger aftershocks, and for the Trinidad mainshock and two larger foreshocks, using IRIS-available stations. We select time windows around the direct P and S waves at the closest stations and calculate spectral ratios and source time functions using the multi-taper spectral approach (e.g. Viegas et al., JGR 2010). Our preliminary results show that the Virginia sequence has high stress drops (~100-200 MPa, using the Madariaga (1976) model), and the Colorado sequence has moderate stress drops (~20 MPa). These numbers are consistent with previous work in the regions, for example the 2002 Au Sable Forks earthquake and the 2010 Germantown (MD) earthquake. We also calculate the radiated seismic energy and find the energy/moment ratio to be high for the Virginia earthquakes and moderate for the Colorado sequence. We observe no evidence of a breakdown in constant stress drop scaling in this limited number of earthquakes. We extend our analysis to a larger number of earthquakes and stations
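
    For context, the stress-drop values quoted above follow from a circular-source model: the spectral ratio yields a corner frequency, the corner frequency gives a source radius through the Madariaga (1976) relation, and the stress drop scales as the seismic moment over the cube of that radius. The corner frequency, magnitude and shear-wave speed in the sketch are example values, not measurements from these sequences.

        def madariaga_stress_drop(m0_nm, fc_hz, vs_m_s=3500.0, k=0.21):
            """Stress drop (MPa) of a circular source, Madariaga (1976) model.
            m0_nm: seismic moment in N*m; fc_hz: S-wave corner frequency in Hz;
            k = 0.21 is Madariaga's constant for S waves."""
            radius_m = k * vs_m_s / fc_hz                 # source radius in metres
            stress_drop_pa = 7.0 * m0_nm / (16.0 * radius_m ** 3)
            return stress_drop_pa / 1.0e6

        # Example only: a magnitude ~5.8 event with an assumed 0.5 Hz corner
        # frequency gives a stress drop on the order of 100 MPa.
        m0 = 10 ** (1.5 * 5.8 + 9.1)                      # Hanks-Kanamori moment (N*m)
        print(f"stress drop ~ {madariaga_stress_drop(m0, fc_hz=0.5):.0f} MPa")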

  6. Statistical validation of earthquake related observations

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and of their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the resulting estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, in prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions; therefore, the scorecard of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null hypothesis that target earthquakes are predicted in advance merely by random coincidence. We reiterate the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
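
    The "Seismic Roulette" null hypothesis lends itself to a direct Monte-Carlo check: alarms cover some fraction of the catalogue locations (chips on sectors), the hits among the target earthquakes are counted, and that count is compared with the distribution obtained under random placement. The catalogue size, alarm fraction and hit count below are synthetic and illustrate only the mechanics of the test.

        import numpy as np

        rng = np.random.default_rng(7)

        n_targets = 15            # target earthquakes that actually occurred
        alarm_fraction = 0.2      # fraction of catalogue locations (sectors) alarmed
        observed_hits = 7         # targets that fell inside alarmed sectors (synthetic)

        # Under the Seismic Roulette null hypothesis, each target independently lands
        # in an alarmed sector with probability equal to the alarm fraction.
        n_trials = 100_000
        random_hits = rng.binomial(n_targets, alarm_fraction, size=n_trials)
        p_value = np.mean(random_hits >= observed_hits)

        print(f"P(random strategy does at least as well) = {p_value:.4f}")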

  7. Earthquake: Game-based learning for 21st century STEM education

    NASA Astrophysics Data System (ADS)

    Perkins, Abigail Christine

    doubled the number of exhibited instances of critical thinking between games. Players in the first group exhibited about a third more instances of metacognition between games, while players in the second group doubled such instances. Between games, players in both groups more than doubled the number of exhibited instances of using earthquake engineering content knowledge. The student-players expanded their use of scientific argumentation across all game-based learning checklist categories. With empirical evidence, I conclude that play and learning can connect for successful 21st century STEM education.

  8. Geoelectric precursors to strong earthquakes in China

    NASA Astrophysics Data System (ADS)

    Yulin, Zhao; Fuye, Qian

    1994-05-01

    The main results of searching for electrical precursors to strong earthquakes in China over the last 25 yr are presented. This comprises: the continuous twenty-year resistivity record before and after the great Tangshan earthquake of 1976; spatial and temporal variations in resistivity anomalies observed at more than 6 stations within 150 km of the Tangshan earthquake epicenter; the travel-time curve for the front of the resistivity precursor; and a method of intersection for predicting the epicenter location. These results reveal a number of interesting facts: (1) Resistivity measurements with accuracies of 0.5% or better for over 20 yr show that resistivity decreases of several percent, which began approximately 3 yr prior to the Tangshan earthquake, were larger than the background fluctuations and hence statistically significant. An outstanding example of an intermediate-term resistivity precursor is given. (2) The intermediate-term resistivity precursor decrease before the Tangshan earthquake is such a pervasive phenomenon that the mean decrease, in percent, can be contoured on a map of the Beijing-Tianjin-Tangshan region. This shows the maximum decrease centered over the epicenter. (3) The anomalies in resistivity and self-potential, which began 2-0.5 months before the Tangshan main shock, had periods equal to those of the tidal waves M2 and MSf, respectively, so that the associated anomalies can be identified as impending-earthquake precursors, and a model related to stress-displacement weakening is proposed.

  9. The mechanism of earthquake

    NASA Astrophysics Data System (ADS)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    strength of crustal rocks: Gravitational pressure can initiate the elasticity-plasticity transition in crustal rocks. By calculating the depth dependence of the elasticity-plasticity transition and analysing the actual situation, the behavior of crustal rocks can be categorized into three typical zones: elastic, partially plastic and fully plastic. As the plastic portion reaches about 10% in the partially plastic zone, plastic interconnection may occur and the variation of shear strength in rocks is mainly characterized by plastic behavior. The equivalent coefficient of friction for plastic slip is an order of magnitude smaller, or even less, than that for brittle fracture; thus the shear strength of rocks in plastic sliding is much less than that in brittle breaking. Moreover, with increasing depth a number of other factors can further reduce the shear yield strength of rocks. On the other hand, since an earthquake is a large-scale damage process, the rock breaking must occur along the weakest path. Therefore, the actual fracture strength of rocks in a shallow earthquake is assuredly lower than the average shear strength of rocks as generally observed. The typical distributions of the average strength and the actual fracture strength in crustal rocks varying with depth are schematically illustrated. (3) The conditions for earthquake occurrence and the mechanisms of earthquakes: An earthquake will lead to volume expansion, and volume expansion must break through the obstacle. The condition for an earthquake to occur is as follows: the tectonic force exceeds the sum of the fracture strength of the rock, the friction force of the fault boundary and the resistance from obstacles. Therefore, a shallow earthquake is characterized by plastic sliding of rocks that break through the obstacles. Accordingly, four possible patterns for shallow earthquakes are put forward. Deep-focus earthquakes are believed to result from a wide-range rock flow that breaks the jam. Both shallow

  10. Earthquake Protection Measures for People with Disabilities

    NASA Astrophysics Data System (ADS)

    Gountromichou, C.; Kourou, A.; Kerpelis, P.

    2009-04-01

    The problem of seismic safety for people with disabilities not only exists but is also urgent and of primary importance. Working towards disability equality, the Earthquake Planning and Protection Organization of Greece (E.P.P.O.) has developed an educational scheme for people with disabilities in order to guide them to develop skills to protect themselves as well as to take the appropriate safety measures before, during and after an earthquake. The framework of this initiative includes a number of actions that have already been undertaken, including the following: a. Recently, the main guidelines have been published to help people who have physical, cognitive, visual, or auditory disabilities to cope with a destructive earthquake. Of great importance, in the case of people with disabilities, is to be prepared for the disaster, with several measures that must be taken starting today. In the pre-earthquake period, it is important that these people, in addition to other measures, do the following: - Create a Personal Support Network: the Personal Support Network should be a group of at least three trusted people who can assist the disabled person to prepare for a disastrous event and to recover after it. - Complete a Personal Assessment: the environment may change after a destructive earthquake. People with disabilities are encouraged to make a list of their personal needs and their resources for meeting them in a disaster environment. b. Lectures and training seminars on earthquake protection are given to students, teachers and educators in Special Schools for disabled people, mainly for informing and familiarizing them with earthquakes and with safety measures. c. Many earthquake drills have already taken place, for each disability, in order to share good practices and lessons learned to further disaster reduction and to identify gaps and challenges. The final aim of this action is for all people with disabilities to be well informed and motivated towards a culture of earthquake

  11. Crustal earthquake triggering by pre-historic great earthquakes on subduction zone thrusts

    USGS Publications Warehouse

    Sherrod, Brian; Gomberg, Joan

    2014-01-01

    Triggering of earthquakes on upper plate faults during and shortly after recent great (M>8.0) subduction thrust earthquakes raises concerns about earthquake triggering following Cascadia subduction zone earthquakes. Of particular regard to Cascadia was the previously noted, but only qualitatively identified, clustering of M>~6.5 crustal earthquakes in the Puget Sound region between about 1200–900 cal yr B.P. and the possibility that this was triggered by a great Cascadia subduction thrust earthquake, and therefore portends future such clusters. We confirm quantitatively the extraordinary nature of the Puget Sound region crustal earthquake clustering between 1200–900 cal yr B.P., at least over the last 16,000 years. We conclude that this cluster was not triggered by the penultimate, and possibly full-margin, great Cascadia subduction thrust earthquake. However, we also show that the paleoseismic record for Cascadia is consistent with the conclusions of our companion study of the global modern record outside Cascadia, that M>8.6 subduction thrust events have a high probability of triggering at least one M>~6.5 crustal earthquake.

  12. Facilitation of intermediate-depth earthquakes by eclogitization-related stresses and H2O

    NASA Astrophysics Data System (ADS)

    Nakajima, J.; Uchida, N.; Hasegawa, A.; Shiina, T.; Hacker, B. R.; Kirby, S. H.

    2012-12-01

    Generation of intermediate-depth earthquakes is an ongoing enigma because high lithostatic pressures render ordinary dry frictional failure unlikely. A popular hypothesis to solve this conundrum is fluid-related embrittlement (e.g., Kirby et al., 1996; Preston et al., 2003), which is known to work even for dehydration reactions with negative volume change (Jung et al., 2004). One consequence of reaction with the negative volume change is the formation of a paired stress field as a result of strain compatibility across the reaction front (Hacker, 1996; Kirby et al., 1996). Here we analyze waveforms of a tiny seismic cluster in the lower crust of the downgoing Pacific plate at a depth of 155 km and propose new evidence in favor of this mechanism: tensional earthquakes lying 1 km above compressional earthquakes, and earthquakes with highly similar waveforms lying on well-defined planes with complementary rupture areas. The tensional stress is interpreted to be caused by the dimensional mismatch between crust transformed to eclogite and underlying untransformed crust, and the earthquakes are interpreted to be facilitated by fluid produced by eclogitization. These observations provide seismic evidence for the dual roles of volume-change related stresses and fluid-related embrittlement as viable processes for nucleating earthquakes in downgoing oceanic lithosphere.

  13. Principles for selecting earthquake motions in engineering design of large dams

    USGS Publications Warehouse

    Krinitzsky, E.L.; Marcuson, William F.

    1983-01-01

    This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable for other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which one of two types of engineering analyses is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and their spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives the best results. Each part of the site investigation requires a number of decisions. In some cases, the decision to use a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful. For example, peak motions at

  14. The Long-term Impacts of Earthquakes on Economic Growth

    NASA Astrophysics Data System (ADS)

    Lackner, S.

    2016-12-01

    The social science literature has so far not reached a consensus on whether and how earthquakes actually impact economic growth in the long run. Several hypotheses have been suggested and some even argue for a positive impact. A general weakness in the literature, however, is the predominant use of inadequate measures for the exogenous natural hazard of an earthquake. The most common problems are the lack of individual event size (e.g. earthquake dummy or number of events), the use of magnitude instead of a measure of surface shaking, and endogeneity issues when traditional qualitative intensity scales or actual impact data are used. Here we use peak ground acceleration (PGA) as the ground motion intensity measure and investigate the impacts of earthquake shaking on long-run economic growth. We construct a data set from USGS ShakeMaps that can be considered the universe of globally relevant earthquake ground shaking from 1973 to 2014. This data set is then combined with World Bank GDP data to conduct a regression analysis. Furthermore, the impacts of PGA on different industries and other economic variables such as employment and education are also investigated. This will, on the one hand, help to identify the mechanism by which earthquakes impact long-run growth and, on the other, reveal potential impacts on other welfare indicators that are not captured by GDP. This is the first application of global earthquake shaking data to investigate long-term earthquake impacts.
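
    The abstract does not give the regression specification; a minimal sketch of a fixed-effects growth regression of the kind described, written with the statsmodels formula API on invented panel data (the variable names pga_exposure and gdp_growth are assumptions), could look like this:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(42)
      countries, years = [f"C{i}" for i in range(30)], range(1973, 2015)
      rows = [{"country": c, "year": y,
               "pga_exposure": rng.exponential(0.05),   # assumed shaking-exposure measure
               "gdp_growth": rng.normal(2.0, 2.5)}      # assumed outcome, % per year
              for c in countries for y in years]
      df = pd.DataFrame(rows)

      # Growth regressed on shaking exposure with country and year fixed effects,
      # clustering standard errors by country.
      model = smf.ols("gdp_growth ~ pga_exposure + C(country) + C(year)", data=df).fit(
          cov_type="cluster", cov_kwds={"groups": df["country"]})
      print(model.params["pga_exposure"], model.bse["pga_exposure"])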

  15. Impact of traumatic loss on post-traumatic spectrum symptoms in high school students after the L'Aquila 2009 earthquake in Italy.

    PubMed

    Dell'OSso, L; Carmassi, C; Massimetti, G; Conversano, C; Daneluzzo, E; Riccardi, I; Stratta, P; Rossi, A

    2011-11-01

    On April 6th 2009, the town of L'Aquila, Italy, was struck by an earthquake (6.3 on the Richter scale) that led to the destruction of large parts of the town and the death of 309 people. Significant losses in the framework of earthquakes have been reported as a major risk factor for PTSD development. The aim of this study was to investigate post-traumatic spectrum symptoms in a sample of adolescents exposed to the L'Aquila 2009 earthquake 21 months earlier, with particular attention to the impact of loss. 475 students (203 women and 272 men), attending the last year of High School in L'Aquila, were assessed by: Trauma and Loss Spectrum-Self Report (TALS-SR) and Impact of Event Scale (IES). The presence of full and partial PTSD was also assessed. 72 students (15.2%) reported the loss of a close friend or relative in the framework of the earthquake. Full PTSD was reported by 146 (30.7%) students and partial PTSD by 149 (31.4%) students. There was a significant difference in PTSD between bereaved and non-bereaved subjects, with significantly higher post-traumatic symptom levels reported by bereaved subjects. The lack of information on the relationship with the deceased and on the number of losses experienced, together with the use of self-report instruments, are the limitations of this study. Our results show high rates of post-traumatic spectrum symptoms in adolescents who survived the L'Aquila earthquake. Having experienced the loss of a close friend or a relative in the framework of the earthquake seems to be related to higher PTSD rates and more severe symptomatology. These results highlight the need to carefully explore adolescents exposed to a significant loss as a consequence of an earthquake. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Impact of earthquakes on sex ratio at birth: Eastern Marmara earthquakes

    PubMed Central

    Doğer, Emek; Çakıroğlu, Yiğit; Köpük, Şule Yıldırım; Ceylan, Yasin; Şimşek, Hayal Uzelli; Çalışkan, Eray

    2013-01-01

    Objective: Previous reports suggest that maternal exposure to acute stress related to earthquakes affects the sex ratio at birth. Our aim was to examine the change in sex ratio at birth after the Eastern Marmara earthquake disasters. Material and Methods: This study was performed using the official birth statistics from January 1997 to December 2002 – before and after 17 August 1999, the date of the Golcuk Earthquake – supplied by the Turkey Statistics Institute. The secondary sex ratio was expressed as the male proportion at birth; the ratios for the affected and unaffected areas were calculated and compared on a monthly basis using the chi-square test. Results: We observed significant decreases in the secondary sex ratio in the 4th and 8th months following the earthquake in the affected region compared to the unaffected region (p = 0.001 and p = 0.024). In the earthquake region, the decrease observed in the secondary sex ratio during the 8th month after the earthquake was specific to the period after the earthquake. Conclusion: Our study indicated a significant reduction in the secondary sex ratio after an earthquake. These findings suggest that events causing sudden intense stress, such as earthquakes, can have an effect on the sex ratio at birth. PMID:24592082
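
    A minimal illustration of the monthly comparison described above, using invented birth counts rather than the study's data, is a chi-square test on a 2x2 table of male and female births in the affected and unaffected areas:

      import numpy as np
      from scipy.stats import chi2_contingency

      # Rows: affected area, unaffected area; columns: male births, female births (invented counts).
      table = np.array([[4800, 4700],
                        [9900, 9400]])
      chi2, p, dof, expected = chi2_contingency(table)
      ratio_affected = table[0, 0] / table[0].sum()      # secondary sex ratio, affected area
      ratio_unaffected = table[1, 0] / table[1].sum()    # secondary sex ratio, unaffected area
      print(f"male proportion {ratio_affected:.3f} vs {ratio_unaffected:.3f}, p = {p:.3f}")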

  17. Large-Scale Earthquake Countermeasures Act and the Earthquake Prediction Council in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rikitake, T.

    1979-08-07

    The Large-Scale Earthquake Countermeasures Act was enacted in Japan in December 1978. This act aims at mitigating earthquake hazards by designating an area to be an area under intensified measures against earthquake disaster, such designation being based on long-term earthquake prediction information, and by issuing an earthquake warning statement based on imminent prediction information, when possible. In an emergency case as defined by the law, the prime minister will be empowered to take various actions which cannot be taken at ordinary times. For instance, he may ask the Self-Defense Force to come into the earthquake-threatened area before the earthquake occurrence. A Prediction Council has been formed in order to evaluate premonitory effects that might be observed over the Tokai area, which was designated an area under intensified measures against earthquake disaster some time in June 1979. An extremely dense observation network has been constructed over the area.

  18. Source properties of earthquakes near the Salton Sea triggered by the 16 October 1999 M 7.1 Hector Mine, California, earthquake

    USGS Publications Warehouse

    Hough, S.E.; Kanamori, H.

    2002-01-01

    We analyze the source properties of a sequence of triggered earthquakes that occurred near the Salton Sea in southern California in the immediate aftermath of the M 7.1 Hector Mine earthquake of 16 October 1999. The sequence produced a number of early events that were not initially located by the regional network, including two moderate earthquakes: the first within 30 sec of the P-wave arrival and a second approximately 10 minutes after the mainshock. We use available amplitude and waveform data from these events to estimate magnitudes to be approximately 4.7 and 4.4, respectively, and to obtain crude estimates of their locations. The sequence of small events following the initial M 4.7 earthquake is clustered and suggestive of a local aftershock sequence. Using both broadband TriNet data and analog data from the Southern California Seismic Network (SCSN), we also investigate the spectral characteristics of the M 4.4 event and other triggered earthquakes using empirical Green's function (EGF) analysis. We find that the source spectra of the events are consistent with expectations for tectonic (brittle shear failure) earthquakes, and infer stress drop values of 0.1 to 6 MPa for six M 2.1 to M 4.4 events. The estimated stress drop values are within the range observed for tectonic earthquakes elsewhere. They are relatively low compared to typically observed stress drop values, which is consistent with expectations for faulting in an extensional, high heat flow regime. The results therefore suggest that, at least in this case, triggered earthquakes are associated with a brittle shear failure mechanism. This further suggests that triggered earthquakes may tend to occur in geothermal-volcanic regions because shear failure occurs at, and can be triggered by, relatively low stresses in extensional regimes.
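
    As a rough illustration of the last step of such an analysis, the sketch below converts an assumed seismic moment and an EGF-derived corner frequency into a Brune-model stress drop; the numerical values are placeholders, not results from the study:

      import math

      beta = 3500.0                      # near-source shear-wave speed, m/s (assumed)
      m0 = 10 ** (1.5 * 4.4 + 9.1)       # seismic moment (N m) for Mw 4.4 via the standard moment scale
      fc = 1.5                           # corner frequency from the EGF-corrected spectrum, Hz (assumed)

      radius = 0.37 * beta / fc                        # Brune source radius, m
      stress_drop = 7.0 * m0 / (16.0 * radius ** 3)    # static stress drop, Pa
      print(f"radius = {radius:.0f} m, stress drop = {stress_drop / 1e6:.1f} MPa")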

  19. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    USGS Publications Warehouse

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
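
    The abstract does not reproduce the paper's equation; a schematic version of the reasoning, with rate symbols introduced here only for illustration, is

      P(\text{foreshock} \mid \text{event near fault}) \approx \frac{r_f}{r_f + r_b + r_a(t)}, \qquad r_a(t) = \frac{K}{(t + c)^p},

    where r_f is the rate of genuine foreshocks, r_b the background rate, and r_a(t) an Omori-law aftershock rate: adding r_a lowers the probability, and its decay with time lets the probability recover.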

  20. Determining on-fault earthquake magnitude distributions from integer programming

    NASA Astrophysics Data System (ADS)

    Geist, Eric L.; Parsons, Tom

    2018-02-01

    Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if it is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip ≥3 mm/yr are considered, resulting in >10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.

  1. Determining on-fault earthquake magnitude distributions from integer programming

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.

    2018-01-01

    Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if it is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip ≥3 mm/yr are considered, resulting in >10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.
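
    To make the optimization concrete, here is a heavily simplified sketch of such a binary assignment problem using the PuLP solver; the two faults, the scaling relations and the synthetic catalog are invented for illustration and are not the formulation used in the study:

      import random
      import pulp

      random.seed(0)
      faults = {"A": {"target_slip_mm_yr": 5.0, "length_km": 100.0},
                "B": {"target_slip_mm_yr": 3.0, "length_km": 40.0}}
      years = 4000.0
      # Synthetic catalog: exponentially distributed magnitudes approximate a G-R relation with b ~ 1.
      catalog = [round(5.0 + random.expovariate(2.3), 2) for _ in range(300)]

      def slip_m(m):              # crude average-slip scaling (assumed)
          return 10 ** (-4.80 + 0.69 * m)
      def rupture_len_km(m):      # crude rupture-length scaling (assumed)
          return 10 ** (-2.44 + 0.59 * m)

      prob = pulp.LpProblem("on_fault_magnitudes", pulp.LpMinimize)
      x = {(i, f): pulp.LpVariable(f"x_{i}_{f}", cat="Binary")
           for i in range(len(catalog)) for f in faults}
      over = {f: pulp.LpVariable(f"over_{f}", lowBound=0) for f in faults}
      under = {f: pulp.LpVariable(f"under_{f}", lowBound=0) for f in faults}
      prob += pulp.lpSum(over[f] + under[f] for f in faults)      # minimize total slip-rate misfit

      for i, m in enumerate(catalog):
          prob += pulp.lpSum(x[i, f] for f in faults) <= 1        # each event used at most once
          for f, props in faults.items():
              if rupture_len_km(m) > props["length_km"]:
                  prob += x[i, f] == 0                            # fault too short for this event
      for f, props in faults.items():
          modelled_rate = pulp.lpSum(x[i, f] * slip_m(m) * 1000.0 / years
                                     for i, m in enumerate(catalog))
          prob += modelled_rate - props["target_slip_mm_yr"] == over[f] - under[f]

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      for f in faults:
          mags = sorted(m for i, m in enumerate(catalog) if x[i, f].value() > 0.5)
          print(f, "events:", len(mags), "largest:", mags[-1] if mags else None)

    The slack variables absorb any residual slip-rate misfit, mirroring the idea that slip-rate uncertainties bound the problem rather than pin it to an exact value.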

  2. Impact of the 2016 Ecuador Earthquake on Zika Virus Cases

    PubMed Central

    Vasquez, Diego; Palacio, Ana; Nuñez, Jose; Briones, Wladimir; Beier, John C.; Tamariz, Leonardo

    2017-01-01

    Objectives. To evaluate the impact of the April 2016 7.8-magnitude earthquake in Ecuador on the incidence of Zika virus (ZIKV) cases. Methods. We used the national public health surveillance system for reportable transmissible conditions and included suspected and laboratory-confirmed ZIKV cases. We compared the number of cases before and after the earthquake in areas closer to and farther from the epicenter. Results. From January to July 2016, 2234 patients suspected of having ZIKV infection were reported in both affected and control areas. A total of 1110 patients had a reverse transcription-polymerase chain reaction assay, and 159 were positive for ZIKV. The cumulative incidence of ZIKV in the affected area was 11.1 per 100 000 after the earthquake. The odds ratio of having ZIKV infection in those living in the affected area was 8.0 (95% CI = 4.4, 14.6; P < .01) compared with the control area and adjusted for age, gender, province population, and number of government health care facilities. Conclusions. A spike in ZIKV cases occurred after the earthquake. Patients in the area closest to the epicenter had a delay in seeking care. PMID:28520489
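
    For readers unfamiliar with the measure, a crude (unadjusted) odds ratio and its 95% confidence interval can be computed from a 2x2 table as sketched below; the counts are purely hypothetical, and the published estimate of 8.0 was additionally adjusted for covariates in a regression model:

      import math

      # Hypothetical counts: ZIKV-positive and ZIKV-negative patients by area.
      cases_affected, noncases_affected = 120, 900
      cases_control, noncases_control = 39, 2300

      odds_ratio = (cases_affected / noncases_affected) / (cases_control / noncases_control)
      se_log_or = math.sqrt(sum(1.0 / n for n in (cases_affected, noncases_affected,
                                                  cases_control, noncases_control)))
      low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
      high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
      print(f"OR = {odds_ratio:.1f}, 95% CI = ({low:.1f}, {high:.1f})")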

  3. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.

  4. Acute myocardial infarction and stress cardiomyopathy following the Christchurch earthquakes.

    PubMed

    Chan, Christina; Elliott, John; Troughton, Richard; Frampton, Christopher; Smyth, David; Crozier, Ian; Bridgman, Paul

    2013-01-01

    Christchurch, New Zealand, was struck by 2 major earthquakes at 4:36 am on 4 September 2010, magnitude 7.1 and at 12:51 pm on 22 February 2011, magnitude 6.3. Both events caused widespread destruction. Christchurch Hospital was the region's only acute care hospital. It remained functional following both earthquakes. We were able to examine the effects of the 2 earthquakes on acute cardiac presentations. Patients admitted under Cardiology in Christchurch Hospital 3 weeks prior to and 5 weeks following both earthquakes were analysed, with corresponding control periods in September 2009 and February 2010. Patients were categorised based on diagnosis: ST elevation myocardial infarction, non ST elevation myocardial infarction, stress cardiomyopathy, unstable angina, stable angina, non cardiac chest pain, arrhythmia and others. There was a significant increase in overall admissions (p<0.003), ST elevation myocardial infarction (p<0.016), and non cardiac chest pain (p<0.022) in the first 2 weeks following the early morning September earthquake. This pattern was not seen after the early afternoon February earthquake. Instead, there was a very large number of stress cardiomyopathy admissions, with 21 cases (95% CI 2.6-6.4) in 4 days. There had been 6 stress cardiomyopathy cases after the first earthquake (95% CI 0.44-2.62). Statistical analysis showed this to be a significant difference between the earthquakes (p<0.05). The early morning September earthquake triggered a large increase in ST elevation myocardial infarction and a few stress cardiomyopathy cases. The early afternoon February earthquake caused significantly more stress cardiomyopathy. Two major earthquakes occurring at different times of day differed in their effect on acute cardiac events.

  5. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  6. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-02

    The 5 September 2012 M(w) 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.

  7. Toward standardization of slow earthquake catalog -Development of database website-

    NASA Astrophysics Data System (ADS)

    Kano, M.; Aso, N.; Annoura, S.; Arai, R.; Ito, Y.; Kamaya, N.; Maury, J.; Nakamura, M.; Nishimura, T.; Obana, K.; Sugioka, H.; Takagi, R.; Takahashi, T.; Takeo, A.; Yamashita, Y.; Matsuzawa, T.; Ide, S.; Obara, K.

    2017-12-01

    work is supported by JSPS KAKENHI Grant Numbers JP16H06472, JP16H06473, JP16H06474, JP16H06477 in Scientific Research on Innovative Areas "Science of Slow Earthquakes", and JP15K17743 in Grant-in-Aid for Young Scientists (B).

  8. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  9. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

    NASA Astrophysics Data System (ADS)

    Pasten, D.

    2017-12-01

    Earthquake complex networks have been shown to reveal specific features in seismic data sets. In space, these networks show scale-free behavior in the probability distribution of connectivity for directed networks, and small-world behavior for undirected networks. In this work, we present an earthquake complex network analysis for the large Mw 8.2 earthquake in the north of Chile (near Iquique) in April 2014. An earthquake complex network is built by dividing the three-dimensional space into cubic cells; if a cell contains a hypocenter, that cell is named a node. Connections between nodes are generated in time: we follow the time sequence of seismic events and connect the corresponding nodes. This yields two different networks, a directed and an undirected network. The directed network takes into account the time direction of the connections, which is very important for the connectivity of the network: the connectivity ki of the i-th node is the number of connections going out of node i plus the self-connections (if two successive seismic events occur in the same cubic cell, we have a self-connection). The undirected network is obtained by removing the direction of the connections and the self-connections from the directed network; for undirected networks, we only consider whether two nodes are connected or not. We have built a directed and an undirected complex network before and after the large Iquique earthquake, using magnitude thresholds of Mw = 1.0 and Mw = 3.0. We find that this method can recognize the influence of these small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake in this complex system. The method also shows a difference in the values of the critical exponent γ (for the probability
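
    A compact sketch of the cell-based construction described above, applied to a synthetic catalog (the cell size, the catalog and all variable names are assumptions for illustration):

      import numpy as np
      from collections import defaultdict

      rng = np.random.default_rng(1)
      # Synthetic catalog: hypocenter coordinates x, y, z in km, already sorted by origin time.
      catalog = rng.uniform(0.0, 100.0, size=(500, 3))
      cell_km = 10.0                         # edge length of the cubic cells

      def cell_of(hypocenter):
          return tuple((hypocenter // cell_km).astype(int))

      out_degree = defaultdict(int)          # directed connectivity k_i, self-connections included
      undirected_edges = set()               # undirected network: directions and self-links removed
      nodes = [cell_of(h) for h in catalog]
      for a, b in zip(nodes[:-1], nodes[1:]):
          out_degree[a] += 1                 # time-ordered link a -> b (self-link if a == b)
          if a != b:
              undirected_edges.add(frozenset((a, b)))

      # Connectivity distribution of the directed network (used to test for scale-free behavior).
      ks = sorted(out_degree.values(), reverse=True)
      print("nodes:", len(out_degree), "undirected edges:", len(undirected_edges), "max k:", ks[0])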

  10. Lake deposits record evidence of large post-1505 AD earthquakes in western Nepal

    NASA Astrophysics Data System (ADS)

    Ghazoui, Z.; Bertrand, S.; Vanneste, K.; Yokoyama, Y.; Van Der Beek, P.; Nomade, J.; Gajurel, A.

    2016-12-01

    According to historical records, the last large earthquake that ruptured the Main Frontal Thrust (MFT) in western Nepal occurred in 1505 AD. Since then, no evidence of other large earthquakes has been found in historical records or geological archives. In view of the catastrophic consequences to millions of inhabitants of Nepal and northern India, intense efforts currently focus on improving our understanding of past earthquake activity and complement the historical data on Himalayan earthquakes. Here we report a new record, based on earthquake-triggered turbidites in lakes. We use lake sediment records from Lake Rara, western Nepal, to reconstruct the occurrence of seismic events. The sediment cores were studied using a multi-proxy approach combining radiocarbon and 210Pb chronologies, physical properties (X-ray computerized axial tomography scan, Geotek multi-sensor core logger), high-resolution grain size, inorganic geochemistry (major elements by ITRAX XRF core scanning) and bulk organic geochemistry (C, N concentrations and stable isotopes). We identified several sequences of dense and layered fine sand mainly composed of mica, which we interpret as earthquake-triggered turbidites. Our results suggest the presence of a synchronous event between the two lake sites correlated with the well-known 1505 AD earthquake. In addition, our sediment records reveal five earthquake-triggered turbidites younger than the 1505 AD event. By comparison with historical archives, we relate one of those to the 1833 AD MFT rupture. The others may reflect successive ruptures of the Western Nepal Fault System. Our study sheds light on events that have not been recorded in historical chronicles. Those five MMI>7 earthquakes permit addressing the problem of missing slip on the MFT in western Nepal and reevaluating the risk of a large earthquake affecting western Nepal and North India.

  11. Analysis of Landslides Triggered by October 2005, Kashmir Earthquake

    PubMed Central

    Mahmood, Irfan; Qureshi, Shahid Nadeem; Tariq, Shahina; Atique, Luqman; Iqbal, Muhammad Farooq

    2015-01-01

    Introduction: The October 2005 Kashmir earthquake main event was triggered along the Balakot-Bagh Fault, which runs from Bagh to Balakot, and caused more damage in and around these areas. Major landslides were activated during and after the earthquake, inflicting heavy damage in the area, both in terms of infrastructure and casualties. These landslides were mainly attributed to the minimum threshold of the earthquake, the geology of the area, climatologic and geomorphologic conditions, mudflows, widening of the roads without stability assessment, and heavy rainfall after the earthquake. These landslides were mainly rock and debris falls. The Hattian Bala rock avalanche was the largest landslide associated with the earthquake; it completely destroyed a village and blocked the valley, creating a lake. Discussion: The present study shows that the fault rupture and fault geometry have a direct influence on the distribution of landslides and that a high-frequency band of landslides was triggered along the rupture zone. There was an increase in the number of landslides due to the 2005 earthquake and its aftershocks, and most of the earthquakes occurred along faults, rivers and roads. It is observed that the stability of a landslide mass is greatly influenced by the amplitude, frequency and duration of earthquake-induced ground motion. Most of the slope failures along the roads resulted from the alteration of these slopes during widening of the roads, and from seepage during the rainy season immediately after the earthquake. Conclusion: Landslides occurred mostly along weakly cemented and indurated rocks, colluvial sand and cemented soils. It is also worth noting that fissures and ground cracks induced by the main shock and aftershocks are still present, and they pose a major potential threat for future landslides in the case of another earthquake or under extreme weather conditions. PMID:26366324

  12. Analysis of Landslides Triggered by October 2005, Kashmir Earthquake.

    PubMed

    Mahmood, Irfan; Qureshi, Shahid Nadeem; Tariq, Shahina; Atique, Luqman; Iqbal, Muhammad Farooq

    2015-08-26

    The October 2005 Kashmir earthquake main event was triggered along the Balakot-Bagh Fault, which runs from Bagh to Balakot, and caused more damage in and around these areas. Major landslides were activated during and after the earthquake, inflicting heavy damage in the area, both in terms of infrastructure and casualties. These landslides were mainly attributed to the minimum threshold of the earthquake, the geology of the area, climatologic and geomorphologic conditions, mudflows, widening of the roads without stability assessment, and heavy rainfall after the earthquake. These landslides were mainly rock and debris falls. The Hattian Bala rock avalanche was the largest landslide associated with the earthquake; it completely destroyed a village and blocked the valley, creating a lake. The present study shows that the fault rupture and fault geometry have a direct influence on the distribution of landslides and that a high-frequency band of landslides was triggered along the rupture zone. There was an increase in the number of landslides due to the 2005 earthquake and its aftershocks, and most of the earthquakes occurred along faults, rivers and roads. It is observed that the stability of a landslide mass is greatly influenced by the amplitude, frequency and duration of earthquake-induced ground motion. Most of the slope failures along the roads resulted from the alteration of these slopes during widening of the roads, and from seepage during the rainy season immediately after the earthquake. Landslides occurred mostly along weakly cemented and indurated rocks, colluvial sand and cemented soils. It is also worth noting that fissures and ground cracks induced by the main shock and aftershocks are still present, and they pose a major potential threat for future landslides in the case of another earthquake or under extreme weather conditions.

  13. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake

    USGS Publications Warehouse

    Hayes, Gavin P.; Herman, Matthew W.; Barnhart, William D.; Furlong, Kevin P.; Riquelme, Sebástian; Benz, Harley M.; Bergman, Eric; Barrientos, Sergio; Earle, Paul S.; Samsonov, Sergey

    2014-01-01

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile which had not ruptured in a megathrust earthquake since a M ~8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March–April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  14. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake.

    PubMed

    Hayes, Gavin P; Herman, Matthew W; Barnhart, William D; Furlong, Kevin P; Riquelme, Sebástian; Benz, Harley M; Bergman, Eric; Barrientos, Sergio; Earle, Paul S; Samsonov, Sergey

    2014-08-21

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which had not ruptured in a megathrust earthquake since a M ∼8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March-April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  15. Real-Time Earthquake Intensity Estimation Using Streaming Data Analysis of Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Yelena; Tiampo, Kristy F.; Qin, Jinhui; Bauer, Michael A.

    2017-06-01

    Earthquake intensity is one of the key components of the decision-making process for disaster response and emergency services. Accurate and rapid intensity calculations can help to reduce total loss and the number of casualties after an earthquake. Modern intensity assessment procedures handle a variety of information sources, which can be divided into two main categories. The first type of data is that derived from physical sensors, such as seismographs and accelerometers, while the second type consists of data obtained from social sensors, such as witness observations of the consequences of the earthquake itself. Estimation approaches using additional data sources or that combine sources from both data types tend to increase intensity uncertainty due to human factors and inadequate procedures for temporal and spatial estimation, resulting in precision errors in both time and space. Here we present a processing approach for the real-time analysis of streams of data from both source types. The physical sensor data is acquired from the U.S. Geological Survey (USGS) seismic network in California and the social sensor data is based on Twitter user observations. First, empirical relationships between tweet rate and observed Modified Mercalli Intensity (MMI) are developed using data from the M6.0 South Napa, CA earthquake that occurred on August 24, 2014. Second, the streams of both data types are analyzed together in simulated real-time to produce one intensity map. The second implementation is based on IBM InfoSphere Streams, a cloud platform for real-time analytics of big data. To handle large processing workloads for data from various sources, it is deployed and run on a cloud-based cluster of virtual machines. We compare the quality and evolution of intensity maps from different data sources over 10-min time intervals immediately following the earthquake. Results from the joint analysis show that it provides more complete coverage, with better accuracy and higher
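
    As an illustration of how such an empirical relation can be calibrated, the sketch below fits a log-linear model between tweet rate and MMI on invented data; the functional form and the numbers are assumptions, not the relationship derived in the study:

      import numpy as np

      rng = np.random.default_rng(7)
      # Invented calibration data: tweet rate (tweets/min) in a cell and the MMI observed there.
      tweet_rate = np.array([0.5, 1, 2, 5, 10, 20, 50, 100, 200])
      mmi_obs = 2.0 + 1.5 * np.log10(tweet_rate) + rng.normal(0.0, 0.3, tweet_rate.size)

      slope, intercept = np.polyfit(np.log10(tweet_rate), mmi_obs, 1)
      predict_mmi = lambda rate: intercept + slope * np.log10(rate)
      print(f"MMI ~ {intercept:.2f} + {slope:.2f} log10(rate); rate 30/min -> MMI {predict_mmi(30):.1f}")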

  16. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, Susan E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are likewise fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: A moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" in not only polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  17. Gravity drives Great Earthquakes

    NASA Astrophysics Data System (ADS)

    Lister, Gordon; Forster, Marnie

    2010-05-01

    The most violent of Great Earthquakes are driven by ruptures on giant megathrusts adjacent to actively forming mountain belts. Current theory suggests that the seismic rupture harvests (and thus releases) elastic energy that has been previously stored in locked segments of the megathrust. The general belief, however, is that this energy was accumulated as the result of relative motion of the adjacent stiff elastic tectonic plates. This mechanism fails to explain many first order aspects of large earthquakes, however. The energy source for strain accumulation must also include gravitational collapse of orogenic crust and/or in the foundering (or roll-back) of an adjacent subducting lithospheric slab. Therefore we have conducted an analysis of the geometry of aftershocks, and report that this allows distinction of two types of failure on giant megathrusts. Mode I failure involves horizontal shortening, and is consistent with the classic view that megathrusts fail in compression, with motion analogous to that expected if accretion takes place against a rigid (or elastic) backstop. Mode II failure involves horizontal extension, and requires the over-riding plate to stretch during an earthquake. This process is likely to continue during the subsequent period of afterslip, and therefore will again be evident in aftershock patterns. Mode I behaviour may well have applied to the southern segment of the Sumatran megathrust, from whence emanated the rupture that drove the 2004 Great Earthquake. Mode II behaviour appears to apply to the northern segment of the same rupture, however. The geometry of aftershocks beneath the Andaman Sea suggest that the crust above the initial rupture failed in an extensional mode. The edge of the Indian plate is foundering, with slab-hinge roll-back in a direction orthogonal to its motion vector. The only possible cause for this extension therefore is westward roll-back of the subducting Indian plate, and the consequent gravity-driven movement

  18. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries.

    PubMed

    Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon

    2016-01-01

    This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.

  19. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries

    PubMed Central

    2016-01-01

    This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006–2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year. PMID:26812351
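
    A toy version of the feature construction described in the two records above, on invented annual data for a single region (the window lengths and distributions are assumptions, and the study's actual indicators and M-IFN model differ):

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(3)
      # Invented annual history for one region: yearly event counts and maximum magnitudes.
      years = np.arange(1983, 2011)
      counts = rng.poisson(40, years.size)
      max_mag = 3.5 + rng.gumbel(0.3, 0.4, years.size)

      df = pd.DataFrame({"year": years, "n_events": counts, "max_mag": max_mag})
      # Moving-average seismicity indicators over the previous 3 and 5 years.
      df["n_ma3"] = df["n_events"].rolling(3).mean().shift(1)
      df["n_ma5"] = df["n_events"].rolling(5).mean().shift(1)
      # Binary target: does the year's maximum magnitude exceed the regional median?
      df["target"] = (df["max_mag"] > df["max_mag"].median()).astype(int)
      train, test = df.iloc[:-5].dropna(), df.iloc[-5:]     # hold out the last five years
      print(train[["year", "n_ma3", "n_ma5", "target"]].tail())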

  20. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    NASA Astrophysics Data System (ADS)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
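
    A minimal Structured Streaming sketch of this kind of windowed query, assuming a socket source that emits "station,timestamp,displacement_mm" lines (the source, field names and window lengths are assumptions, not the SCIGN pipeline described above):

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("gps_field_sketch").getOrCreate()

      raw = (spark.readStream.format("socket")
             .option("host", "localhost").option("port", 9999).load())
      parsed = raw.select(F.split("value", ",").alias("p")).select(
          F.col("p")[0].alias("station"),
          F.col("p")[1].cast("timestamp").alias("t"),
          F.col("p")[2].cast("double").alias("disp_mm"))

      # Maximum displacement per station over a 30 s window sliding every 5 s.
      peaks = (parsed.withWatermark("t", "1 minute")
               .groupBy(F.window("t", "30 seconds", "5 seconds"), "station")
               .agg(F.max("disp_mm").alias("peak_disp_mm")))

      query = (peaks.writeStream.outputMode("update")
               .format("console").option("truncate", False).start())
      query.awaitTermination()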

  1. Preparing a population for an earthquake like Chi-Chi: The Great Southern California ShakeOut

    USGS Publications Warehouse

    Jones, Lucile M.; ,

    2009-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million southern Californians pretended that a magnitude-7.8 earthquake had occurred and practiced actions that could reduce its impact on their lives. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The drill was based on a scenario of the impacts and consequences of such an earthquake on the Southern San Andreas Fault, developed by over 300 experts led by the U.S. Geological Survey in partnership with the California Geological Survey, the Southern California Earthquake Center, Earthquake Engineering Research Institute, lifeline operators, emergency services and many other organizations. The ShakeOut campaign was designed and implemented by earthquake scientists, emergency managers, sociologists, art designers and community participants. The means of communication were developed using results from sociological research on what encouraged people to take action. This was structured around four objectives: 1) consistent messages – people are more inclined to believe something when they hear the same thing from multiple sources; 2) visual reinforcement – people are more inclined to do something they see other people doing; 3) encourage “milling” or discussing contemplated action – people need to discuss an action with others they care about before committing to undertaking it; and 4) focus on concrete actions – people are more likely to prepare for a set of concrete consequences of a particular hazard than for an abstract concept of risk. The goals of the ShakeOut were established in Spring 2008 and were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in southern California; and 3) to reduce earthquake losses in southern California. All of these

  2. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  3. The Earthquake Early Warning System in Japan (Invited)

    NASA Astrophysics Data System (ADS)

    Mori, J. J.; Yamada, M.

    2010-12-01

    In Japan, the earthquake early warning system (Kinkyu Jishin Sokuhou in Japanese) maintained by the Japan Meteorological Agency (JMA) has been in operation and sending public information since October 1, 2007. Messages have been broadcast on television and radio to warn the public of strong shaking. The threshold for broadcasting a message is an estimated intensity of JMA 5 lower, which is approximately equivalent to MM VII to VIII. During the period from October 2007 through August 2010, messages have been sent 9 times for earthquakes of magnitude 5.2 to 7.0. There have been a few instances of significantly over-estimating or under-estimating the predicted shaking, but in general the performance of the system has been quite good. The quality of the detection system depends on the dense network of high-quality seismometers that cover the Japanese Islands. Consequently, the system works very well for events on or close to the 4 main islands, but there is more uncertainty for events near the smaller and more distant islands, where the density of instrumentation is much lower. The Early Warning System is also tied to an extensive education program so that the public can react appropriately in the short amount of time given by the warning. There appears to be good public support in Japan, where people have become accustomed to a high level of fast information on a daily basis. There has also been development of a number of specific safety applications in schools and industry that work off the backbone information provided by the national system.

  4. An improvement of the Earthworm Based Earthquake Alarm Reporting system in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, D. Y.; Hsiao, N. C.; Yih-Min, W.

    2017-12-01

    The Central Weather Bureau of Taiwan (CWB) has operated the Earthworm Based Earthquake Alarm Reporting (eBEAR) system for the purpose of earthquake early warning (EEW). The system has been used to report EEW messages to the general public since 2016 through text messages to mobile phones and through television programs. The system is able to provide accurate and fast warnings for inland earthquakes. The average epicenter error is about 5 km and the processing time is about 15 seconds. The epicenter error is defined as the distance between the epicenter estimated by the EEW system and the manually determined epicenter. The processing time is defined as the time difference between the earthquake origin time and the time the system issued the warning. The CWB seismic network consists of about 200 seismic stations. In some areas of Taiwan the distance between seismic stations is about 10 km. This means that when an earthquake occurs, the seismic P wave can reach 6 stations, the minimum number of stations required by the EEW system, within about 20 km. If the data transmission latency is about 1 sec, the P-wave velocity is about 6 km per sec and we take a 3-sec time window to estimate the earthquake magnitude, then the processing time should be around 8 sec. In fact, however, the average processing time is larger than this figure. Because some outlier P-wave onset picks may exist at the beginning of an earthquake, the Geiger method used in the EEW system for earthquake location is not stable, and it usually takes more time to wait for a sufficient number of good picks. In this study we used a grid search method to improve the estimation of earthquake location. The MAXEL algorithm (Sheen et al., 2015, 2016) was tested in the EEW system by simulating historical earthquakes that occurred in Taiwan. The results show that the processing time can be reduced and the location accuracy is acceptable for EEW purposes.
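
    A minimal grid-search location sketch in the same spirit (this is not the MAXEL implementation; the velocity, station geometry and grid spacing are invented): scan candidate epicenters on a coarse grid and keep the one minimizing the RMS misfit between observed and predicted P arrival times.

      import numpy as np

      vp = 6.0                                            # assumed P-wave speed, km/s
      stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0],
                           [10.0, 10.0], [5.0, 15.0], [15.0, 5.0]])   # x, y in km
      true_src, t0 = np.array([7.0, 6.0]), 2.0
      obs_t = t0 + np.linalg.norm(stations - true_src, axis=1) / vp   # synthetic P picks

      best, best_rms = None, np.inf
      for x in np.arange(0.0, 20.0, 0.5):
          for y in np.arange(0.0, 20.0, 0.5):
              pred = np.linalg.norm(stations - np.array([x, y]), axis=1) / vp
              # Demeaning removes the unknown origin time, so only relative times matter.
              resid = (obs_t - obs_t.mean()) - (pred - pred.mean())
              rms = np.sqrt(np.mean(resid ** 2))
              if rms < best_rms:
                  best, best_rms = (x, y), rms

      print("grid-search epicenter:", best, "rms residual (s):", round(best_rms, 3))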

  5. Fractals and Forecasting in Earthquakes and Finance

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.

    2011-12-01

    It is now recognized that Benoit Mandelbrot's fractals play a critical role in describing a vast range of physical and social phenomena. Here we focus on two systems, earthquakes and finance. Since 1942, earthquakes have been characterized by the Gutenberg-Richter magnitude-frequency relation, which in more recent times is often written as a moment-frequency power law. A similar relation can be shown to hold for financial markets. Moreover, a recent New York Times article, titled "A Richter Scale for the Markets" [1], summarized the emerging viewpoint that stock market crashes can be described with ideas similar to those used for large and great earthquakes. The idea that stock market crashes can be related in any way to earthquake phenomena has its roots in Mandelbrot's 1963 work on speculative prices in commodities markets such as cotton [2]. He pointed out that Gaussian statistics did not account for the excessive number of booms and busts that characterize such markets. Here we show that earthquakes and financial crashes can both be described by a common Landau-Ginzburg-type free energy model, involving the presence of a classical limit of stability, or spinodal. These metastable systems are characterized by fractal statistics near the spinodal. For earthquakes, the independent ("order") parameter is the slip deficit along a fault, whereas for the financial markets, it is the financial leverage in place. For financial markets, asset values play the role of a free energy. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In the case of financial models, the probabilities are closely related to implied volatility, an important component of Black-Scholes models for stock valuations. [2] B. Mandelbrot, The variation of certain speculative prices, J. Business, 36, 294 (1963)
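
    As a reminder of the magnitude-frequency power law invoked here, a minimal sketch of a Gutenberg-Richter fit on synthetic magnitudes follows; the synthetic catalogue, completeness threshold, and use of the Aki maximum-likelihood estimator are assumptions made only for illustration.

    ```python
    import numpy as np

    # Gutenberg-Richter: log10 N(>=M) = a - b*M. For magnitudes above a
    # completeness threshold Mc this implies an exponential distribution, and
    # the Aki (1965) maximum-likelihood b-value is log10(e) / (mean(M) - Mc).
    rng = np.random.default_rng(0)
    b_true, mc = 1.0, 2.5
    mags = mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)

    b_hat = np.log10(np.e) / (mags.mean() - mc)
    print(f"estimated b-value: {b_hat:.2f}")   # should recover ~1.0
    ```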

  6. An efficient repeating signal detector to investigate earthquake swarms

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2016-08-01

    Repetitive earthquake swarms have been recognized as key signatures in fluid-injection-induced seismicity, precursors to volcanic eruptions, and slow slip events preceding megathrust earthquakes. We investigate earthquake swarms by developing a Repeating Signal Detector (RSD), a computationally efficient algorithm utilizing agglomerative clustering to identify similar waveforms buried in years of seismic recordings using a single seismometer. Instead of relying on existing earthquake catalogs of larger earthquakes, RSD identifies characteristic repetitive waveforms by rapidly flagging signals of interest above a low signal-to-noise ratio and then grouping them based on spectral and time-domain characteristics, resulting in dramatically shorter processing time than more exhaustive autocorrelation approaches. We investigate seismicity in four regions using RSD: (1) volcanic seismicity at Mammoth Mountain, California, (2) subduction-related seismicity in Oaxaca, Mexico, (3) induced seismicity in Central Alberta, Canada, and (4) induced seismicity in Harrison County, Ohio. In each case, RSD detects a similar or larger number of earthquakes than existing catalogs created using more time-intensive methods. In Harrison County, RSD identifies 18 seismic sequences that correlate temporally and spatially with separate hydraulic fracturing operations, 15 of which were previously unreported. RSD utilizes a single seismometer for earthquake detection, which enables seismicity to be quickly identified in poorly instrumented regions at the expense of relying on another method to locate the new detections. Due to its small computational overhead and success at distances up to ~50 km, RSD is well suited for real-time detection of low-magnitude earthquake swarms with permanent regional networks.
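
    A minimal sketch of the waveform-grouping step (not the RSD implementation itself): equal-length, single-station snippets are linked by agglomerative clustering on a correlation-based distance. The zero-lag correlation measure and the 0.8 threshold are simplifying assumptions.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    def cluster_repeaters(snippets, cc_threshold=0.8):
        """Group equal-length, single-station waveform snippets whose normalized
        zero-lag correlation exceeds cc_threshold, using average-linkage
        agglomerative clustering. Illustrative only, not the RSD code."""
        x = np.asarray(snippets, dtype=float)
        x -= x.mean(axis=1, keepdims=True)
        x /= np.linalg.norm(x, axis=1, keepdims=True)
        cc = x @ x.T                                   # zero-lag correlation matrix
        dist = squareform(1.0 - cc, checks=False)      # condensed distance matrix
        tree = linkage(dist, method="average")
        return fcluster(tree, t=1.0 - cc_threshold, criterion="distance")
    ```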

  7. Measuring the effectiveness of earthquake forecasting in insurance strategies

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular earthquake forecasting methodology is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge the relative costs and benefits of predictions subjectively, we develop a simple method to determine whether the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. In each case, premiums are collected based on modelled technical risk costs, and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follows from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long-term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

  8. Earthquake clustering in modern seismicity and its relationship with strong historical earthquakes around Beijing, China

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Main, Ian G.; Musson, Roger M. W.

    2017-11-01

    Beijing, China's capital city, is located in a typical intraplate seismic belt, with relatively high-quality instrumental catalogue data available since 1970. The Chinese historical earthquake catalogue contains six strong historical earthquakes of Ms ≥ 6 around Beijing, the earliest in 294 AD. This poses a significant potential hazard to one of the most densely populated and economically active parts of China. In some intraplate areas, persistent clusters of events associated with historical events can occur over centuries, for example, the ongoing sequence in the New Madrid zone of the eastern US. Here we will examine the evidence for such persistent clusters around Beijing. We introduce a metric known as the `seismic density index' that quantifies the degree of clustering of seismic energy release. For a given map location, this multi-dimensional index depends on the number of events, their magnitudes, and the distances to the locations of the surrounding population of earthquakes. We apply the index to modern instrumental catalogue data between 1970 and 2014, and identify six clear candidate zones. We then compare these locations to earthquake epicentre and seismic intensity data for the six largest historical earthquakes. Each candidate zone contains one of the six historical events, and the location of peak intensity is within 5 km or so of the reported epicentre in five of these cases. In one case—the great Ms 8 earthquake of 1679—the peak is closer to the area of strongest shaking (Intensity XI or more) than the reported epicentre. The present-day event rates are similar to those predicted by the modified Omori law but there is no evidence of ongoing decay in event rates. Accordingly, the index is more likely to be picking out the location of persistent weaknesses in the lithosphere. Our results imply zones of high seismic density index could be used in principle to indicate the location of unrecorded historical or palaeoseismic events, in China and
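
    The abstract does not reproduce the index's exact definition, so the sketch below shows only one plausible form: magnitude-weighted contributions from surrounding events, damped with distance. The moment-style weighting and the 1/(r + r0) kernel are assumptions, not the authors' formula.

    ```python
    import numpy as np

    def seismic_density_index(grid_xy, event_xy, event_mag, r0=10.0):
        """Illustrative density index: at each map point, sum moment-style
        weights 10**(1.5*M) over surrounding events, damped by a 1/(r + r0)
        distance kernel (distances in km). The functional form is an
        assumption, not the index defined by Wang, Main & Musson."""
        grid_xy = np.atleast_2d(grid_xy)                 # (G, 2) map locations
        event_xy = np.atleast_2d(event_xy)               # (N, 2) epicentres
        w = 10.0 ** (1.5 * np.asarray(event_mag))        # ~ relative seismic moment
        r = np.linalg.norm(grid_xy[:, None, :] - event_xy[None, :, :], axis=2)
        return (w[None, :] / (r + r0)).sum(axis=1)       # one value per map point
    ```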

  9. Earthquake forecasts for the CSEP Japan experiment based on the RI algorithm

    NASA Astrophysics Data System (ADS)

    Nanjo, K. Z.

    2011-03-01

    An earthquake forecast testing experiment for Japan, the first of its kind, is underway within the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) under a controlled environment. Here we give an overview of the earthquake forecast models, based on the RI algorithm, which we have submitted to the CSEP Japan experiment. Models have been submitted to a total of 9 categories, corresponding to 3 testing classes (3 years, 1 year, and 3 months) and 3 testing regions. The RI algorithm is originally a binary forecast system based on the working assumption that large earthquakes are more likely to occur in the future at locations of higher seismicity in the past. It is based on simple counts of the number of past earthquakes, which is called the Relative Intensity (RI) of seismicity. To improve its forecast performance, we first expand the RI algorithm by introducing spatial smoothing. We then convert the RI representation from a binary system to a CSEP-testable model that produces forecasts for the number of earthquakes of predefined magnitudes. We use information on past seismicity to tune the parameters. The final submittal consists of 36 executable computer codes: 4 variants corresponding to different smoothing parameters for each of the 9 categories. They will help to elucidate which categories and which smoothing parameters are the most meaningful for the RI hypothesis. The main purpose of our participation in the experiment is to better understand the significance of the relative intensity of seismicity for earthquake forecastability in Japan.
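
    A toy sketch of the smoothed relative-intensity idea described above, assuming a gridded count of past earthquakes; the Gaussian smoothing kernel and the normalization are illustrative choices, not the submitted CSEP codes.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def ri_forecast(past_counts, n_forecast, sigma_cells=1.0):
        """Toy RI-style forecast: smooth the per-cell counts of past
        earthquakes with a Gaussian kernel, then scale the normalized map so
        that the expected numbers sum to n_forecast."""
        smoothed = gaussian_filter(np.asarray(past_counts, dtype=float),
                                   sigma=sigma_cells)
        weights = smoothed / smoothed.sum()
        return n_forecast * weights                      # expected events per cell
    ```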

  10. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  11. Remotely Triggered Earthquakes Recorded by EarthScope's Transportable Array and Regional Seismic Networks: A Case Study Of Four Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Cerda, I.; Linville, L.; Kilb, D. L.; Pankow, K. L.

    2013-05-01

    Changes in the stress field required to trigger earthquakes have been classified in two basic ways: static and dynamic triggering. Static triggering occurs when an earthquake releases accumulated strain along a fault and thereby stress-loads a nearby fault. Dynamic triggering occurs when an earthquake is induced by the passing seismic waves from a large mainshock located at least two fault lengths away from the triggered event. We investigate details of dynamic triggering using data collected from EarthScope's USArray and regional seismic networks located in the United States. Triggered events are identified using an optimized automated detector based on the ratio of the short-term to long-term average (Antelope software). Following the automated processing, the flagged waveforms are individually analyzed, in both the time and frequency domains, to determine whether the increased detection rates correspond to local earthquakes (i.e., potentially remotely triggered aftershocks). Here, we show results using this automated scheme applied to data from four large, but characteristically different, earthquakes: Chile (Mw 8.8, 2010), Tohoku-Oki (Mw 9.0, 2011), Baja California (Mw 7.2, 2010) and Wells, Nevada (Mw 6.0, 2008). For each of our four mainshocks, the number of detections within the 10 hour time windows spans a large range (1 to over 200), and statistically >20% of the waveforms show evidence of anomalous signals following the mainshock. The results will help provide a better understanding of the physical mechanisms involved in dynamic earthquake triggering and will help identify zones in the continental U.S. that may be more susceptible to dynamic earthquake triggering.
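
    The detections described above rest on the classic short-term-average to long-term-average ratio; a minimal single-trace sketch is shown below. The Antelope detector itself is more elaborate, and the window lengths and any trigger threshold here are assumptions.

    ```python
    import numpy as np

    def sta_lta(trace, dt, sta_win=1.0, lta_win=30.0):
        """Short-term-average / long-term-average ratio on a single trace.
        A simplified stand-in for the Antelope detector; window lengths are
        in seconds."""
        x = np.abs(np.asarray(trace, dtype=float))
        n_sta, n_lta = int(sta_win / dt), int(lta_win / dt)
        sta = np.convolve(x, np.ones(n_sta) / n_sta, mode="same")
        lta = np.convolve(x, np.ones(n_lta) / n_lta, mode="same")
        return sta / np.maximum(lta, 1e-12)

    # a detection would be flagged wherever the ratio exceeds a tuned threshold, e.g. 4
    ```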

  12. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    PubMed

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.

  13. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California

    PubMed Central

    Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.

    2011-01-01

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M≥4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M≥4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355
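
    One common way to score such gridded forecasts, shown here only as an illustration (the RELM evaluation used a fuller suite of CSEP likelihood tests, not this single number), is the joint Poisson log-likelihood of the observed per-cell counts.

    ```python
    import numpy as np
    from scipy.stats import poisson

    def forecast_log_likelihood(expected_counts, observed_counts):
        """Joint Poisson log-likelihood of the observed per-cell earthquake
        counts given a forecast of expected counts (one value per 0.1-degree
        cell). Higher values indicate a forecast more consistent with the
        observed locations."""
        lam = np.asarray(expected_counts, dtype=float)
        n = np.asarray(observed_counts)
        return poisson.logpmf(n, lam).sum()
    ```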

  14. Induced earthquake during the 2016 Kumamoto earthquake (Mw7.0): Importance of real-time shake monitoring for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Ogiso, M.

    2016-12-01

    The sequence of the 2016 Kumamoto earthquakes (Mw6.2 on April 14, Mw7.0 on April 16, and many aftershocks) caused devastating damage in Kumamoto and Oita prefectures, Japan. During the Mw7.0 event, just after the direct S waves passed central Oita, another M6-class event occurred there, more than 80 km away from the Mw7.0 rupture. The M6 event is interpreted as an induced earthquake, but it brought stronger shaking to central Oita than the Mw7.0 event did. We discuss the induced earthquake from the viewpoint of Earthquake Early Warning. In terms of ground shaking such as PGA and PGV, the contribution of the Mw7.0 event at central Oita is much smaller than that of the M6 induced earthquake (for example, about 8 times smaller for PGA at station OIT009), so it is easy to discriminate the two events. However, the PGD of the Mw7.0 event is larger than that of the induced earthquake, and it appears just before the occurrence of the induced earthquake. It is therefore quite difficult to recognize the induced earthquake from displacement waveforms alone, because the displacement is strongly contaminated by that of the preceding Mw7.0 event. In many EEW methods (including the current JMA EEW system), magnitude is used to predict ground shaking through a Ground Motion Prediction Equation (GMPE), and the magnitude is often estimated from displacement. However, displacement magnitude is not necessarily the best quantity for predicting ground shaking such as PGA and PGV. In the case of the induced earthquake during the Kumamoto sequence, displacement magnitude could not be estimated because of the strong contamination, and the JMA EEW system in fact could not recognize the induced earthquake. One of the important lessons learned from eight years of EEW operation is the issue of multiple simultaneous earthquakes, such as the aftershocks of the 2011 Mw9.0 Tohoku earthquake. Based on this lesson, we have proposed enhancement of real-time monitoring of ground shaking itself instead of rapid estimation of

  15. The 1748 Montesa (south-east Spain) earthquake, a singular event

    NASA Astrophysics Data System (ADS)

    Buforn, Elisa; Udías, Agustín; Sanz de Galdeano, Carlos

    2015-04-01

    The Montesa earthquakes of 1748 took place in the south-east region of the Iberian Peninsula, at a location somewhat outside the seismically active region of southern Spain. The main shock took place on 23 March and was followed by a series of aftershocks, the largest on 2 April. Despite the large number of documents describing the damage produced by this earthquake, it has not been the object of a detailed seismological study. Documents describe the damage in about 100 towns and villages over a wide area, and the earthquake was felt in Valencia, Alcoy and Cartagena. The castle of Montesa was totally destroyed and the town of Xàtiva suffered heavy damage. The source region with seismic intensity IX extends about 15 km from Sellent to Enguera, along a possible NE-SW-trending fault. This is a singular event because it occurred in an area assigned low seismic risk, where very few large earthquakes have happened in the past. This earthquake shows that a destructive earthquake may happen in this region in the future. The area affected by the earthquake today has a high level of industrial and tourist development.

  16. Heart attacks and the Newcastle earthquake.

    PubMed

    Dobson, A J; Alexander, H M; Malcolm, J A; Steele, P L; Miles, T A

    To test the hypothesis that stress generated by the Newcastle earthquake led to an increased risk of heart attack and coronary death, we conducted a natural experiment among people living in the Newcastle and Lake Macquarie local government areas of New South Wales, Australia. At 10.27 a.m. on 28 December 1989 Newcastle was struck by an earthquake measuring 5.6 on the Richter scale. The outcome measures were myocardial infarction and coronary death, defined by the criteria of the WHO MONICA Project, and hospital admissions for coronary disease, before and after the earthquake and in corresponding periods in previous years. Well established, concurrent data collection systems were used. There were six fatal myocardial infarctions and coronary deaths among people aged under 70 years after the earthquake in the period 28-31 December 1989. Compared with the average number of deaths at this time of year this was unusually high (P = 0.016). Relative risks for this four-day period were: fatal myocardial infarction and coronary death, 1.67 (95% confidence interval [CI]: 0.72, 3.17); non-fatal definite myocardial infarction, 1.05 (95% CI: 0.05, 2.22); non-fatal possible myocardial infarction, 1.34 (95% CI: 0.67, 1.91); hospital admissions for myocardial infarction or other ischaemic heart disease, 1.27 (95% CI: 0.83, 1.66). There was no evidence of increased risk during the following four months. The magnitude of increased risk of death was slightly less than that previously reported after earthquakes in Greece. The data provide weak evidence that acute emotional and physical stress may trigger myocardial infarction and coronary death.
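
    The kind of comparison behind the reported P = 0.016 can be sketched as a one-sided Poisson test of the observed four-day death count against the seasonal expectation; the expected count used below is hypothetical, not a figure from the study.

    ```python
    from scipy.stats import poisson

    # Illustrative one-sided Poisson test: probability of at least `observed`
    # coronary deaths in the four-day window when the seasonal average
    # predicts `expected`. The value of `expected` is hypothetical.
    observed, expected = 6, 2.0
    p_value = poisson.sf(observed - 1, expected)   # P(X >= observed)
    print(f"P(X >= {observed} | lambda = {expected}) = {p_value:.3f}")
    ```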

  17. Copy number variants implicate cardiac function and development pathways in earthquake-induced stress cardiomyopathy.

    PubMed

    Lacey, Cameron J; Doudney, Kit; Bridgman, Paul G; George, Peter M; Mulder, Roger T; Zarifeh, Julie J; Kimber, Bridget; Cadzow, Murray J; Black, Michael A; Merriman, Tony R; Lehnert, Klaus; Bickley, Vivienne M; Pearson, John F; Cameron, Vicky A; Kennedy, Martin A

    2018-05-15

    The pathophysiology of stress cardiomyopathy (SCM), also known as takotsubo syndrome, is poorly understood. SCM usually occurs sporadically, often in association with a stressful event, but clusters of cases are reported after major natural disasters. There is some evidence that this is a familial condition. We have examined three possible models for an underlying genetic predisposition to SCM. Our primary study cohort consists of 28 women who suffered SCM as a result of two devastating earthquakes that struck the city of Christchurch, New Zealand, in 2010 and 2011. To seek possible underlying genetic factors we carried out exome analysis, genotyping array analysis, and array comparative genomic hybridization on these subjects. The most striking finding was the observation of a markedly elevated rate of rare, heterogeneous copy number variants (CNV) of uncertain clinical significance (in 12/28 subjects). Several of these CNVs impacted on genes of cardiac relevance including RBFOX1, GPC5, KCNRG, CHODL, and GPBP1L1. There is no physical overlap between the CNVs, and the genes they impact do not appear to be functionally related. The recognition that SCM predisposition may be associated with a high rate of rare CNVs offers a novel perspective on this enigmatic condition.

  18. Acute Myocardial Infarction and Stress Cardiomyopathy following the Christchurch Earthquakes

    PubMed Central

    Chan, Christina; Elliott, John; Troughton, Richard; Frampton, Christopher; Smyth, David; Crozier, Ian; Bridgman, Paul

    2013-01-01

    Background: Christchurch, New Zealand, was struck by 2 major earthquakes, at 4:36 am on 4 September 2010 (magnitude 7.1) and at 12:51 pm on 22 February 2011 (magnitude 6.3). Both events caused widespread destruction. Christchurch Hospital was the region's only acute care hospital. It remained functional following both earthquakes. We were able to examine the effects of the 2 earthquakes on acute cardiac presentations. Methods: Patients admitted under Cardiology in Christchurch Hospital 3 weeks prior to and 5 weeks following both earthquakes were analysed, with corresponding control periods in September 2009 and February 2010. Patients were categorised based on diagnosis: ST elevation myocardial infarction, non ST elevation myocardial infarction, stress cardiomyopathy, unstable angina, stable angina, non-cardiac chest pain, arrhythmia and others. Results: There was a significant increase in overall admissions (p<0.003), ST elevation myocardial infarction (p<0.016), and non-cardiac chest pain (p<0.022) in the first 2 weeks following the early morning September earthquake. This pattern was not seen after the early afternoon February earthquake. Instead, there was a very large number of stress cardiomyopathy admissions, with 21 cases (95% CI 2.6–6.4) in 4 days. There had been 6 stress cardiomyopathy cases after the first earthquake (95% CI 0.44–2.62). Statistical analysis showed this to be a significant difference between the earthquakes (p<0.05). Conclusion: The early morning September earthquake triggered a large increase in ST elevation myocardial infarction and a few stress cardiomyopathy cases. The early afternoon February earthquake caused significantly more stress cardiomyopathy. Two major earthquakes occurring at different times of day differed in their effect on acute cardiac events. PMID:23844213

  19. Are there new findings in the search for ULF magnetic precursors to earthquakes?

    NASA Astrophysics Data System (ADS)

    Masci, F.; Thomas, J. N.

    2015-12-01

    Moore (1964) in a letter published in Nature reported disturbances in geomagnetic field data prior to the 27 March 1964 Alaska earthquake. After the publication of this report, many papers have shown magnetic changes preceding earthquakes. However, a causal relationship between preearthquake magnetic changes and impending earthquakes has never been demonstrated. As a consequence, after 50 years, magnetic disturbances in the geomagnetic field are still candidate precursory phenomena. Some researchers consider the investigation of ultra low frequency (ULF: 0.001-10 Hz) magnetic data the correct approach for identifying precursory signatures of earthquakes. Other researchers, instead, have recently reviewed many published ULF magnetic changes that preceded earthquakes and have shown that these are not actual precursors. The recent studies by Currie and Waters (2014) and Han et al. (2014) aim to provide relevant new findings in the search for ULF magnetic precursory signals. However, in order to contribute to science, alleged precursors must be shown to be valid and reproducible by objective testing. Here we will briefly discuss the state of the art in the search for ULF magnetic precursors, paying special attention to the recent findings of Currie and Waters (2014) and Han et al. (2014). We do not see in these two reports significant evidence that may support the observation of precursory signatures of earthquakes in ULF magnetic records.

  20. Lessons Learned about Best Practices for Communicating Earthquake Forecasting and Early Warning to Non-Scientific Publics

    NASA Astrophysics Data System (ADS)

    Sellnow, D. D.; Sellnow, T. L.

    2017-12-01

    Earthquake scientists are without doubt experts in understanding earthquake probabilities, magnitudes, and intensities, as well as their potential consequences for community infrastructures and inhabitants. One critical challenge these scientific experts face, however, rests with communicating what they know to the people they want to help. Helping scientists translate scientific information for non-scientists is something Drs. Tim and Deanna Sellnow have been committed to for decades. As such, they have compiled a host of data-driven best practices for communicating effectively with non-scientific publics about earthquake forecasting, probabilities, and warnings. In this session, they will summarize what they have learned as it may help earthquake scientists, emergency managers, and other key spokespersons share these important messages with disparate publics in ways that result in positive outcomes, the most important of which is saving lives.

  1. Earthquakes, September-October 1986

    USGS Publications Warehouse

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  2. Post earthquake recovery in natural gas systems--1971 San Fernando Earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, W.T. Jr.

    1983-01-01

    In this paper a concise summary of the post earthquake investigations for the 1971 San Fernando Earthquake is presented. The effects of the earthquake upon building and other above ground structures are briefly discussed. Then the damages and subsequent repairs in the natural gas systems are reported.

  3. The Earthquake Closet: Making Early-Warning Useful

    NASA Astrophysics Data System (ADS)

    Wyss, M.; Trendafiloski, G.

    2009-12-01

    Early warning of approaching strong shaking that could have fatal consequences is a research field that has made great progress. It makes it possible to reduce the impact on dangerous processes in critical facilities and on trains. However, its potential to save lives has a serious Achilles heel: in many cities the time for getting to safety is only five to 10 seconds. Occupants of upper floors cannot get out of their buildings, and narrow streets are not a safe place in strong earthquakes for people who might be able to exit. Thus, only about 10% of a city's population can benefit from early warnings, unless they have access to their own earthquake closet that is strong enough to remain intact in a collapsing building. Such an Earthquake Protection Unit (EPU) may be installed in the structurally strongest part of an existing apartment at low cost. In new constructions, we propose that an earthquake shelter be constructed for each floor, large enough to accommodate all occupants of that floor. These EPUs should be constructed on top of each other, forming a strong tower, next to the elevator shaft and the staircase, at the center of the building. If an EPU with structural properties equivalent to an E-class building is placed into a B-class building in South America, for example, we estimate that the chances of surviving shaking of intensity VII are about 30,000 times better inside the closet. The probability of escaping injury inside compared to outside we estimate as about 1,500 times better. Educating the population regarding the usefulness of EPUs will be essential, and P-waves can be used as the early warning signal. The owner of an earthquake closet can easily be motivated to take protective measures when these involve simply stepping into the closet, rather than attempting to exit the building by running down many flights of stairs. Our intention is to start a discussion of how best to construct EPUs and how to introduce legislation that will

  4. Earthquakes; July-August, 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California. 

  5. Factors Affecting Household Adoption of an Evacuation Plan in American Samoa after the 2009 Earthquake and Tsunami

    PubMed Central

    Gregg, Chris E; Richards, Kasie; Sorensen, Barbara Vogt; Wang, Liang

    2013-01-01

    American Samoa is still recovering from the debilitating consequences of the September 29, 2009 tsunami. Little is known about current household preparedness in American Samoa for future earthquakes and tsunamis. Thus, this study sought to enumerate the number of households with an earthquake and tsunami evacuation plan and to identify predictors of having a household evacuation plan through a post-tsunami survey conducted in July 2011. Members of 300 households were interviewed in twelve villages spread across regions of the principal island of Tutuila. Multiple logistic regression showed that being male, having lived in one's home for < 30 years, and having a friend who suffered damage to his or her home during the 2009 tsunami event increased the likelihood of having a household evacuation plan. The prevalence of tsunami evacuation planning was 35%, indicating that survivors might feel that preparation is not necessary given effective adaptive responses during the 2009 event. Results suggest that emergency planners and public health officials should continue with educational outreach to families to spread awareness of the importance of developing plans for future earthquakes and tsunamis to help mitigate human and structural loss from such natural disasters. Additional research is needed to better understand the linkages between pre-event planning and effective evacuation responses as were observed in the 2009 events. PMID:24349889

  6. Factors affecting household adoption of an evacuation plan in American Samoa after the 2009 earthquake and tsunami.

    PubMed

    Apatu, Emma J I; Gregg, Chris E; Richards, Kasie; Sorensen, Barbara Vogt; Wang, Liang

    2013-08-01

    American Samoa is still recovering from the debilitating consequences of the September 29, 2009 tsunami. Little is known about current household preparedness in American Samoa for future earthquakes and tsunamis. Thus, this study sought to enumerate the number of households with an earthquake and tsunami evacuation plan and to identify predictors of having a household evacuation plan through a post-tsunami survey conducted in July 2011. Members of 300 households were interviewed in twelve villages spread across regions of the principal island of Tutuila. Multiple logistic regression showed that being male, having lived in one's home for < 30 years, and having a friend who suffered damage to his or her home during the 2009 tsunami event increased the likelihood of having a household evacuation plan. The prevalence of tsunami evacuation planning was 35%, indicating that survivors might feel that preparation is not necessary given effective adaptive responses during the 2009 event. Results suggest that emergency planners and public health officials should continue with educational outreach to families to spread awareness of the importance of developing plans for future earthquakes and tsunamis to help mitigate human and structural loss from such natural disasters. Additional research is needed to better understand the linkages between pre-event planning and effective evacuation responses as were observed in the 2009 events.

  7. Improvements of the offshore earthquake locations in the Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Chen, Ta-Yi; Hsu, Hsin-Chih

    2017-04-01

    Since 2014 the Earthworm Based Earthquake Alarm Reporting (eBEAR) system has been in operation and has been used to issue warnings to schools. In 2015 the system started to provide warnings to the public in Taiwan via television and cell phones. Online performance of the eBEAR system indicates that the average reporting times afforded by the system are approximately 15 and 28 s for inland and offshore earthquakes, respectively. On average the eBEAR system can provide more warning time than the current EEW system (3.2 s and 5.5 s for inland and offshore earthquakes, respectively). However, offshore earthquakes are usually located poorly because only P-wave arrivals are used in the eBEAR system. Additionally, in the early stage of an earthquake alert only a few stations are available, and the poor station coverage may explain why offshore earthquakes are difficult to locate accurately. In Geiger's inversion procedure for earthquake location, an initial hypocenter and origin time must be supplied to the location program. For the initial hypocenter, we defined a set of test locations in the offshore area instead of using the average of the locations of the triggered stations. We ran 20 instances of Geiger's method concurrently, each with a different pre-defined initial position, to locate earthquakes. We assume that if a pre-defined initial position is close to the true earthquake location, that instance should converge in less processing time than the others during the iteration procedure. The results show that using pre-defined trial hypocenters in the inversion procedure improves the location accuracy of offshore earthquakes. In particular, in the initial stage of an EEW alert, locating earthquakes with only 3 or 5 stations may give poor results because of poor station coverage. In this study, the pre-defined trial locations provide a feasible way to improve the estimations of
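
    A simplified sketch of seeding the location with pre-defined trial points follows. Note that it ranks trial epicentres by their least-squares P-arrival misfit in a homogeneous half-space, rather than by the convergence time of Geiger's iteration as in the study; the geometry, velocity, and ranking criterion are all illustrative assumptions.

    ```python
    import numpy as np

    def best_trial_epicenter(station_xy, p_arrivals, trial_xy, vp=6.0):
        """Score pre-defined trial epicentres by least-squares P-arrival
        misfit in a homogeneous half-space (coordinates in km, times in s).
        Illustrative stand-in for seeding Geiger's method with trial points."""
        station_xy = np.asarray(station_xy, dtype=float)   # (N, 2)
        p_arrivals = np.asarray(p_arrivals, dtype=float)   # (N,)
        best_xy, best_rms = None, np.inf
        for xy in np.atleast_2d(trial_xy):
            tt = np.linalg.norm(station_xy - xy, axis=1) / vp
            t0 = np.mean(p_arrivals - tt)                  # best-fit origin time
            rms = np.sqrt(np.mean((p_arrivals - (t0 + tt)) ** 2))
            if rms < best_rms:
                best_xy, best_rms = xy, rms
        return best_xy, best_rms
    ```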

  8. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    NASA Astrophysics Data System (ADS)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously been broken by a similar earthquake in 1751, but several events in the magnitude 8 range occurred in the area, principally the 1835 event already mentioned and, more recently, events on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north is undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the area has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  9. Machine Learning Seismic Wave Discrimination: Application to Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Li, Zefeng; Meier, Men-Andrin; Hauksson, Egill; Zhan, Zhongwen; Andrews, Jennifer

    2018-05-01

    Performance of earthquake early warning systems suffers from false alerts caused by local impulsive noise from natural or anthropogenic sources. To mitigate this problem, we train a generative adversarial network (GAN) to learn the characteristics of first-arrival earthquake P waves, using 300,000 waveforms recorded in southern California and Japan. We apply the GAN critic as an automatic feature extractor and train a Random Forest classifier with about 700,000 earthquake and noise waveforms. We show that the discriminator can recognize 99.2% of the earthquake P waves and 98.4% of the noise signals. This state-of-the-art performance is expected to reduce significantly the number of false triggers from local impulsive noise. Our study demonstrates that GANs can discover a compact and effective representation of seismic waves, which has the potential for wide applications in seismology.
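
    A sketch of the second, classification stage only: a Random Forest trained on fixed-length feature vectors standing in for the GAN-critic outputs. The features below are random placeholders, so the printed accuracy is meaningless; the GAN critic itself is not shown.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    features = rng.normal(size=(2000, 64))      # placeholder "critic" features
    labels = rng.integers(0, 2, size=2000)      # 1 = earthquake P wave, 0 = noise

    X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
    ```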

  10. Landslide Distribution, Damage and Land Use Interactions During the 2004 Chuetsu Earthquake

    NASA Astrophysics Data System (ADS)

    Sidle, R. C.; Trandafir, A. C.; Kamai, T.

    2005-05-01

    A series of earthquakes struck Niigata Prefecture, Japan, on 23 October 2004, killing about 40 people and injuring about 3000. These earthquakes were characterized by a shallow focal depth (13 km) that generated strong levels of ground motion, resulting in extensive damage and thousands of landslides throughout the region. Most landslides on natural slopes occurred in the regional geological structure consisting of sandy siltstone and thin-bedded alternations of sandstone and siltstone. Earthquakes exacerbate such potential instabilities through the induced ground motion and the enhancement of pore water pressure in wet regoliths. The three strongest earthquakes occurred within a period of less than 40 minutes and had sequential magnitudes (JMA) of 6.8, 6.3, and 6.5. The highest density of landslides (12/km2) was mapped within a 2.9 km radius of the M6.5 epicenter near Yamakoshi village, about 4 times higher than the densities around the other epicenters located to the east and west. This higher density may be a consequence of the cumulative shaking effects associated with the two earlier earthquakes of M6.8 and 6.3, in addition to the topographic and geologic factors controlling the stability of the region. Roads, residential fills, agricultural terraces on hillslopes, and other earthworks increased the susceptibility of sites to slope failure. Numerous earthquake-induced failures in terraces and adjacent hillslopes around rice paddy fields occurred near Yamakoshi village. A housing development in Nagaoka city constructed on an old earthflow suffered severe damage to fill slopes during the earthquake. Nearly saturated conditions in these deep fills, together with poor drainage systems, contributed to the landslide damage. Clearly, land use activities in rural and urban areas exacerbated the extent of earthquake-triggered landslides.

  11. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    NASA Astrophysics Data System (ADS)

    Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.

    2012-11-01

    results in a margin stratigraphy of minor MTDs compared to the turbidite-system deposits. In contrast, the MTDs and turbidites are equally intermixed on basin floors along passive margins with a mud-rich continental slope, such as the northern Gulf of Mexico. Great earthquakes also result in characteristic seismo-turbidite lithology. Along the Cascadia margin, the number and character of multiple coarse pulses for correlative individual turbidites generally remain constant both upstream and downstream in different channel systems for 600 km along the margin. This suggests that the earthquake shaking or aftershock signature is normally preserved, for the stronger (Mw ≥ 9) Cascadia earthquakes. In contrast, the generally weaker (Mw ≤ 8) California earthquakes result in upstream simple fining-up turbidites in single tributary canyons and channels; however, downstream mainly stacked turbidites result from synchronously triggered multiple turbidity currents that deposit in channels below confluences of the tributaries. Consequently, both downstream channel confluences and the strongest (Mw ≥ 9) great earthquakes contribute to multi-pulsed and stacked turbidites that are typical for seismo-turbidites generated by a single great earthquake. Earthquake triggering and multi-pulsed or stacked turbidites also become an alternative explanation for amalgamated turbidite beds in active tectonic margins, in addition to other classic explanations. The sedimentologic characteristics of turbidites triggered by great earthquakes along the Cascadia and northern California margins provide criteria to help distinguish seismo-turbidites in other active tectonic margins.

  12. Prospective Evaluation of the Global Earthquake Activity Rate Model (GEAR1) Earthquake Forecast: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Strader, Anne; Schorlemmer, Danijel; Beutin, Thomas

    2017-04-01

    The Global Earthquake Activity Rate Model (GEAR1) is a hybrid seismicity model, constructed from a loglinear combination of smoothed seismicity from the Global Centroid Moment Tensor (CMT) earthquake catalog and geodetic strain rates (Global Strain Rate Map, version 2.1). For the 2005-2012 retrospective evaluation period, GEAR1 outperformed both parent strain rate and smoothed seismicity forecasts. Since 1 October 2015, GEAR1 has been prospectively evaluated by the Collaboratory for the Study of Earthquake Predictability (CSEP) testing center. Here, we present initial one-year test results for GEAR1, GSRM and GSRM2.1, as well as a localized evaluation of GEAR1 performance. The models were evaluated on the consistency in number (N-test), spatial (S-test) and magnitude (M-test) distribution of forecasted and observed earthquakes, as well as overall data consistency (CL-, L-tests). Performance at target earthquake locations was compared between models using the classical paired T-test and its non-parametric equivalent, the W-test, to determine if one model could be rejected in favour of another at the 0.05 significance level. For the evaluation period from 1 October 2015 to 1 October 2016, the GEAR1, GSRM and GSRM2.1 forecasts pass all CSEP likelihood tests. Comparative test results show statistically significant improvement of GEAR1 performance over both strain rate-based forecasts, both of which can be rejected in favour of GEAR1. Using point-process residual analysis, we investigate the spatial distribution of differences in GEAR1, GSRM and GSRM2.1 model performance, to identify regions where the GEAR1 model should be adjusted that could not be inferred from CSEP test results. Furthermore, we investigate whether the optimal combination of smoothed seismicity and strain rates remains stable over space and time.
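
    The number test mentioned above can be sketched compactly under the Poisson assumption; a forecast is typically judged inconsistent when either quantile below is very small. This is a simplified version of the CSEP N-test, not the testing centre's code.

    ```python
    from scipy.stats import poisson

    def csep_n_test(forecast_total, observed_total):
        """Simplified CSEP-style N-test under the Poisson assumption:
        delta1 = P(N >= N_obs) and delta2 = P(N <= N_obs) given the forecast's
        total expected number of target earthquakes."""
        delta1 = poisson.sf(observed_total - 1, forecast_total)
        delta2 = poisson.cdf(observed_total, forecast_total)
        return delta1, delta2
    ```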

  13. Earthquakes; March-April 1975

    USGS Publications Warehouse

    Person, W.J.

    1975-01-01

    There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela and approximately 35 earthquake-related injuries were reported around the world. In the United States a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971. 

  14. The January 2006 Volcanic-Tectonic Earthquake Swarm at Mount Martin, Alaska

    USGS Publications Warehouse

    Dixon, James P.; Power, John A.

    2009-01-01

    On January 8, 2006, a swarm of volcanic-tectonic earthquakes began beneath Mount Martin at the southern end of the Katmai volcanic cluster. This was the first recorded swarm at Mount Martin since continuous seismic monitoring began in 1996. The number of located earthquakes increased during the next four days, reaching a peak on January 11. For the next two days, the seismic activity decreased, and on January 14, the number of events increased to twice the previous day's total. Following this increase in activity, seismicity declined, returning to background levels by the end of the month. The Alaska Volcano Observatory located 860 earthquakes near Mount Martin during January 2006. No additional signs of volcanic unrest were noted in association with this earthquake swarm. The earthquakes in the Mount Martin swarm, relocated using the double difference technique, formed an elongated cluster dipping to the southwest. Focal mechanisms beneath Mount Martin show a mix of normal, thrust, and strike-slip solutions, with normal focal mechanisms dominating. For earthquakes more than 1 km from Mount Martin, all focal mechanisms showed normal faulting. The calculated b-value for the Mount Martin swarm was 0.98 and showed no significant change before, during, or after the swarm. The triggering mechanism for the Mount Martin swarm is unknown. The time-history of earthquake occurrence is indicative of a volcanic cause; however, there were no low-frequency events or other observations, such as increased steaming, associated with the swarm. During the swarm, there was no change in the b-value, and the distribution and type of focal mechanisms were similar to those in the period before the anomalous activity. The short duration of the swarm, the similarity in observed focal mechanisms, and the lack of additional signs of unrest suggest this swarm did not result from a large influx of magma within the shallow crust beneath Mount Martin.

  15. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated to be of M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess the tsunami hazard (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough using a probabilistic approach, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment proceeds as follows. (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSAs) that ERC (2013) identified; the characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving the nonlinear, finite-amplitude, long-wave equations with advection and bottom-friction terms by a finite-difference method; run-up computation on land is included. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with this T and an aperiodicity of alpha = 0.24. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in that subgroup. Note that this re-distribution of probability is only tentative, because present seismology cannot provide knowledge deep enough to do it rigorously; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coasts by integrating the information about 30-year occurrence
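
    The renewal-model step (3) can be sketched with the inverse-Gaussian (BPT) distribution. The mean interval and aperiodicity below are the abstract's values, while the elapsed time since the last Nankai event is an assumption made only for this example.

    ```python
    from scipy.stats import invgauss

    def conditional_p30(mean_interval, aperiodicity, elapsed, window=30.0):
        """Conditional probability of an event within `window` years given the
        time already elapsed since the last event, under a BPT renewal model.
        BPT is the inverse-Gaussian distribution; in scipy's parameterization
        the shape parameter is aperiodicity**2 and the scale is mean/alpha**2."""
        dist = invgauss(mu=aperiodicity**2, scale=mean_interval / aperiodicity**2)
        return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

    # T = 88.2 yr and alpha = 0.24 are the abstract's values; the elapsed time
    # (years since the 1944/1946 Nankai events, as of 2013) is assumed here.
    print(f"P30 ~ {conditional_p30(88.2, 0.24, 68.0):.2f}")
    ```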

  16. Protecting your family from earthquakes: The seven steps to earthquake safety

    USGS Publications Warehouse

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  17. Fear based Education or Curiosity based Education as an Example of Earthquake and Natural Disaster Education: Results of Statistical Study in Primary Schools in Istanbul-Turkey

    NASA Astrophysics Data System (ADS)

    Ozcep, T.; Ozcep, F.

    2012-04-01

    Natural disaster reduction focuses on the urgent need for prevention activities to reduce loss of life, damage to property, infrastructure and environment, and the social and economic disruption caused by natural hazards. One of the most important factors in reducing the potential damage of earthquakes is trained manpower. Understanding the causes of earthquakes and other natural phenomena (landslides, avalanches, floods, volcanoes, etc.) is one of the preconditions for conscious behaviour. The aim of the study is to analyse and investigate how earthquakes and other natural phenomena are perceived by students, the possible consequences of this perception, and its effect on reducing earthquake damage. One crucial question is whether our education system is fear based or curiosity based. The damage caused by earthquakes has made them appear a frightening subject; indeed, because of these effects, earthquakes are perceived as scary phenomena. In the first stage of the project, the levels of learning (or perception) about earthquakes and other natural disasters among primary school students are investigated with a survey. The aim of this survey is to determine whether the students take a fear-based or a curiosity-based approach to earthquakes and other natural events. In the second stage of the project, the survey results are evaluated statistically. A questionnaire on earthquakes and natural disasters was administered to primary school students (approximately 700 pupils in total) to measure curiosity and/or fear levels. The questionnaire consists of 17 questions related to natural disasters, including: "What is an earthquake?", "What is the power behind an earthquake?", "What is the mental response during an earthquake?", "Did we learn lessons from the earthquake's results?", "Are you afraid of earthquake

  18. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M > 6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  19. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    PubMed Central

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467

  20. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part A, Prehistoric earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax, the maximum earthquake magnitude thought to be possible within a specified geographic region. This report is Part A of an Open-File Report that describes the construction of a global catalog of moderate to large earthquakes, from which one can estimate Mmax for most of the Central and Eastern United States and adjacent Canada. The catalog and Mmax estimates derived from it were used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. This Part A discusses prehistoric earthquakes that occurred in eastern North America, northwestern Europe, and Australia, whereas a separate Part B deals with historical events.

  1. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    USGS Publications Warehouse

    Kilb, Debi; Gomberg, J.

    1999-01-01

    We examine the initial subevent (ISE) of the Mw 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

  2. New Field Observations About 19 August 1966 Varto earthquake, Eastern Turkey

    NASA Astrophysics Data System (ADS)

    Gurboga, S.

    2013-12-01

    Some destructive earthquakes of the past, and even of the recent period, still hold several mysteries: their magnitude, epicenter location, faulting type, and source fault have not yet been determined. One of these events is the 19 August 1966 Varto earthquake in Turkey. The 19 August 1966 Varto earthquake (Ms = 6.8) was an extraordinary event about 40 km east of the junction between the NAFS and the EAFS, the two seismogenic systems and active structures shaping the tectonics of Turkey. The earthquake was sourced from the Varto fault zone, which is approximately 4 km wide and 43 km long. It consists of parallel to sub-parallel, closely spaced, north- and south-dipping faults with dips of up to 85°-88°. Although this event, at Ms 6.8, was large enough to create a surface rupture, no clear surface deformation had been detected, which makes the source fault and the mechanism of the earthquake controversial. According to Wallace (1968), the faulting is right-lateral. On the other hand, McKenzie (1972) proposed right-lateral movement with a thrust component on the basis of the focal mechanism solution. Recent work by Sançar et al. (2011) claimed that the faulting is pure right-lateral strike-slip and that there was no surface rupture during the earthquake; furthermore, they suggested that the Varto segment of the Varto Fault Zone most probably did not break in the 1966 earthquake. This study focuses purely on field geology and trenching surveys for the investigation of the 1966 Varto earthquake. Four fault segments have been mapped along the Varto fault zone: the Varto, Sazlica, Leylekdağ and Çayçati segments. Because of the thick volcanic cover in the area around Varto, surface rupture could only be detected by trenching. Two trenches were excavated along the Yayikli and Ağaçalti faults in the Varto fault zone. Consequently, detailed geological work in the field and the trenching survey indicate that

  3. Compiling an earthquake catalogue for the Arabian Plate, Western Asia

    NASA Astrophysics Data System (ADS)

    Deif, Ahmed; Al-Shijbi, Yousuf; El-Hussain, Issa; Ezzelarab, Mohamed; Mohamed, Adel M. E.

    2017-10-01

    The Arabian Plate is surrounded by regions of relatively high seismicity. Accounting for this seismicity is of great importance for seismic hazard and risk assessments, seismic zoning, and land use. In this study, a homogeneous moment-magnitude (Mw) earthquake catalogue for the Arabian Plate is provided. The comprehensive and homogeneous catalogue spatially covers the entire Arabian Peninsula and neighboring areas, including all earthquake sources that can generate substantial hazard for the Arabian Plate mainland. The catalogue extends in time from 19 to 2015, with a total of 13,156 events, of which 497 are historical. Four polygons covering the entire Arabian Plate were delineated, and different data sources, including special studies and local, regional and international catalogues, were used to prepare the earthquake catalogue. Moment magnitudes (Mw) provided by the original sources were given the highest priority among magnitude types and were introduced into the catalogue with their references. Earthquakes with magnitudes on scales other than Mw were converted to this scale using empirical relationships derived in the current or in previous studies. The four polygon catalogues were merged into two comprehensive earthquake catalogues covering the historical and instrumental periods. Duplicate events were identified and discarded from the current catalogue. The catalogue was then declustered so that it contains only independent events, and its completeness with time for different magnitude ranges was investigated.
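
    The homogenization step described above converts magnitudes reported on other scales to Mw using empirical linear relations. The sketch below shows only the form of such a conversion; the coefficients are illustrative placeholders, not the relationships derived in the study.

```python
def to_mw(magnitude, scale, coeffs):
    """Convert a non-Mw magnitude to Mw with a linear empirical relation.

    coeffs maps a scale name to (a, b) in  Mw = a * M + b.  The coefficients
    used below are hypothetical placeholders, not the relations derived for
    the Arabian Plate catalogue.
    """
    a, b = coeffs[scale]
    return a * magnitude + b


# Assumed example coefficients, for illustration only.
EXAMPLE_COEFFS = {"mb": (0.85, 1.03), "Ms": (0.67, 2.12)}

print(round(to_mw(5.4, "Ms", EXAMPLE_COEFFS), 2))  # -> 5.74 with these placeholder coefficients
```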

  4. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
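
    The counting idea behind the seismicity-based forecast can be sketched as follows. This is a minimal illustration of a Natural-Time-Weibull-style calculation, not the operational openhazards.com/quakesim.org implementation; the Weibull shape, the expected count and the example numbers are all assumptions.

```python
import math

def ntw_probability(n_small, n_expected, shape=1.3):
    """Natural-Time-Weibull-style sketch (all parameters are assumptions).

    n_small    -- small earthquakes counted since the last large event
    n_expected -- assumed mean number of small events per large-earthquake cycle
    shape      -- assumed Weibull shape parameter

    Returns the Weibull CDF evaluated in 'natural time' (event counts),
    read as the probability that the current cycle has run its course.
    """
    return 1.0 - math.exp(-((n_small / n_expected) ** shape))


# Hypothetical example: 850 small events counted against an assumed
# mean of 1000 per large-earthquake cycle.
print(round(ntw_probability(850, 1000), 2))
```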

  5. Identification of Deep Earthquakes

    DTIC Science & Technology

    2010-09-01

    discriminants that will reliably separate small, crustal earthquakes (magnitudes less than about 4 and depths less than about 40 to 50 km) from small...characteristics on discrimination plots designed to separate nuclear explosions from crustal earthquakes. Thus, reliably flagging these small, deep events is...Further, reliably identifying subcrustal earthquakes will allow us to eliminate deep events (previously misidentified as crustal earthquakes) from

  6. The impact of the Canterbury earthquakes on successful school leaving for adolescents.

    PubMed

    Beaglehole, Ben; Bell, Caroline; Frampton, Christopher; Moor, Stephanie

    2017-02-01

    To examine the impact of the Canterbury earthquakes on the important adolescent transition period of school leaving. Local and national data on school leaving age, attainment of National Certificate of Educational Achievement (NCEA) standards, and school rolls (total registered students for schools) were examined to clarify long-term trends and delineate these from any impacts of the Canterbury earthquakes.  Results: Despite concerns about negative impacts, there was no evidence for increased school disengagement or poorer academic performance by students as a consequence of the earthquakes. Although there may have been negative effects for a minority, the possibility of post-disaster growth and resilience being the norm for the majority meant that negative effects on school leaving were not observed following the earthquakes. A range of post-disaster responses may have mitigated adverse effects on the adolescent population. Implications for Public Health: Overall long-term negative effects are unlikely for the affected adolescent population. The results also indicate that similar populations exposed to disasters in other settings are likely to do well in the presence of a comprehensive post-disaster response. © 2016 The Authors.

  7. Fault lubrication during earthquakes.

    PubMed

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s(-1)) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s(-1). The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  8. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    PubMed

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  9. Earthquake Early Warning: User Education and Designing Effective Messages

    NASA Astrophysics Data System (ADS)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

  10. Constraints on the source parameters of low-frequency earthquakes on the San Andreas Fault

    USGS Publications Warehouse

    Thomas, Amanda M.; Beroza, Gregory C.; Shelly, David R.

    2016-01-01

    Low-frequency earthquakes (LFEs) are small repeating earthquakes that occur in conjunction with deep slow slip. Like typical earthquakes, LFEs are thought to represent shear slip on crustal faults, but when compared to earthquakes of the same magnitude, LFEs are depleted in high-frequency content and have lower corner frequencies, implying longer duration. Here we exploit this difference to estimate the duration of LFEs on the deep San Andreas Fault (SAF). We find that the M ~ 1 LFEs have typical durations of ~0.2 s. Using the annual slip rate of the deep SAF and the average number of LFEs per year, we estimate average LFE slip rates of ~0.24 mm/s. When combined with the LFE magnitude, this number implies a stress drop of ~10^4 Pa, 2 to 3 orders of magnitude lower than ordinary earthquakes, and a rupture velocity of 0.7 km/s, 20% of the shear wave speed. Typical earthquakes are thought to have rupture velocities of ~80–90% of the shear wave speed. Together, the slow rupture velocity, low stress drops, and slow slip velocity explain why LFEs are depleted in high-frequency content relative to ordinary earthquakes and suggest that LFE sources represent areas capable of relatively higher slip speed in deep fault zones. Additionally, changes in rheology may not be required to explain both LFEs and slow slip; the same process that governs the slip speed during slow earthquakes may also limit the rupture velocity of LFEs.
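
    The slip-rate arithmetic quoted above (annual deep-SAF slip divided by the cumulative time the fault spends slipping in LFEs) can be reproduced with placeholder numbers. The annual slip and yearly event count below are illustrative assumptions chosen only to land near the ~0.24 mm/s order of magnitude; they are not the values used in the study.

```python
# Back-of-envelope version of the slip-rate estimate described above.
# Both input numbers are illustrative placeholders, not the study's values.
annual_slip_mm = 16.0   # assumed deep-SAF slip accommodated per year (placeholder)
lfe_per_year = 330      # assumed number of LFEs per year in one family (placeholder)
lfe_duration_s = 0.2    # typical LFE duration quoted in the abstract

cumulative_slipping_time_s = lfe_per_year * lfe_duration_s
slip_rate_mm_per_s = annual_slip_mm / cumulative_slipping_time_s
print(f"average slip rate during LFEs: ~{slip_rate_mm_per_s:.2f} mm/s")  # ~0.24 mm/s
```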

  11. Emergency surgical care delivery in post-earthquake Haiti: Partners in Health and Zanmi Lasante experience.

    PubMed

    McIntyre, Thomas; Hughes, Christopher D; Pauyo, Thierry; Sullivan, Stephen R; Rogers, Selwyn O; Raymonville, Maxi; Meara, John G

    2011-04-01

    The earthquake that struck Haiti on 12 January 2010 caused significant devastation to both the country and the existing healthcare infrastructure in both urban and rural areas. Most hospital and health care facilities in Port-au-Prince and the surrounding areas were significantly damaged or destroyed. Consequently, large groups of Haitians fled Port-au-Prince for rural areas to seek emergency medical and surgical care. In partnership with the Haitian Ministry of Health, Partners in Health (PIH) and Zanmi Lasante (ZL) have developed and maintained a network of regional and district hospitals in rural Haiti for over twenty-five years. This PIH/ZL system was ideally situated to accommodate the increased need for emergent surgical care in the immediate quake aftermath. The goal of the present study was to provide a cross-sectional assessment of surgical need and care delivery across PIH/ZL facilities after the earthquake in Haiti. We conducted a retrospective review of hospital case logs and operative records over the course of three weeks immediately following the earthquake. Roughly 3,000 patients were seen at PIH/ZL sites by a combination of Haitian and international surgical teams. During that period 513 emergency surgical cases were logged. Other than wound debridement, the most commonly performed procedure was fixation of long bone fractures, which constituted approximately one third of all surgical procedures. There was a significant demand for emergent surgical care after the earthquake in Haiti. The PIH/ZL hospital system played a critical role in addressing this acutely increased burden of surgical disease, and it allowed for large numbers of Haitians to receive needed surgical services. Our experiences reinforce that access to essential surgery is an essential pillar in public health.

  12. One research from turkey on groundwater- level changes related earthquake

    NASA Astrophysics Data System (ADS)

    Kirmizitas, H.; Göktepe, G.

    2003-04-01

    Groundwater levels are recorded by limnigraphs in drilled wells in order to determine groundwater potential accurately and reliably in hydrogeological studies in Turkey. The State Hydraulic Works (DSI) installed the limnigraphs mainly to estimate groundwater potential; to date, no well has been drilled specifically to obtain data on earthquake-related water level changes. The main purpose of these studies is to assess groundwater potential and to expose the hydrodynamic structure of an aquifer. In this study, abnormal oscillations, water level rises and water level drops were observed on the recorded groundwater level graphs. These observations showed that some earthquakes have affected water levels, at epicentral distances from the wells of up to 2000 km. The water level changes occur in groundwater-bearing layers that consist of grained materials such as alluvium or of consolidated rocks such as limestones. The largest water level change on the diagrams is about 1.48 m, recorded as an oscillation. The earthquake-related water level changes observed in this research fall into the following types of movement: (1) rise-drop oscillations at the same point; (2) water level drops, for certain periods or permanently, after earthquakes; (3) water level rises, for certain periods or permanently, after earthquakes. For example, during the Gölcük earthquake of magnitude 7.8 on 17 August 1999, artesian flow occurred in DSI well no. 49160 in Dernekkiri Village, Adapazari. Groundwater levels can also change readily because of atmospheric pressure, which is the first-ranking factor, as well as precipitation, irrigation or pumping. In order to relate groundwater level changes to an earthquake at any given time, such changes should be observed accurately, carefully and at the right time; first of all, the real cause of a water level change must be determined. From 1970 to 2001 many earthquakes occurred in Turkey

  13. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake, using the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. It is concluded that researchers need to pay more attention to children and adolescents, and that the government should pay more attention to these people and provide more economic support.

  14. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  15. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  16. Crowdsourced earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  17. Using Smartphones to Detect Earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2012-12-01

    We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation between sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record acceleration in real time. These records can be saved on the phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer while mounted on controlled shake tables for a variety of tests. The results show that the accelerometer in a smartphone can reproduce the characteristics of the shaking very well, even when the phone is left free on the shake table. The nature of these datasets is also quite different from that of traditional networks, because smartphones move around with their owners, so we must distinguish earthquake signals from those generated by daily use. In addition to the shake table tests that provided earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records; it achieves a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
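
    A toy version of the classification step might look like the sketch below. It is not the authors' network or feature set; the features, the synthetic 'earthquake' and 'human activity' windows, and the small scikit-learn MLP are all stand-ins used only to illustrate the idea of separating impulsive shaking from everyday motion.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def window_features(acc):
    """Crude summary features of a single-component acceleration window."""
    return np.array([acc.std(), np.abs(acc).max(), np.abs(np.diff(acc)).mean()])

# Synthetic stand-ins: 'quake' windows are impulsive and broadband,
# 'human' windows (walking, driving) are smoother and quasi-periodic.
quake = [window_features(rng.normal(0, 1.0, 500) * np.exp(-np.linspace(0, 5, 500)))
         for _ in range(200)]
human = [window_features(0.3 * np.sin(np.linspace(0, 40, 500)) + rng.normal(0, 0.05, 500))
         for _ in range(200)]

X = np.vstack(quake + human)
y = np.array([1] * 200 + [0] * 200)   # 1 = earthquake-like, 0 = human activity

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy on synthetic windows:", clf.score(X, y))
```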

  18. The Global Earthquake Model - Past, Present, Future

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Stein, Ross

    2014-05-01

    The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through consortium-driven global projects, open-source IT development and collaborations with more than 10 regions, leading experts are developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. The year 2013 saw the completion of ten global data sets or components addressing various aspects of earthquake hazard and risk, as well as two GEM-related but independently managed regional projects, SHARE and EMME. Notably, the International Seismological Centre (ISC) led the development of a new ISC-GEM global instrumental earthquake catalogue, which was made publicly available in early 2013. It has set a new standard for global earthquake catalogues and has found widespread acceptance and application in the global earthquake community. By the end of 2014, GEM's OpenQuake computational platform will provide the OpenQuake hazard/risk assessment software and integrate all GEM data and information products. The public release of OpenQuake is planned for the end of 2014 and will comprise the following datasets and models: • ISC-GEM Instrumental Earthquake Catalogue (released January 2013) • Global Earthquake History Catalogue [1000-1903] • Global Geodetic Strain Rate Database and Model • Global Active Fault Database • Tectonic Regionalisation Model • Global Exposure Database • Buildings and Population Database • Earthquake Consequences Database • Physical Vulnerabilities Database • Socio-Economic Vulnerability and Resilience Indicators • Seismic

  19. On the Diurnal Periodicity of Representative Earthquakes in Greece: Comparison of Data from Different Observation Systems

    NASA Astrophysics Data System (ADS)

    Desherevskii, A. V.; Sidorin, A. Ya.

    2017-12-01

    Due to the initiation of the Hellenic Unified Seismic Network (HUSN) in late 2007, the quality of observation had improved significantly by 2011; for example, the representative magnitude level decreased considerably and the number of annually recorded events increased. The new observational system greatly expanded the possibilities for studying regularities in seismicity. In view of this, the authors revisited their studies of the diurnal periodicity of representative earthquakes in Greece that had been revealed earlier in the earthquake catalog before 2011. We use 18 samples of earthquakes of different magnitudes taken from the catalog of Greek earthquakes from 2011 to June 2016, derive a series of earthquake numbers for each of them, and calculate its average diurnal course. To increase the reliability of the results, we compared the data for two regions. With a high degree of statistical significance, we find that no diurnal periodicity is present for strongly representative earthquakes. This finding differs from the estimates obtained earlier from an analysis of the catalog of earthquakes in the same area for 1995-2004 and 2005-2010, i.e., before the initiation of the Hellenic Unified Seismic Network. The new results are consistent with the hypothesis of noise discrimination (observational selection), which explains the diurnal variation of earthquakes by the different sensitivity of the seismic network in daytime and nighttime periods.
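
    The 'average diurnal course' used in this kind of analysis is essentially a binning of origin times by hour of day, normalized by the number of days in the sample. The sketch below illustrates that bookkeeping on a synthetic, periodicity-free catalogue; it is not the authors' statistical test.

```python
import numpy as np

def diurnal_course(event_hours, n_days):
    """Average number of events in each hour-of-day bin.

    event_hours -- origin-time hours (0-23) for the catalogue sample
    n_days      -- number of days spanned by the sample
    """
    counts = np.bincount(np.asarray(event_hours, dtype=int), minlength=24)
    return counts / n_days


# Synthetic catalogue with uniformly random origin times, i.e. no periodicity.
rng = np.random.default_rng(1)
hours = rng.integers(0, 24, size=5000)
print(diurnal_course(hours, n_days=365).round(2))
```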

  20. Tests of remote aftershock triggering by small mainshocks using Taiwan's earthquake catalog

    NASA Astrophysics Data System (ADS)

    Peng, W.; Toda, S.

    2014-12-01

    To understand earthquake interaction and forecast time-dependent seismic hazard, it is essential to evaluate which stress transfer, static or dynamic, plays the major role in triggering aftershocks and subsequent mainshocks. Felzer and Brodsky focused on small mainshocks (2≤M<3) and their aftershocks and argued that only dynamic stress change produces earthquake-to-earthquake triggering, whereas Richards-Dinger et al. (2010) claimed that those selected small mainshock-aftershock pairs reflect not earthquake-to-earthquake triggering but the simultaneous occurrence of independent aftershocks following a larger earthquake or during a significant swarm sequence. We test these hypotheses using Taiwan's earthquake catalog, taking advantage of the absence of any larger events and of the significant seismic swarms typically seen near active volcanoes. Using Felzer and Brodsky's method with their standard parameters, we found only 14 mainshock-aftershock pairs occurring within 20 km distance in Taiwan's catalog from 1994 to 2010. Although Taiwan's catalog has a similar number of earthquakes to California's, the number of pairs is about 10% of that in the California catalog, which may reflect the lack of large earthquakes and of significant seismic swarms in the catalog. To fully understand the properties of the Taiwan catalog, we loosened the screening parameters to obtain more pairs and found a linear aftershock density with a power-law decay of -1.12±0.38, very similar to that of Felzer and Brodsky. However, none of those mainshock-aftershock pairs was associated with an M7 rupture or M6 events. To find what mechanism controls the aftershock density triggered by small mainshocks in Taiwan, we randomized earthquake magnitudes and locations. We found that the density decay over short time periods is more like randomized behavior than mainshock-aftershock triggering. Moreover, 5 out of 6 pairs were found in a swarm-like temporal seismicity rate increase
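
    The power-law decay quoted above comes from fitting the linear (per-kilometre) aftershock density against mainshock-aftershock separation distance. The sketch below shows one simple way to do such a fit on synthetic distances; it is not the exact Felzer-and-Brodsky procedure, and the bin choices and synthetic sample are assumptions.

```python
import numpy as np

def decay_exponent(distances_km, bins):
    """Slope of a log-log fit of linear aftershock density versus distance."""
    counts, edges = np.histogram(distances_km, bins=bins)
    widths = np.diff(edges)
    centers = np.sqrt(edges[:-1] * edges[1:])     # geometric bin centers
    density = counts / widths                     # events per km
    keep = counts > 0
    slope, _ = np.polyfit(np.log10(centers[keep]), np.log10(density[keep]), 1)
    return slope


# Synthetic separation distances drawn from rho(r) ~ r**-1.1 on 0.2-20 km
# (inverse-CDF sampling), standing in for catalogue mainshock-aftershock pairs.
rng = np.random.default_rng(3)
r_min, r_max, p = 0.2, 20.0, -1.1
u = rng.uniform(size=2000)
r = (u * (r_max**(p + 1) - r_min**(p + 1)) + r_min**(p + 1)) ** (1 / (p + 1))
bins = np.logspace(np.log10(r_min), np.log10(r_max), 15)
print("fitted decay exponent:", round(decay_exponent(r, bins), 2))
```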

  1. The effect of earthquake on architecture geometry with non-parallel system irregularity configuration

    NASA Astrophysics Data System (ADS)

    Teddy, Livian; Hardiman, Gagoek; Nuroji; Tudjono, Sri

    2017-12-01

    Indonesia is an area prone to earthquakes that may cause casualties and damage to buildings. Fatalities and injuries are caused not so much by the earthquake itself as by building collapse. The collapse of a building results from its behaviour during the earthquake, which depends on many factors, such as the architectural design, the geometric configuration of structural elements in horizontal and vertical planes, the earthquake zone, the geographical location (distance to the earthquake center), soil type, material quality, and construction quality. One of the geometric configurations that may lead to the collapse of a building is the irregular configuration of a non-parallel system. In accordance with FEMA-451B, a non-parallel system irregularity exists if the vertical lateral-force-resisting elements are neither parallel nor symmetric with respect to the main orthogonal axes of the earthquake-resisting system. Such a configuration may lead to torsion, diagonal translation and local damage to buildings. This does not mean that a non-parallel irregular configuration must never appear in architectural design; however, the designer must know the consequences of earthquake behaviour for buildings with such a configuration. The objective of the present research is to identify earthquake behaviour in architectural geometry with an irregular configuration of a non-parallel system. The research was quantitative, with a simulation-experimental method. It consisted of 5 models, for which architectural and structural data were input and analysed using the software SAP2000 to determine performance, and ETABS 2015 to determine the eccentricity that occurred. The output of the software analysis was tabulated, graphed, compared and analysed against relevant theories. For strong earthquake zones, buildings that wholly form an irregular configuration of a non-parallel system should be avoided. If it is inevitable to design a

  2. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore effects of topographic loading and viscosity of middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating the future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze probability of earthquake migration between major faults in northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.
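
    The Coulomb stress bookkeeping invoked above is conventionally written as ΔCFS = Δτ + μ′Δσn, with Δσn positive for unclamping. The snippet below is only that textbook formula with a hypothetical receiver-fault example; it is not the 3-D viscoelastoplastic finite-element model used in the study.

```python
def coulomb_stress_change(delta_shear, delta_normal, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault (textbook form).

    delta_shear  -- shear stress change in the slip direction (Pa),
                    positive when it promotes slip
    delta_normal -- normal stress change (Pa), positive for unclamping
    mu_eff       -- assumed effective friction coefficient
    """
    return delta_shear + mu_eff * delta_normal


# Hypothetical receiver fault: 0.05 MPa of promoting shear stress and
# 0.02 MPa of clamping -> a net ~0.04 MPa Coulomb stress increase.
print(coulomb_stress_change(5e4, -2e4))
```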

  3. Complex earthquake rupture and local tsunamis

    USGS Publications Warehouse

    Geist, E.L.

    2002-01-01

    In contrast to far-field tsunami amplitudes that are fairly well predicted by the seismic moment of subduction zone earthquakes, there exists significant variation in the scaling of local tsunami amplitude with respect to seismic moment. From a global catalog of tsunami runup observations this variability is greatest for the most frequently occurring tsunamigenic subduction zone earthquakes in the magnitude range of 7 < Mw < 8.5. Variability in local tsunami runup scaling can be ascribed to tsunami source parameters that are independent of seismic moment: variations in the water depth in the source region, the combination of higher slip and lower shear modulus at shallow depth, and rupture complexity in the form of heterogeneous slip distribution patterns. The focus of this study is on the effect that rupture complexity has on the local tsunami wave field. A wide range of slip distribution patterns are generated using a stochastic, self-affine source model that is consistent with the falloff of far-field seismic displacement spectra at high frequencies. The synthetic slip distributions generated by the stochastic source model are discretized and the vertical displacement fields from point source elastic dislocation expressions are superimposed to compute the coseismic vertical displacement field. For shallow subduction zone earthquakes it is demonstrated that self-affine irregularities of the slip distribution result in significant variations in local tsunami amplitude. The effects of rupture complexity are less pronounced for earthquakes at greater depth or along faults with steep dip angles. For a test region along the Pacific coast of central Mexico, peak nearshore tsunami amplitude is calculated for a large number (N = 100) of synthetic slip distribution patterns, all with identical seismic moment (Mw = 8.1). Analysis of the results indicates that for earthquakes of a fixed location, geometry, and seismic moment, peak nearshore tsunami amplitude can vary by a
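
    A heterogeneous, self-affine slip distribution of the kind described above can be mimicked by spectral synthesis: impose a power-law falloff of spectral amplitude with wavenumber and randomize the phases. The 1-D sketch below is an illustration under those assumptions, not the 2-D stochastic source model used in the study.

```python
import numpy as np

def self_affine_slip(n=256, decay=2.0, seed=0):
    """1-D spectral-synthesis sketch of a heterogeneous slip profile.

    Spectral amplitudes fall off as k**-decay with random phases, one common
    way to mimic the high-wavenumber falloff of earthquake slip.
    """
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-decay)                      # power-law spectral decay
    phase = rng.uniform(0.0, 2.0 * np.pi, size=k.size)
    slip = np.fft.irfft(amp * np.exp(1j * phase), n)
    slip -= slip.min()                               # keep slip non-negative
    return slip / slip.mean()                        # normalize mean slip to 1


profile = self_affine_slip()
print("peak/mean slip ratio:", round(profile.max(), 2))
```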

  4. Shaking intensity from injection-induced versus tectonic earthquakes in the central-eastern United States

    USGS Publications Warehouse

    Hough, Susan E.

    2015-01-01

    Although instrumental recordings of earthquakes in the central and eastern United States (CEUS) remain sparse, the U.S. Geological Survey's “Did you feel it?” (DYFI) system now provides excellent characterization of shaking intensities caused by induced and tectonic earthquakes. Seventeen CEUS events between 2013 and 2015 are considered. It is shown that for 15 events, observed intensities at epicentral distances greater than ≈ 10 km are lower than expected given a published intensity-prediction equation for the region. Using simple published relations among intensity, magnitude, and stress drop, the results suggest that 15 of the 17 events have low stress drop. For those 15 events, intensities within ≈ 10-km epicentral distance are closer to predicted values, which can be explained as a consequence of relatively shallow source depths. The results suggest that those 15 events, most of which occurred in areas where induced earthquakes have occurred previously, were likely induced. Although moderate injection-induced earthquakes in the central and eastern United States will be felt widely because of low regional attenuation, the damage from shallow earthquakes induced by injection will be more localized to event epicenters than shaking from tectonic earthquakes, which tend to be somewhat deeper. Within approximately 10 km of the epicenter, intensities are generally commensurate with the levels predicted for the event magnitude.

  5. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  6. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    PubMed

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquake and the relationship of past earthquake experience and gender to risk perception. Participants (n= 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquake. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on and intervention program with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  7. Earthquakes in Alaska

    USGS Publications Warehouse

    Haeussler, Peter J.; Plafker, George

    1995-01-01

    Earthquake risk is high in much of the southern half of Alaska, but it is not the same everywhere. This map shows the overall geologic setting in Alaska that produces earthquakes. The Pacific plate (darker blue) is sliding northwestward past southeastern Alaska and then dives beneath the North American plate (light blue, green, and brown) in southern Alaska, the Alaska Peninsula, and the Aleutian Islands. Most earthquakes are produced where these two plates come into contact and slide past each other. Major earthquakes also occur throughout much of interior Alaska as a result of collision of a piece of crust with the southern margin.

  8. Depth dependence of earthquake frequency-magnitude distributions in California: Implications for rupture initiation

    USGS Publications Warehouse

    Mori, J.; Abercrombie, R.E.

    1997-01-01

    Statistics of earthquakes in California show linear frequency-magnitude relationships in the range of M2.0 to M5.5 for various data sets. Assuming Gutenberg-Richter distributions, there is a systematic decrease in b value with increasing depth of earthquakes. We find consistent results for various data sets from northern and southern California that both include and exclude the larger aftershock sequences. We suggest that at shallow depth (~0 to 6 km) conditions with more heterogeneous material properties and lower lithospheric stress prevail. Rupture initiations are more likely to stop before growing into large earthquakes, producing relatively more smaller earthquakes and consequently higher b values. These ideas help to explain the depth-dependent observations of foreshocks in the western United States. The higher occurrence rate of foreshocks preceding shallow earthquakes can be interpreted in terms of rupture initiations that are stopped before growing into the mainshock. At greater depth (9-15 km), any rupture initiation is more likely to continue growing into a larger event, so there are fewer foreshocks. If one assumes that frequency-magnitude statistics can be used to estimate probabilities of a small rupture initiation growing into a larger earthquake, then a small (M2) rupture initiation at 9 to 12 km depth is 18 times more likely to grow into a M5.5 or larger event, compared to the same small rupture initiation at 0 to 3 km. Copyright 1997 by the American Geophysical Union.
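
    The factor-of-18 contrast can be recovered from the Gutenberg-Richter relation itself: if the chance that an M2 initiation grows to M ≥ 5.5 scales as 10^(-b(5.5-2)), a modest decrease of b with depth is enough. The b values below are assumed for illustration only; the paper reports the depth trend, but these exact numbers are not from it.

```python
def growth_probability(b, m_small=2.0, m_large=5.5):
    """Relative (unnormalized) Gutenberg-Richter chance that an initiation of
    size m_small grows into an event of at least m_large."""
    return 10 ** (-b * (m_large - m_small))


# Assumed b values, chosen only to show how a modest depth decrease in b
# reproduces the quoted factor; they are not the paper's fitted values.
b_shallow, b_deep = 1.20, 0.84
ratio = growth_probability(b_deep) / growth_probability(b_shallow)
print(f"deep/shallow growth-probability ratio: {ratio:.1f}")  # roughly 18x
```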

  9. Research in seismology and earthquake engineering in Venezuela

    USGS Publications Warehouse

    Urbina, L.; Grases, J.

    1983-01-01

    After the July 29, 1967, damaging earthquake (with a moderate magnitude of 6.3) caused widespread damage to the northern coastal area of Venezuela and to the Caracas Valley, the Venezuelan Government decided to establish a Presidential Earthquake Commission. This commission undertook the task of coordinating the efforts to study the after-effects of the earthquake. The July 1967 earthquake claimed numerous lives and caused extensive damage to the capital of Venezuela. In 1968, the U.S. Geological Survey conducted a seismological field study in the northern coastal area and in the Caracas Valley of Venezuela; the objective was to study the areas that sustained severe, moderate, and no damage to structures. A report entitled Ground Amplification Studies in Earthquake Damage Areas: The Caracas Earthquake of 1967 documented, for the first time, short-period seismic-wave ground-motion amplifications in the Caracas Valley. Figure 1 shows the area of severe damage in the Los Palos Grandes suburb and its correlation with the depth of alluvium; the arabic numbers denote the ground amplification factor at each site in the area. The Venezuelan Government initiated many programs to study in detail the damage sustained and to investigate ongoing construction practices. These actions motivated professionals in the academic, private, and Government sectors to develop further capabilities and self-sufficiency in the fields of engineering and seismology. Funds were allocated to assist in training professionals and technicians and in developing new seismological stations and new programs at the national level in earthquake engineering and seismology. A brief description of the ongoing programs in Venezuela is given below. These programs are being performed by FUNVISIS and by other national organizations listed at the end of this article.

  10. Earthquake prognosis:cause for failure and ways for the problem solution

    NASA Astrophysics Data System (ADS)

    Kondratiev, O.

    2003-04-01

    Despite the more than 50-year history of development of earthquake prognosis methods, the problem has yet to be resolved. This casts doubt on the rightness of the chosen approach, the retrospective search for diverse earthquake precursors. It is usual to speak of long-term, middle-term and short-term earthquake prognosis. All of these have a probabilistic character, and it would be more correct to consider them as forms of seismic hazard prognosis. In distinction to them, the problem of operative prognosis is discussed in this report. Operative prognosis should provide a timely seismic alarm signal giving the place, time and power of an earthquake, so that the necessary measures can be taken for maximal mitigation of the catastrophic consequences of the event. To do this it is necessary to predict the earthquake location to within a few tens of kilometres, the time of its occurrence to within a few days, and its power to within magnitude units. If the problem is formulated in this way, it cannot in principle be resolved within the framework of the concept of using indirect earthquake precursors. It is necessary to pass from the concept of a passive observatory network to the concept of an object-oriented search for potential source zones and the direct acquisition of information on changes of medium parameters within these zones during earthquake preparation and development. Formulated in this way, the problem becomes an integrated task for planetary and prospecting geophysics. To detect the source zones it is possible to use the method of converted waves of earthquakes; for monitoring, seismic reflection and the common-point method. Deployment of these and possibly other geophysical methods should be provided for by organising a special integrated geophysical expedition for rapid response to strong earthquakes that have occurred and conducting purposeful investigation

  11. Advanced Simulation of Coupled Earthquake and Tsunami Events

    NASA Astrophysics Data System (ADS)

    Behrens, Joern

    2013-04-01

    Tsunami-Earthquakes represent natural catastrophes threatening lives and well-being of societies in a solitary and unexpected extreme event as tragically demonstrated in Sumatra (2004), Samoa (2009), Chile (2010), or Japan (2011). Both phenomena are consequences of the complex system of interactions of tectonic stress, fracture mechanics, rock friction, rupture dynamics, fault geometry, ocean bathymetry, and coastline geometry. The ASCETE project forms an interdisciplinary research consortium that couples the most advanced simulation technologies for earthquake rupture dynamics and tsunami propagation to understand the fundamental conditions of tsunami generation. We report on the latest research results in physics-based dynamic rupture and tsunami wave propagation simulation, using unstructured and adaptive meshes with continuous and discontinuous Galerkin discretization approaches. Coupling both simulation tools - the physics-based dynamic rupture simulation and the hydrodynamic tsunami wave propagation - will give us the possibility to conduct highly realistic studies of the interaction of rupture dynamics and tsunami impact characteristics.

  12. Initiatives to Reduce Earthquake Risk of Developing Countries

    NASA Astrophysics Data System (ADS)

    Tucker, B. E.

    2008-12-01

    The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. One would oversee the design and construction of

  13. The 2014 Weld County, Colorado, Earthquakes: A developing case of induced seismicity?

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Sheehan, A. F.; Weingarten, M.; Nakai, J.

    2014-12-01

    On June 1st 2014 (UTC), a M 3.2 earthquake occurred in Weld County, Colorado. Weld County is largely aseismic and this event was unexpected; there were no events in the ANSS Comprehensive Catalogue within 50 km of the earthquake. Weld County is the center of active oil and gas development, including many high-volume Class II wastewater injection wells, and injection wells have been linked to an increasing number of earthquakes throughout the US in recent years. Due to the lack of background seismicity in the area and the proximity of the earthquake to several injection wells, the University of Colorado requested seismometers from IRIS PASSCAL's Rapid Array Mobilization Program in order to study any further seismicity. Seismic stations were deployed within 3 days of the June 1st event. We report on our ongoing findings from this deployment. To date, we have located 89 earthquakes and have detected over 600; these numbers continue to grow as we collect and analyze further data. Earthquake magnitudes remain small, with only three earthquakes above M 2.0 recorded by our network, the largest of which was an M 2.6 earthquake on June 23rd 2014. The earthquakes locate in a small cluster (~2 km radius) centered near a high-volume injection well. This well has operated at injection rates greater than 300,000 barrels/month since August 2013 and injects at a depth near the sediment-basement contact. Prior to our local deployment, the closest seismic station to the June 1st event was > 100 km away, and therefore the evolution of seismicity prior to the June 1st earthquake is poorly constrained. In order to better understand the temporal evolution of seismicity, we apply match-filtering to data from these distant stations and find the earliest matching event on November 11th 2013. Due to the strong spatial and temporal correlation between these events and injection operations, we find it likely that these earthquakes are induced. In response to the ongoing seismicity near the well, the Colorado Oil
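
    Match-filtering of the kind mentioned above slides a template waveform through continuous data and flags windows whose normalized correlation exceeds a threshold. The sketch below is a single-channel toy version on synthetic data; real detections use multiple stations and components and noise-based thresholds.

```python
import numpy as np

def matched_filter(continuous, template, threshold=0.8):
    """Return (index, correlation) pairs where the normalized correlation of
    'template' against 'continuous' exceeds 'threshold'."""
    nt = len(template)
    t = (template - template.mean()) / template.std()
    detections = []
    for i in range(len(continuous) - nt + 1):
        w = continuous[i:i + nt]
        if w.std() == 0:
            continue
        cc = np.dot(t, (w - w.mean()) / w.std()) / nt
        if cc >= threshold:
            detections.append((i, cc))
    return detections


# Synthetic test: bury one copy of a decaying sinusoidal 'template' in noise.
rng = np.random.default_rng(2)
template = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.exp(-np.linspace(0, 4, 200))
data = rng.normal(0, 0.1, 5000)
data[3000:3200] += template
print(matched_filter(data, template)[:3])   # detections cluster near sample 3000
```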

  14. Comparative study of earthquake-related and non-earthquake-related head traumas using multidetector computed tomography

    PubMed Central

    Chu, Zhi-gang; Yang, Zhi-gang; Dong, Zhi-hui; Chen, Tian-wu; Zhu, Zhi-yu; Shao, Heng

    2011-01-01

    OBJECTIVE: The features of earthquake-related head injuries may be different from those of injuries obtained in daily life because of differences in circumstances. We aim to compare the features of head traumas caused by the Sichuan earthquake with those of other common head traumas using multidetector computed tomography. METHODS: In total, 221 patients with earthquake-related head traumas (the earthquake group) and 221 patients with other common head traumas (the non-earthquake group) were enrolled in our study, and their computed tomographic findings were compared. We focused on differences in fractures and intracranial injuries and on the relationships between extracranial and intracranial injuries. RESULTS: More earthquake-related cases had only extracranial soft tissue injuries (50.7% vs. 26.2%, RR = 1.9), and fewer cases had intracranial injuries (17.2% vs. 50.7%, RR = 0.3) compared with the non-earthquake group. For patients with fractures and intracranial injuries, there were fewer cases with craniocerebral injuries in the earthquake group (60.6% vs. 77.9%, RR = 0.8), and the earthquake-injured patients had fewer fractures and intracranial injuries overall (1.5±0.9 vs. 2.5±1.8; 1.3±0.5 vs. 2.1±1.1). Compared with the non-earthquake group, the incidences of soft tissue injuries and cranial fractures combined with intracranial injuries in the earthquake group were significantly lower (9.8% vs. 43.7%, RR = 0.2; 35.1% vs. 82.2%, RR = 0.4). CONCLUSION: As depicted with computed tomography, earthquake-related head traumas in survivors were less severe, and isolated extracranial injuries were more common in earthquake-related head traumas than in non-earthquake-related injuries, which may have been the result of different injury causes, mechanisms and settings. PMID:22012045
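
    The relative risks (RR) quoted above are ratios of the incidence of a finding in the earthquake group to its incidence in the non-earthquake group. A minimal sketch of that calculation (the counts below are back-of-the-envelope numbers implied by the reported percentages, not the study's raw data):

    ```python
    def relative_risk(exposed_cases, exposed_total, control_cases, control_total):
        """Risk ratio: incidence in the exposed group divided by incidence in the control group."""
        return (exposed_cases / exposed_total) / (control_cases / control_total)

    # Roughly 112/221 earthquake patients vs. 58/221 controls with isolated extracranial
    # soft-tissue injuries reproduces the reported RR of about 1.9.
    print(round(relative_risk(112, 221, 58, 221), 2))
    ```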

  15. Landslides triggered by the Minxian-Zhangxian, China, Mw 5.9 earthquake of 22 July 2013

    NASA Astrophysics Data System (ADS)

    Xu, Chong; Xu, Xiwei; Shyu, J. Bruce H.

    2014-05-01

    On July 22, 2013, an earthquake of Ms 6.6 occurred at the junction area of Minxian and Zhangxian counties, Gansu Province, China. This earthquake triggered many landslides of various types, dominated by small-scale soil falls, slides, and topples on loess scarps. There were also some deep-seated landslides, large-scale soil avalanches, and fissure-developing slopes. In this paper, an inventory of landslides triggered by this event is prepared based on field investigations and visual interpretation of high-resolution satellite images. The spatial distribution of the landslides is then analyzed. The inventory indicates that at least 2,330 landslides were triggered by the earthquake. Correlation statistics of the landslides with topographic, geologic, and earthquake factors are computed on a GIS platform. The results show that the largest number of landslides and the highest landslide density occur in the 2,400-2,600 m range of absolute elevation and the 200-300 m range of relative elevation, respectively. The landslide density does not always increase with slope gradient as previously suggested. The slopes most prone to landslides face S, SW, W, and NW. Concave slopes register a higher landslide density and a larger number of landslides than convex slopes. The largest number of landslides occurs at mid-slope topographic positions, whereas the highest landslide density corresponds to valleys and lower slopes. The underlying bedrock of conglomerate and sandstone of the Lower Paleogene (Eb) registers both the largest number of landslides and the highest landslide density value. There is no clear relationship between PGA and the co-seismic landslides. Correlations of landslide number and landslide density with perpendicular- and along-strike distance from the epicenter show that landslide abundance clearly intensifies toward the epicenter. The spatial pattern of the co-seismic landslides is strongly controlled by a branch of the Lintan-Dangchang fault
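
    As an illustration of the kind of binned statistics used above, landslide counts and densities per elevation class can be tabulated as in the sketch below (the column names, bin edges, and class areas are hypothetical, not the study's values):

    ```python
    import pandas as pd

    # Hypothetical inputs: one row per landslide with the absolute elevation (m) of its
    # source area, and the total map area (km^2) falling in each elevation class.
    landslides = pd.DataFrame({"elevation_m": [2450, 2510, 2580, 2720, 2310, 2490]})
    class_area_km2 = {"2200-2400": 310.0, "2400-2600": 280.0, "2600-2800": 150.0}

    bins = [2200, 2400, 2600, 2800]
    labels = list(class_area_km2)
    landslides["elev_class"] = pd.cut(landslides["elevation_m"], bins=bins, labels=labels)

    counts = landslides["elev_class"].value_counts().sort_index()
    counts.index = counts.index.astype(str)
    density = counts / pd.Series(class_area_km2)  # landslides per km^2 in each class
    print(pd.DataFrame({"count": counts, "density_per_km2": density}))
    ```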

  16. Earthquakes, May-June 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage. 

  17. New information on earthquake history of the Aksehir-Afyon Graben System, Turkey, since the second half of 18th century

    NASA Astrophysics Data System (ADS)

    Ozer, N.

    2006-12-01

    Research aimed at enriching the body of available documentary sources on earthquakes plays an important role in seismology. To this end, this paper documents the history of prominent earthquakes associated with the NW-SE trending Sultandag-Aksehir Fault and the Aksehir-Afyon graben system in Western-Central Anatolia since 1766. This work also combines the earthquake data for both the historical and instrumental periods, previously listed in various catalogues and resources, for the studied area. Documents from the Ottoman archives and libraries as well as Ottoman and Turkish newspapers were scrutinized, and eight previously unreported earthquakes in the latter half of the nineteenth century and four new earthquakes in the period 1900-1931 were revealed. For the period from 1766 to 1931, the total number of known earthquakes for the area under investigation increased from eighteen to thirty thanks to the document search. Furthermore, the existing information on eleven previously reported earthquakes is updated for the period from 1862 to 1946. Earthquakes from 1946 to 1964 are compiled from the catalogues for data completeness.

  18. Earthquake forecasting studies using radon time series data in Taiwan

    NASA Astrophysics Data System (ADS)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of seismogeochemical data interpreted as geochemical precursory signals of impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous radon time series for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtering of these environmental parameters in order to create a real-time database that supports our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open source programming language, to carry out statistical computation on the data. To integrate the data with our working procedure, we use the popular open source web application stack AMP (Apache, MySQL, and PHP) to create a website that effectively presents and helps us manage the real-time database.
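
    A minimal sketch of the kind of screening such a system might perform, flagging radon readings that deviate from a rolling baseline (the column names, window length, and threshold are assumptions, not the study's actual processing chain):

    ```python
    import pandas as pd

    def flag_radon_anomalies(radon: pd.Series, window: int = 24 * 7, k: float = 2.5) -> pd.Series:
        """Mark samples that exceed a rolling mean by more than k rolling standard deviations."""
        baseline = radon.rolling(window, min_periods=window // 2).mean()
        spread = radon.rolling(window, min_periods=window // 2).std()
        return (radon - baseline).abs() > k * spread

    # Usage with an hourly radon series indexed by timestamp (illustrative):
    # radon = pd.read_csv("station_01.csv", index_col="time", parse_dates=True)["radon"]
    # anomalous_hours = radon[flag_radon_anomalies(radon)]
    ```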

  19. Polarized Politics and Policy Consequences

    DTIC Science & Technology

    2007-01-01

    consequences with regard to policymaking process and outcomes. Alongside the study of consequences, more research is needed regarding institutional reforms ... a research organization providing objective analysis and effective solutions that address the challenges facing the public and private sectors. Author: Diana Epstein

  20. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-member Earthquake Investigation Commission, drawing on some 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting and detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.

  1. Extreme magnitude earthquakes and their economical impact: The Mexico City case

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Mario, C.

    2005-12-01

    The consequences (estimated by the human and economic losses) of the recent worldwide occurrence of earthquakes of extreme magnitude for the region under consideration, such as the 19 September 1985 Mexico earthquake (Richter magnitude Ms 8.1, moment magnitude Mw 8.01) or the 26 December 2004 Indonesia earthquake (Ms 9.4, Mw 9.3), stress the importance of performing seismic hazard analyses that specifically incorporate this possibility. Here we present and apply a methodology, based on plausible extreme seismic scenarios and the computation of their associated synthetic accelerograms, to estimate the seismic hazard on Mexico City (MC) stiff and compressible surficial soils. The uncertainties about the characteristics of the potential finite seismic sources, as well as those related to the dynamic properties of MC compressible soils, are taken into account. The economic consequences (i.e. the seismic risk = seismic hazard x economic cost) implicit in the seismic coefficients proposed in MC seismic codes before (1976) and after the 1985 earthquake (2004) are analyzed. Based on the latter and on an acceptable-risk criterion, a maximum seismic coefficient (MSC) of 1.4g (g = 9.81 m/s2) of the elastic acceleration design spectra (5 per cent damping), which has a probability of exceedance of 2.4 x 10^-4, seems appropriate for analyzing the seismic behavior of infrastructure located on MC compressible soils, if extreme Mw 8.5 subduction thrust earthquakes (similar to the one that occurred on 19 September 1985, with an observed equivalent MSC of 1g) occur in the next 50 years.
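
    For readers unfamiliar with exceedance probabilities, a minimal sketch of how an annual exceedance rate translates into a probability of exceedance over a design life, assuming Poissonian occurrence (the rate value below is illustrative, not a number taken from the paper):

    ```python
    import math

    def prob_exceedance(annual_rate: float, years: float) -> float:
        """Probability of at least one exceedance in `years`, assuming Poissonian occurrence."""
        return 1.0 - math.exp(-annual_rate * years)

    # Illustrative: an annual exceedance rate of 1e-4 over a 50-year design life.
    print(prob_exceedance(1e-4, 50.0))  # ~0.005
    ```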

  2. Site Response for Micro-Zonation from Small Earthquakes

    NASA Astrophysics Data System (ADS)

    Gospe, T. B.; Hutchings, L.; Liou, I. Y. W.; Jarpe, S.

    2017-12-01

    H/V spectral ratios of noise do not provide accurate site response estimates either. Vs30 provides only a single amplification number and does not account for the variable three-dimensional structure beneath sites. We conclude that absolute site response obtained directly from earthquakes is the best, and possibly the only, way to get accurate site response estimates.
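
    A minimal sketch of the classical spectral-ratio approach to site response from small earthquakes, dividing the smoothed amplitude spectrum at a soil site by that at a nearby reference rock site for the same event (the function, array names, and sampling interval are generic placeholders, not the authors' processing code):

    ```python
    import numpy as np

    def spectral_ratio(site_trace, reference_trace, dt, smooth=11):
        """Site/reference amplitude-spectrum ratio for one earthquake recording pair."""
        n = min(len(site_trace), len(reference_trace))
        freqs = np.fft.rfftfreq(n, d=dt)
        site_amp = np.abs(np.fft.rfft(site_trace[:n]))
        ref_amp = np.abs(np.fft.rfft(reference_trace[:n]))
        kernel = np.ones(smooth) / smooth          # simple moving-average smoothing
        site_amp = np.convolve(site_amp, kernel, mode="same")
        ref_amp = np.convolve(ref_amp, kernel, mode="same")
        return freqs, site_amp / ref_amp

    # freqs, amplification = spectral_ratio(soil_record, rock_record, dt=0.01)
    ```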

  3. The Loma Prieta, California, earthquake of October 17, 1989 - Public response: Chapter B in The Loma Prieta, California, earthquake of October 17, 1989: Societal Response (Professional Paper 1553)

    USGS Publications Warehouse

    Bolton, Patricia A.

    1993-01-01

    Major earthquakes provide seismologists and engineers an opportunity to examine the performance of the Earth and the man-made structures in response to the forces of the quake. So, too, do they provide social scientists an opportunity to delve into human responses evoked by the ground shaking and its physical consequences. The findings from such research can serve to guide the development and application of programs and practices designed to reduce death, injury, property losses, and social disruption in subsequent earthquakes. This chapter contains findings from studies focused mainly on public response to the Loma Prieta earthquake; that is, on the behavior and perceptions of the general population rather than on the activities of specific organizations or on the impact on procedures or policies. A major feature of several of these studies is that the information was collected from the population throughout the Bay area, not just from persons in the most badly damaged communities or who had suffered the greatest losses. This wide range serves to provide comparisons of behavior for those most directly affected by the earthquake with others who were less directly affected by it but still had to consider it very “close to home.”

  4. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    NASA Astrophysics Data System (ADS)

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This is closely related to the question of whether the initial rupture of an earthquake carries information about its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS). It calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori, 2003). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not suggest that the earthquake rupture is deterministic. This is because τpmax does not always have a direct relation to the physical quantities of an earthquake.
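
    A minimal sketch of a recursive predominant-period estimator in the spirit of Nakamura (1988) and Allen and Kanamori (2003), computed from a single vertical-component trace (the smoothing constant and the usage line are assumptions, not values from this study):

    ```python
    import numpy as np

    def predominant_period(velocity: np.ndarray, dt: float, alpha: float = 0.99) -> np.ndarray:
        """Recursive tau_p estimate: tau_p_i = 2*pi*sqrt(X_i / D_i), where X and D are
        exponentially smoothed squares of the velocity trace and of its time derivative."""
        deriv = np.gradient(velocity, dt)
        x = d = 0.0
        tau_p = np.zeros(len(velocity))
        for i, (v, dv) in enumerate(zip(velocity, deriv)):
            x = alpha * x + v * v
            d = alpha * d + dv * dv
            tau_p[i] = 2.0 * np.pi * np.sqrt(x / d) if d > 0 else 0.0
        return tau_p

    # tau_p_max over a window TW after the P arrival at sample p_idx (illustrative):
    # tau_p_max = predominant_period(trace, dt)[p_idx:p_idx + int(TW / dt)].max()
    ```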

  5. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand panel provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand panel provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.

  6. [The number of deaths by suicide after the Great East Japan Earthquake based on demographic statistics in the coastal and non-coastal areas of Iwate, Miyagi, and Fukushima prefectures].

    PubMed

    Masaki, Naoko; Hashimoto, Shuji; Kawado, Miyuki; Ojima, Toshiyuki; Takeshima, Tadashi; Matsubara, Miyuki; Mitoku, Kazuko; Ogata, Yukiko

    2018-01-01

    Objective The number of deaths by suicide after the Great East Japan Earthquake was surveyed based on demographic statistics. In particular, this study examined whether or not there were excessive deaths due to suicide (excluding people who were injured in the earthquake) after the Great East Japan Earthquake disaster. This examination surveyed municipalities in coastal and non-coastal areas of Iwate, Miyagi, and Fukushima prefectures (referred to below as the "three prefectures"). Methods The demographic statistics questionnaire survey information supplied by Article 33 of the Statistics Act (Ministry of Health, Labour and Welfare's published statistics Vol. 0925 No. 4, September 25th, 2014) was used as the basic data, with particular reference to the information on deaths from January 1st, 2010 to March 31st, 2013. The information obtained included the date of death, the municipality where the address of the deceased was registered, the gender of the deceased, age at the time of death, and cause of death codes (International Classification of Diseases, 10th revision: ICD-10). Additionally, information was gathered about the population based on the resident register from 2009 to 2013 and the 2010 National Census; the number of deaths by suicide was then totalled by period and area. The areas were classified as municipalities within the three prefectures and those located elsewhere, using the municipality where the address of the deceased was registered. Results The SMR (standardized mortality ratio) for suicides did not show a tendency to increase for coastal or non-coastal areas throughout the two-year period after the earthquake disaster (from March 2011 to February 2013). The SMR for the three prefectures 0-1 years after the disaster compared with the year before the disaster was 0.92, and for 1-2 years after the disaster it was 0.93. Both these values were significantly low. Looking at both the non-coastal and coastal areas from each of the three prefectures, the SMR for suicides

  7. Are Earthquakes Predictable? A Study on Magnitude Correlations in Earthquake Catalog and Experimental Data

    NASA Astrophysics Data System (ADS)

    Stavrianaki, K.; Ross, G.; Sammonds, P. R.

    2015-12-01

    The clustering of earthquakes in time and space is widely accepted; however, the existence of correlations in earthquake magnitudes is more questionable. In standard models of seismic activity, it is usually assumed that magnitudes are independent and therefore in principle unpredictable. Our work seeks to test this assumption by analysing magnitude correlation between earthquakes and their aftershocks. To separate mainshocks from aftershocks, we perform stochastic declustering based on the widely used Epidemic Type Aftershock Sequence (ETAS) model, which allows us to compare the average magnitudes of aftershock sequences to those of their mainshocks. The results of the earthquake magnitude correlations were compared with acoustic emissions (AE) from laboratory analog experiments, as fracturing generates both AE at the laboratory scale and earthquakes at the crustal scale. Constant stress and constant strain rate experiments were done on Darley Dale sandstone under confining pressure to simulate depth of burial. Microcracking activity inside the rock volume was analyzed by the AE technique as a proxy for earthquakes. Applying the ETAS model to experimental data allowed us to validate our results and provide for the first time a holistic view of earthquake magnitude correlations. Additionally, we investigate the relationship between the conditional intensity estimates of the ETAS model and the earthquake magnitudes. A positive relation would suggest the existence of magnitude correlations. The aim of this study is to identify any dependence between the magnitudes of aftershock earthquakes and the earthquakes that trigger them.
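
    For reference, a minimal sketch of the temporal ETAS conditional intensity that underlies the stochastic declustering described above (the parameter values here are placeholders chosen only to make the example run, not estimates from the study):

    ```python
    import numpy as np

    def etas_intensity(t, event_times, event_mags, mu=0.2, K=0.05, alpha=1.2, c=0.01, p=1.1, m0=2.0):
        """Temporal ETAS rate: lambda(t) = mu + sum_i K * exp(alpha*(M_i - m0)) / (t - t_i + c)**p,
        summed over events that occurred before time t (times in days)."""
        event_times = np.asarray(event_times, dtype=float)
        event_mags = np.asarray(event_mags, dtype=float)
        past = event_times < t
        contributions = K * np.exp(alpha * (event_mags[past] - m0)) / (t - event_times[past] + c) ** p
        return mu + contributions.sum()

    # Example: rate at day 10 given two earlier events of M 4.5 (day 2) and M 3.2 (day 9).
    print(etas_intensity(10.0, [2.0, 9.0], [4.5, 3.2]))
    ```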

  8. Putting down roots in earthquake country-Your handbook for earthquakes in the Central United States

    USGS Publications Warehouse

    Contributors: Dart, Richard; McCarthy, Jill; McCallister, Natasha; Williams, Robert A.

    2011-01-01

    This handbook provides information to residents of the Central United States about the threat of earthquakes in that area, particularly along the New Madrid seismic zone, and explains how to prepare for, survive, and recover from such events. It explains the need for concern about earthquakes for those residents and describes what one can expect during and after an earthquake. Much is known about the threat of earthquakes in the Central United States, including where they are likely to occur and what can be done to reduce losses from future earthquakes, but not enough has been done to prepare for future earthquakes. The handbook describes such preparations that can be taken by individual residents before an earthquake to be safe and protect property.

  9. Important Earthquake Engineering Resources

    Science.gov Websites

    Pacific Earthquake Engineering Research Center (PEER) page listing important earthquake engineering resources, including the American Concrete Institute, the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS), and the Consortium of Universities for Research in Earthquake Engineering.

  10. Repeating Earthquakes Following an Mw 4.4 Earthquake Near Luther, Oklahoma

    NASA Astrophysics Data System (ADS)

    Clements, T.; Keranen, K. M.; Savage, H. M.

    2015-12-01

    An Mw 4.4 earthquake on April 16, 2013 near Luther, OK was one of the earliest M4+ earthquakes in central Oklahoma, following the Prague sequence in 2011. A network of four local broadband seismometers deployed within a day of the Mw 4.4 event, along with six Oklahoma netquake stations, recorded more than 500 aftershocks in the two weeks following the Luther earthquake. Here we use HypoDD (Waldhauser & Ellsworth, 2000) and waveform cross-correlation to obtain precise aftershock locations. The location uncertainty, calculated using the SVD method in HypoDD, is ~15 m horizontally and ~35 m vertically. The earthquakes define a near-vertical, NE-SW striking fault plane. Events occur at depths from 2 km to 3.5 km within the granitic basement, with a small fraction of events shallower, near the sediment-basement interface. Earthquakes occur within a zone of ~200 meters thickness on either side of the best-fitting fault surface. We use an equivalency class algorithm to identify clusters of repeating events, defined as event pairs with median three-component correlation > 0.97 across common stations (Aster & Scott, 1993). Repeating events occur as doublets of only two events in over 50% of cases; overall, 41% of the earthquakes recorded occur as repeating events. The recurrence intervals for the repeating events range from minutes to days, with common recurrence intervals of less than two minutes. While clusters occur in tight dimensions, commonly of 80 m x 200 m, aftershocks occur in 3 distinct ~2 km x 2 km patches along the fault. Our analysis suggests that with rapidly deployed local arrays, the plethora of ~Mw 4 earthquakes occurring in Oklahoma and southern Kansas can be used to investigate the earthquake rupture process and the role of damage zones.
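
    A minimal sketch of equivalence-class clustering of repeating events as described above: event pairs whose median three-component correlation exceeds a threshold are linked, and connected components form the clusters (the correlation matrix here is a placeholder; computing it from waveforms is a separate step):

    ```python
    import numpy as np

    def repeating_event_clusters(median_cc: np.ndarray, threshold: float = 0.97):
        """Group events into equivalence classes: events i and j join the same cluster
        whenever median_cc[i, j] > threshold, directly or through intermediate events."""
        n = median_cc.shape[0]
        parent = list(range(n))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path halving
                i = parent[i]
            return i

        for i in range(n):
            for j in range(i + 1, n):
                if median_cc[i, j] > threshold:
                    parent[find(i)] = find(j)

        clusters = {}
        for i in range(n):
            clusters.setdefault(find(i), []).append(i)
        return [members for members in clusters.values() if len(members) > 1]

    # Example with a 4-event placeholder correlation matrix:
    cc = np.array([[1.0, 0.98, 0.50, 0.40],
                   [0.98, 1.0, 0.55, 0.45],
                   [0.50, 0.55, 1.0, 0.99],
                   [0.40, 0.45, 0.99, 1.0]])
    print(repeating_event_clusters(cc))  # [[0, 1], [2, 3]]
    ```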

  11. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  12. Coping with the challenges of early disaster response: 24 years of field hospital experience after earthquakes.

    PubMed

    Bar-On, Elhanan; Abargel, Avi; Peleg, Kobi; Kreiss, Yitshak

    2013-10-01

    To propose strategies and recommendations for future planning and deployment of field hospitals after earthquakes by comparing the experience of 4 field hospitals deployed by The Israel Defense Forces (IDF) Medical Corps in Armenia, Turkey, India and Haiti. Quantitative data regarding the earthquakes were collected from published sources; data regarding hospital activity were collected from IDF records; and qualitative information was obtained from structured interviews with key figures involved in the missions. The hospitals started operating between 89 and 262 hours after the earthquakes. Their sizes ranged from 25 to 72 beds, and their personnel numbered between 34 and 100. The number of patients treated varied from 1111 to 2400. The proportion of earthquake-related diagnoses ranged from 28% to 67% (P < .001), with hospitalization rates between 3% and 66% (P < .001) and surgical rates from 1% to 24% (P < .001). In spite of characteristic scenarios and injury patterns after earthquakes, patient caseload and treatment requirements varied widely. The variables affecting the patient profile most significantly were time until deployment, total number of injured, availability of adjacent medical facilities, and possibility of evacuation from the disaster area. When deploying a field hospital in the early phase after an earthquake, a wide variability in patient caseload should be anticipated. Customization is difficult due to the paucity of information. Therefore, early deployment necessitates full logistic self-sufficiency and operational versatility. Also, collaboration with local and international medical teams can greatly enhance treatment capabilities.

  13. Using earthquake clusters to identify fracture zones at Puna geothermal field, Hawaii

    NASA Astrophysics Data System (ADS)

    Lucas, A.; Shalev, E.; Malin, P.; Kenedi, C. L.

    2010-12-01

    The actively producing Puna geothermal system (PGS) is located on the Kilauea East Rift Zone (ERZ), which extends out from the active Kilauea volcano on Hawaii. In the Puna area the rift trend is identified as NE-SW from surface expressions of normal faulting with a corresponding strike; at PGS the surface expression offsets in a left step, but no rift-perpendicular faulting is observed. An eight-station borehole seismic network has been installed in the area of the geothermal system. Since June 2006, a total of 6162 earthquakes have been located close to or inside the geothermal system. The spread of earthquake locations follows the rift trend, but down rift to the NE of PGS almost no earthquakes are observed. Most earthquakes located within the PGS occur at 2-3 km depth. Up rift to the SW of PGS the number of events decreases and the depth range increases to 3-4 km. All initial locations used Hypoinverse71 and showed no trends other than the dominant rift-parallel trend. Double-difference relocation of all earthquakes, using both catalog and cross-correlation differential times, identified one large cluster but could not conclusively identify trends within the cluster. A large number of earthquake waveforms showed identifiable shear wave splitting. For five stations out of the six where shear wave splitting was observed, the dominant polarization direction was rift-parallel. Two of the five stations also showed a smaller rift-perpendicular signal. The sixth station (located close to the area of the rift offset) displayed a N-S polarization, approximately halfway between rift-parallel and rift-perpendicular. The shear wave splitting time delays indicate that fracture density is higher at the PGS compared to the surrounding ERZ. Correlation coefficient clustering with independent P and S wave windows was used to identify clusters based on similar earthquake waveforms. In total, 40 localized clusters containing ten or more events were identified. The largest cluster was located in the

  14. ViscoSim Earthquake Simulator

    USGS Publications Warehouse

    Pollitz, Fred

    2012-01-01

    Synthetic seismicity simulations have been explored by the Southern California Earthquake Center (SCEC) Earthquake Simulators Group in order to guide long-term forecasting efforts related to the Unified California Earthquake Rupture Forecast (Tullis et al., 2012a). In this study I describe the viscoelastic earthquake simulator (ViscoSim) of Pollitz (2009). Recapitulating to a large extent material previously presented by Pollitz (2009, 2011), I describe its implementation of synthetic ruptures and how it differs from other simulators being used by the group.

  15. The predictors of earthquake preparedness in Tehran households

    PubMed Central

    Ranjbar, Maryam; Soleimani, Ali Akbar; Shahboulaghi, Farahnaz Mohammadi; Paton, Douglas; Noroozi, Mehdi

    2018-01-01

    Background The high risk of an earthquake and its harmful consequences, together with the limited success of policies for preparing the community for mitigation, suggest that social factors should be given more consideration in this regard. Social trust is an influencing factor that can have a significant impact on people's behavior. Objective To determine the relationship between influencing factors and the earthquake preparedness of Tehran households. Methods This was a cross-sectional study with 369 participants (February to April 2017) recruited through stratified random sampling from selected urban districts of Tehran. The Persian version of an 'Intention to be prepared' measurement tool and a standard checklist of earthquake preparedness behaviors were used. The tool was evaluated for internal consistency and test-retest reliability in a pilot study (Cronbach's α = 0.94 and intraclass correlation coefficient = 0.92). Results Multivariate linear regression analysis showed that social trust is the most important predictor of preparedness in Tehran (R2 = 0.109, β = 0.187, p < 0.001 for preparedness behavior; R2 = 0.117, β = 0.298, p < 0.001 for intention to be prepared; and R2 = 0.142, β = 0.345, p < 0.001 for perceived preparedness). Conclusion The relationship between social trust and the preparedness dimensions suggests that changing a social behavior is not possible by considering only the individual characteristics of community members while ignoring their social network relations. Programs and policies that enhance social trust in general may be able to increase public preparedness against earthquakes in the future. PMID:29765572

  16. Earthquakes, March-April, 1993

    USGS Publications Warehouse

    Person, Waverly J.

    1993-01-01

    Worldwide, only one major earthquake (7.0≤M<8.0) occurred during this reporting period. The earthquake, a magnitude 7.2 shock, struck the Santa Cruz Islands region in the South Pacific on March 6. Earthquake-related deaths occurred in the Fiji Islands, China, and Peru.

  17. 2016 National Earthquake Conference

    Science.gov Websites

    The National Earthquake Conference (NEC) brings together state government leaders, social science practitioners, and U.S. State and Territorial Earthquake Managers. Conference theme: "What's New? What's Next? What's Your Role in Building a National Strategy?" Presenting sponsor: California Earthquake Authority.

  18. The music of earthquakes and Earthquake Quartet #1

    USGS Publications Warehouse

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  19. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  20. Geophysical Anomalies and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  1. Historical earthquake research in Austria

    NASA Astrophysics Data System (ADS)

    Hammerl, Christa

    2017-12-01

    Austria has a moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  2. Unraveling earthquake stresses: Insights from dynamically triggered and induced earthquakes

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Alfaro-Diaz, R. A.

    2017-12-01

    Induced seismicity, earthquakes caused by anthropogenic activity, has more than doubled in the last several years as a result of practices related to oil and gas production. Furthermore, large earthquakes have been shown to promote the triggering of other events within two fault lengths (static triggering), due to static stresses caused by physical movement along the fault, and also remotely through the passage of seismic waves (dynamic triggering). Thus, in order to understand the mechanisms of earthquake failure, we investigate regions where natural, induced, and dynamically triggered events occur, and specifically target Oklahoma. We first analyze data from EarthScope's USArray Transportable Array (TA) and local seismic networks, implementing an optimized short-term average/long-term average (STA/LTA) detector in order to develop local detection and earthquake catalogs. We then identify triggered events through statistical analysis and perform a stress analysis to gain insight into the stress states leading to triggered earthquake failure. We use our observations to determine the role of different transient stresses in contributing to natural and induced seismicity by comparing these stresses to the regional stress orientation. We also delineate critically stressed regions of triggered seismicity that may indicate areas susceptible to earthquake hazards associated with sustained fluid injection in provinces of induced seismicity. Anthropogenic injection and extraction activity can alter the stress state and fluid flow within production basins. By analyzing the stress release of ancient faults caused by dynamic stresses, we may be able to determine whether fluids are solely responsible for increased seismic activity in induced regions.
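
    A minimal sketch of a classical STA/LTA trigger of the kind referred to above (the window lengths and the threshold are generic choices, not the optimized parameters used by the authors):

    ```python
    import numpy as np

    def sta_lta_triggers(trace: np.ndarray, dt: float, sta_win=1.0, lta_win=30.0, threshold=4.0):
        """Return sample indices where the short-term/long-term average ratio of the
        squared trace first rises above the trigger threshold."""
        energy = trace.astype(float) ** 2
        n_sta = max(1, int(sta_win / dt))
        n_lta = max(1, int(lta_win / dt))
        sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
        lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
        ratio = np.divide(sta, lta, out=np.zeros_like(sta), where=lta > 0)
        above = ratio > threshold
        onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
        return onsets

    # onsets = sta_lta_triggers(seismogram, dt=0.01)  # seismogram sampled at 100 samples/s
    ```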

  3. VLF/LF Amplitude Perturbations before Tuscany Earthquakes, 2013

    NASA Astrophysics Data System (ADS)

    Khadka, Balaram; Kandel, Keshav Prasad; Pant, Sudikshya; Bhatta, Karan; Ghimire, Basu Dev

    2017-12-01

    The US Navy VLF/LF transmitter's NSY signal (45.9 kHz), transmitted from Niscemi, Sicily, Italy, and received at the Kiel Long Wave Monitor, Germany, was analyzed for a period of two months, May and June of 2013 (the months of the earthquakes studied). There were 12 earthquakes of magnitude greater than 4 that hit Italy in these two months, of which the earthquake of 21st June, with a magnitude of 5.2 and a shallow focal depth of 5 km, was the major one. We studied the earthquake of 21st of June 2013, which struck Tuscany, Central Italy (44.1713°N, 10.2082°E) at 10:33 UT, and analyzed the effects of this earthquake on the sub-ionospheric VLF/LF signals. In addition, we also studied another earthquake, of magnitude 4.9, which hit the same place at 14:40 UT on 30th of June and had a shallow focal depth of 10 km. We assessed the data using the terminator time (TT) method and the night-time fluctuation method and found unusual changes in VLF/LF amplitudes/phases. Analysis of trend, night-time dispersion, and night-time fluctuation was also carried out, and several anomalies were detected. Most ionospheric perturbations in these parameters were found in the month of June, from a few days to a few weeks prior to the earthquakes. Moreover, we filtered out possible effects of geomagnetic storms, auroras, and solar activity, using the Dst, AE, and Kp indices to analyze geomagnetic effects, and the Bz (sigma) index, sunspot numbers, and the F10.7 solar index to analyze solar activity, in order to confirm the anomalies as precursors.

  4. Operational Earthquake Forecasting of Aftershocks for New England

    NASA Astrophysics Data System (ADS)

    Ebel, J.; Fadugba, O. I.

    2015-12-01

    Although the forecasting of mainshocks is not possible, recent research demonstrates that probabilistic forecasts of expected aftershock activity following moderate and strong earthquakes are possible. Previous work has shown that aftershock sequences in intraplate regions behave similarly to those in California, and thus the operational aftershock forecasting methods that are currently employed in California can be adopted for use in areas of the eastern U.S. such as New England. In our application, immediately after a felt earthquake in New England, a forecast of expected aftershock activity for the next 7 days will be generated based on a generic aftershock activity model. Approximately 24 hours after the mainshock, the parameters of the aftershock model will be updated using the aftershock activity observed to that point in time, and a new forecast of expected aftershock activity for the next 7 days will be issued. The forecast will estimate the average number of weak, felt aftershocks and the average expected number of aftershocks overall, based on the aftershock statistics of past New England earthquakes. The forecast also will estimate the probability that an earthquake stronger than the mainshock will take place during the next 7 days. The aftershock forecast will specify the expected aftershock locations as well as the areas over which aftershocks of different magnitudes could be felt. The system will use web pages, email and text messages to distribute the aftershock forecasts. For protracted aftershock sequences, new forecasts will be issued on a regular basis, such as weekly. Initially, the distribution system for the aftershock forecasts will be limited, but later it will be expanded as experience with and confidence in the system grows.
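
    A minimal sketch of a generic Reasenberg-Jones-type aftershock forecast of the kind such a system might issue: an Omori-Utsu decay scaled by the Gutenberg-Richter law gives the expected number of aftershocks above a magnitude threshold in a forecast window, and a Poisson assumption converts that to a probability (all parameter values below are illustrative generic values, not the New England calibration):

    ```python
    import math

    def expected_aftershocks(m_main, m_min, t1_days, t2_days, a=-1.67, b=0.91, c=0.05, p=1.08):
        """Expected number of aftershocks with M >= m_min between t1 and t2 days after the
        mainshock, following the Reasenberg-Jones rate 10**(a + b*(Mm - M)) * (t + c)**-p."""
        productivity = 10.0 ** (a + b * (m_main - m_min))
        if abs(p - 1.0) < 1e-9:
            time_integral = math.log((t2_days + c) / (t1_days + c))
        else:
            time_integral = ((t2_days + c) ** (1 - p) - (t1_days + c) ** (1 - p)) / (1 - p)
        return productivity * time_integral

    n = expected_aftershocks(m_main=4.5, m_min=2.5, t1_days=1.0, t2_days=8.0)
    print(n, 1.0 - math.exp(-n))  # expected count and probability of at least one such aftershock
    ```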

  5. Earthquakes in Oita triggered by the 2016 M7.3 Kumamoto earthquake

    NASA Astrophysics Data System (ADS)

    Yoshida, Shingo

    2016-11-01

    During the passage of the seismic waves from the M7.3 Kumamoto, Kyushu, earthquake on April 16, 2016, a M5.7 [semiofficial value estimated by the Japan Meteorological Agency (JMA)] event occurred in the central part of Oita prefecture, approximately 80 km away from the mainshock. Although there have been a number of reports that M < 5 earthquakes were remotely triggered during the passage of seismic waves from mainshocks, there has been no evidence for M > 5 triggered events. In this paper, we first confirm that this event is an M6-class event by re-estimating the magnitude using the strong-motion records of K-NET and KiK-net and crustal deformation data at the Yufuin station observed by the Geospatial Information Authority of Japan. Next, by investigating the aftershocks of 45 mainshocks which occurred over the past 20 years based on the JMA earthquake catalog (JMAEC), we found that the delay time of the 2016 M5.7 event in Oita was the shortest. Therefore, the M5.7 event can be regarded as an exceptional M > 5 event that was triggered by passing seismic waves, unlike the usual triggered events and aftershocks. Moreover, a search of the JMAEC shows that in the 2016 Oita aftershock area, swarm earthquake activity was low over the past 30 years compared with neighboring areas. We also found that, in the past, probably or possibly triggered events frequently occurred in the 2016 Oita aftershock area. The Oita area readily responds to remote triggering because of high geothermal activity and young volcanism in the area. The M5.7 Oita event was triggered by passing seismic waves, probably because a large dynamic stress change was generated by the mainshock at a short distance and because the Oita area was already loaded to a critical stress state without a recent energy release, as suggested by the past low swarm activity.

  6. Geological evidence for Holocene earthquakes and tsunamis along the Nankai-Suruga Trough, Japan

    NASA Astrophysics Data System (ADS)

    Garrett, Ed; Fujiwara, Osamu; Garrett, Philip; Heyvaert, Vanessa M. A.; Shishikura, Masanobu; Yokoyama, Yusuke; Hubert-Ferrari, Aurélia; Brückner, Helmut; Nakamura, Atsunori; De Batist, Marc

    2016-04-01

    The Nankai-Suruga Trough, lying immediately south of Japan's densely populated and highly industrialised southern coastline, generates devastating great earthquakes (magnitude > 8). Intense shaking, crustal deformation and tsunami generation accompany these ruptures. Forecasting the hazards associated with future earthquakes along this >700 km long fault requires a comprehensive understanding of past fault behaviour. While the region benefits from a long and detailed historical record, palaeoseismology has the potential to provide a longer-term perspective and additional insights. Here, we summarise the current state of knowledge regarding geological evidence for past earthquakes and tsunamis, incorporating literature originally published in both Japanese and English. This evidence comes from a wide variety of sources, including uplifted marine terraces and biota, marine and lacustrine turbidites, liquefaction features, subsided marshes and tsunami deposits in coastal lakes and lowlands. We enhance available results with new age modelling approaches. While publications describe proposed evidence from > 70 sites, only a limited number provide compelling, well-dated evidence. The best available records allow us to map the most likely rupture zones of eleven earthquakes occurring during the historical period. Our spatiotemporal compilation suggests the AD 1707 earthquake ruptured almost the full length of the subduction zone and that earthquakes in AD 1361 and 684 were predecessors of similar magnitude. Intervening earthquakes were of lesser magnitude, highlighting variability in rupture mode. Recurrence intervals for ruptures of a single seismic segment range from less than 100 to more than 450 years during the historical period. Over longer timescales, palaeoseismic evidence suggests intervals ranging from 100 to 700 years. However, these figures reflect thresholds of evidence creation and preservation as well as genuine recurrence intervals. At present, we have

  7. Earthquake triggering by seismic waves following the landers and hector mine earthquakes

    USGS Publications Warehouse

    Gomberg, J.; Reasenberg, P.A.; Bodin, P.; Harris, R.A.

    2001-01-01

    The proximity and similarity of the 1992, magnitude 7.3 Landers and 1999, magnitude 7.1 Hector Mine earthquakes in California permit testing of earthquake triggering hypotheses not previously possible. The Hector Mine earthquake confirmed inferences that transient, oscillatory 'dynamic' deformations radiated as seismic waves can trigger seismicity rate increases, as proposed for the Landers earthquake (refs 1-6). Here we quantify the spatial and temporal patterns of the seismicity rate changes (ref. 7). The seismicity rate increase was to the north for the Landers earthquake and primarily to the south for the Hector Mine earthquake. We suggest that rupture directivity results in elevated dynamic deformations north and south of the Landers and Hector Mine faults, respectively, as evident in the asymmetry of the recorded seismic velocity fields. Both dynamic and static stress changes seem important for triggering in the near field with dynamic stress changes dominating at greater distances. Peak seismic velocities recorded for each earthquake suggest the existence of, and place bounds on, dynamic triggering thresholds. These thresholds vary from a few tenths to a few MPa in most places, depend on local conditions, and exceed inferred static thresholds by more than an order of magnitude. At some sites, the onset of triggering was delayed until after the dynamic deformations subsided. Physical mechanisms consistent with all these observations may be similar to those that give rise to liquefaction or cyclic fatigue.
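
    The dynamic triggering thresholds quoted above are stresses inferred from recorded peak ground velocities. A minimal sketch of one common conversion, under a plane shear-wave assumption in which dynamic stress is roughly the shear modulus times particle velocity divided by phase velocity (the values below are generic crustal numbers, not those used in the paper):

    ```python
    def dynamic_stress_mpa(peak_velocity_m_s: float, shear_modulus_pa: float = 3.0e10,
                           phase_velocity_m_s: float = 3500.0) -> float:
        """Approximate peak dynamic shear stress (MPa) carried by a seismic wave,
        using sigma ~ G * v / c for a plane shear wave."""
        return shear_modulus_pa * peak_velocity_m_s / phase_velocity_m_s / 1.0e6

    # A peak ground velocity of 0.1 m/s gives a dynamic stress of roughly 0.9 MPa.
    print(round(dynamic_stress_mpa(0.1), 2))
    ```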

  8. Earthquakes and plague during Byzantine times: can lessons from the past improve epidemic preparedness.

    PubMed

    Tsiamis, Costas; Poulakou-Rebelakou, Effie; Marketos, Spyros

    2013-01-01

    Natural disasters have always been followed by a fear of infectious diseases. This has raised historical debate about one of the most feared scenarios: an outbreak of bubonic plague caused by Yersinia pestis. One such event was recorded in the Indian state of Maharashtra in 1994 after an earthquake. In the multidisciplinary historical approach to the evolution of plague, many experts ignore the possibility of natural foci and their activation. This article presents historical records from the Byzantine Empire about outbreaks of the Plague of Justinian occurring months or even up to a year after high-magnitude earthquakes. Historical records of plague outbreaks can be used to document the existence of natural foci all over the world. Knowledge of these historical records and of contemporary examples of plague supports the assumption that, in terms of organising humanitarian aid, poor monitoring of natural foci could lead to unpredictable epidemiological consequences after high-magnitude earthquakes.

  9. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  10. Extending earthquakes' reach through cascading.

    PubMed

    Marsan, David; Lengliné, Olivier

    2008-02-22

    Earthquakes, whatever their size, can trigger other earthquakes. Mainshocks cause aftershocks to occur, which in turn activate their own local aftershock sequences, resulting in a cascade of triggering that extends the reach of the initial mainshock. A long-lasting difficulty is to determine which earthquakes are connected, either directly or indirectly. Here we show that this causal structure can be found probabilistically, with no a priori model nor parameterization. Large regional earthquakes are found to have a short direct influence in comparison to the overall aftershock sequence duration. Relative to these large mainshocks, small earthquakes collectively have a greater effect on triggering. Hence, cascade triggering is a key component in earthquake interactions.

  11. Earthquake Catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms ~7.0, Io = 9), the Lechkhumi-Svaneti earthquake of 1350 (Ms ~7.0, Io = 9), and the Alaverdi earthquake of 1742 (Ms ~6.8, Io = 9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms ~6.5, Io = 9) and the Akhalkalaki earthquake of 1899 (Ms ~6.3, Io = 8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms = 7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M = 6.5); and the Spitak earthquake of 1988 (Ms = 6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia ~25 stations, Azerbaijan ~35 stations, Armenia ~14 stations). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw >= 4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  12. Earthquakes, July-August 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  13. Facts about the Eastern Japan Great Earthquake of March 2011

    NASA Astrophysics Data System (ADS)

    Moriyama, T.

    2011-12-01

    The 2011 great earthquake was a magnitude 9.0 Mw undersea megathrust earthquake off the coast of Japan that occurred early in the morning UTC on Friday, 11 March 2011, with the epicenter approximately 70 kilometres east of the Oshika Peninsula of Tohoku and the hypocenter at an underwater depth of approximately 32 km. It was the most powerful known earthquake to have hit Japan, and one of the five most powerful earthquakes in the world overall since modern record keeping began in 1900. The earthquake triggered extremely destructive tsunami waves of up to 38.9 metres that struck Tohoku, Japan, in some cases traveling up to 10 km inland. In addition to loss of life and destruction of infrastructure, the tsunami caused a number of nuclear accidents, primarily the ongoing level 7 meltdowns at three reactors in the Fukushima I Nuclear Power Plant complex, and the associated evacuation zones affecting hundreds of thousands of residents. The Japanese National Police Agency has confirmed 15,457 deaths, 5,389 injured, and 7,676 people missing across eighteen prefectures, as well as over 125,000 buildings damaged or destroyed. JAXA carried out ALOS emergency observation just after the earthquake occurred and acquired more than 400 scenes over the disaster area. The coseismic interferogram from InSAR analysis clearly shows the epicenter of the earthquake and the land-surface deformation over the Tohoku area. By comparing satellite images from before and after the event, the large areas damaged by the tsunami were extracted. These images and data can be accessed via the JAXA website and also the GEO Tohoku-oki event supersite website.

  14. Prevention of strong earthquakes: Goal or utopia?

    NASA Astrophysics Data System (ADS)

    Mukhamediev, Sh. A.

    2010-11-01

    In the present paper, we consider ideas suggesting various kinds of industrial impact on the close-to-failure block of the Earth's crust in order to break a pending strong earthquake (PSE) into a number of smaller quakes or aseismic slips. Among the published proposals on the prevention of a forthcoming strong earthquake, methods based on water injection and vibro-influence merit greater attention as they are based on field observations and the results of laboratory tests. In spite of this, the cited proofs are, for various reasons, insufficient to acknowledge the proposed techniques as highly substantiated; in addition, the physical essence of these methods has still not been fully understood. First, the key concept of the methods, namely, the release of the accumulated stresses (or excessive elastic energy) in the source region of a forthcoming strong earthquake, is open to objection. If we treat an earthquake as a phenomenon of loss of stability, then the heterogeneities of the physicomechanical properties and stresses along the existing fault or its future trajectory, rather than the absolute values of stresses, play the most important role. In the present paper, this statement is illustrated by the classical examples of stable and unstable fractures and by the examples of the calculated stress fields, which were realized in the source regions of the tsunamigenic earthquakes of December 26, 2004 near Sumatra Island and of September 29, 2009 near Samoa Island. Here, just before the earthquakes, there were no excessive stresses in the source regions. Quite the opposite, the maximum shear stresses τmax were close to their minimum value, compared to τmax in the adjacent territory. In the present paper, we provide quantitative examples that falsify the theory of the prevention of PSE in its current form. It is shown that the measures for the prevention of PSE, even when successful for an already existing fault, can trigger or accelerate a catastrophic

  15. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred in Indonesia during the last decade. These experiences offer important lessons for people around the world who live in earthquake- and tsunami-prone countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and to make clear the characteristic physical behaviour of tsunamis near the coast. We studied two tsunami events, the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken during the 2004 Indian Ocean tsunami disaster, but only at a few restricted points, so the tsunami behaviour at other places remained unknown. In this study, we tried to collect extensive information about tsunami behaviour not only at many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no striking photographs. To collect detailed information about the evacuation process, we devised an interview method that involves making pictures of the tsunami experience from the scenes of the survivors' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena: because there had been no large tsunamigenic earthquakes in the Sumatra region for a hundred years, the public had no knowledge of tsunamis. This situation had greatly improved by the 2010 Mentawai case. TV programmes and NGO or governmental public-education programmes about tsunami evacuation are now widespread in Indonesia, and many people have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on the survivors' stories and painted impressive scenes from the two events. We used the drill book in a disaster-education event with a school committee in West Java, where about 80% of the students and teachers evaluated its contents as useful for correct understanding.

  16. A new source process for evolving repetitious earthquakes at Ngauruhoe volcano, New Zealand

    NASA Astrophysics Data System (ADS)

    Jolly, A. D.; Neuberg, J.; Jousset, P.; Sherburn, S.

    2012-02-01

    Since early 2005, Ngauruhoe volcano has produced repeating low-frequency earthquakes with evolving waveforms and spectral features which become progressively enriched in higher frequency energy during the period 2005 to 2009, with the trend reversing after that time. The earthquakes also show a seasonal cycle since January 2006, with peak numbers of events occurring in the spring and summer period and lower numbers of events at other times. We explain these patterns by the excitation of a shallow two-phase water/gas or water/steam cavity having temporal variations in volume fraction of bubbles. Such variations in two-phase systems are known to produce a large range of acoustic velocities (2-300 m/s) and corresponding changes in impedance contrast. We suggest that an increasing bubble volume fraction is caused by progressive heating of melt water in the resonant cavity system which, in turn, promotes the scattering excitation of higher frequencies, explaining both spectral shift and seasonal dependence. We have conducted a constrained waveform inversion and grid search for moment, position and source geometry for the onset of two example earthquakes occurring 17 and 19 January 2008, a time when events showed a frequency enrichment episode occurring over a period of a few days. The inversion and associated error analysis, in conjunction with an earthquake phase analysis show that the two earthquakes represent an excitation of a single source position and geometry. The observed spectral changes from a stationary earthquake source and geometry suggest that an evolution in both near source resonance and scattering is occurring over periods from days to months.
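
    The very low acoustic velocities quoted above for a two-phase fluid can be illustrated with Wood's equation for a bubbly liquid. The sketch below is a generic calculation with assumed air/water properties at atmospheric pressure, not the authors' model of the crater cavity.

```python
import numpy as np

def wood_sound_speed(phi, rho_gas=1.2, c_gas=340.0, rho_liq=1000.0, c_liq=1500.0):
    """Wood's equation: sound speed of a bubbly liquid as a function of the gas
    volume fraction phi (illustrative air/water properties at 1 atm)."""
    rho_mix = phi * rho_gas + (1.0 - phi) * rho_liq
    compressibility = phi / (rho_gas * c_gas**2) + (1.0 - phi) / (rho_liq * c_liq**2)
    return 1.0 / np.sqrt(rho_mix * compressibility)

# A small change in bubble fraction drops the mixture velocity by two orders of
# magnitude, consistent with the 2-300 m/s range cited in the abstract.
for phi in (0.0, 0.01, 0.1, 0.5):
    print(phi, round(wood_sound_speed(phi), 1), "m/s")
```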

  17. Effect of data quality on a hybrid Coulomb/STEP model for earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Steacy, Sandy; Jimenez, Abigail; Gerstenberger, Matt; Christophersen, Annemarie

    2014-05-01

    Operational earthquake forecasting is rapidly becoming a 'hot topic' as civil protection authorities seek quantitative information on likely near future earthquake distributions during seismic crises. At present, most of the models in the public domain are statistical and use information about past and present seismicity as well as b-value and Omori's law to forecast future rates. A limited number of researchers, however, are developing hybrid models which add spatial constraints from Coulomb stress modeling to existing statistical approaches. Steacy et al. (2013), for instance, recently tested a model that combines Coulomb stress patterns with the STEP (short-term earthquake probability) approach against seismicity observed during the 2010-2012 Canterbury earthquake sequence. They found that the new model performed at least as well as, and often better than, STEP when tested against retrospective data but that STEP was generally better in pseudo-prospective tests that involved data actually available within the first 10 days of each event of interest. They suggested that the major reason for this discrepancy was uncertainty in the slip models and, in particular, in the geometries of the faults involved in each complex major event. Here we test this hypothesis by developing a number of retrospective forecasts for the Landers earthquake using hypothetical slip distributions developed by Steacy et al. (2004) to investigate the sensitivity of Coulomb stress models to fault geometry and earthquake slip. Specifically, we consider slip models based on the NEIC location, the CMT solution, surface rupture, and published inversions and find significant variation in the relative performance of the models depending upon the input data.
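
    As a rough illustration of the hybrid idea (not the actual Steacy et al. model), the sketch below redistributes a statistical, STEP-like rate grid using Coulomb stress changes while preserving the total expected rate; the grids, the stress floor and the weighting scheme are all assumptions made for illustration.

```python
import numpy as np

def coulomb_weighted_rates(statistical_rates, delta_cfs, floor=0.1):
    """Redistribute a statistical (STEP-like) forecast grid using Coulomb stress
    changes: cells with positive stress change receive proportionally more of the
    total rate, while the total expected number of events is preserved."""
    rates = np.asarray(statistical_rates, dtype=float)
    stress = np.asarray(delta_cfs, dtype=float)
    # Weight = positive stress change plus a floor, so negative-stress cells are
    # down-weighted but not zeroed out (an arbitrary modelling choice here).
    weights = np.clip(stress, 0.0, None) + floor * np.max(np.abs(stress))
    hybrid = rates * weights
    return hybrid * rates.sum() / hybrid.sum()

statistical = np.array([[0.01, 0.04], [0.02, 0.08]])   # hypothetical events/cell/day
dcfs_bars = np.array([[0.5, -0.2], [1.0, 0.1]])        # hypothetical stress changes
print(coulomb_weighted_rates(statistical, dcfs_bars))
```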

  18. Long-Delayed Aftershocks in New Zealand and the 2016 M7.8 Kaikoura Earthquake

    NASA Astrophysics Data System (ADS)

    Shebalin, P.; Baranov, S.

    2017-10-01

    We study aftershock sequences of six major earthquakes in New Zealand, including the 2016 M7.8 Kaikoura and 2016 M7.1 North Island earthquakes. For the Kaikoura earthquake, we assess the expected number of long-delayed large aftershocks of M5+ and M5.5+ in two periods, 0.5 and 3 years after the main shock, using 75 days of available data. We compare the results with those obtained for the other sequences using the same 75-day period. We estimate the errors by considering a set of magnitude thresholds and corresponding periods of data completeness and consistency. To avoid overestimation of the expected rates of large aftershocks, we presume a break of slope of the magnitude-frequency relation in the aftershock sequences, and compare two models, with and without the break of slope. Comparing the estimates to the actual numbers of long-delayed large aftershocks, we observe, in general, a significant underestimation of their expected number. We can suppose that the long-delayed aftershocks may reflect larger-scale processes, including interaction of faults, that complement an isolated relaxation process. In the spirit of this hypothesis, we search for symptoms of the capacity of the aftershock zone to generate large events months after the major earthquake. We adapt the EAST algorithm, which studies statistics of early aftershocks, to the case of secondary aftershocks within aftershock sequences of major earthquakes. In retrospective application to the considered cases, the algorithm demonstrates an ability to detect long-delayed aftershocks in advance, in both the time and space domains. Application of the EAST algorithm to the 2016 M7.8 Kaikoura earthquake zone indicates that the most likely area for a delayed aftershock of M5.5+ or M6+ is at the northern end of the zone in Cook Strait.
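
    For orientation, expected counts of late large aftershocks of the kind discussed above can be sketched with a generic Reasenberg-Jones-type model (Gutenberg-Richter productivity combined with Omori-Utsu decay). The parameter values below are the commonly quoted generic ones and are illustrative assumptions, not values fitted by the authors, and this simple model has no break of slope.

```python
import numpy as np

def expected_aftershocks(m_main, m_min, t1, t2, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Expected number of aftershocks with magnitude >= m_min in the window
    [t1, t2] days after a mainshock of magnitude m_main (Reasenberg-Jones form)."""
    # Productivity term from Gutenberg-Richter scaling
    productivity = 10 ** (a + b * (m_main - m_min))
    # Integral of the Omori-Utsu decay (t + c)^(-p) over [t1, t2]
    if abs(p - 1.0) < 1e-9:
        omori_integral = np.log((t2 + c) / (t1 + c))
    else:
        omori_integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return productivity * omori_integral

# e.g. M5.5+ aftershocks of an M7.8 mainshock between day 75 and 3 years
print(expected_aftershocks(7.8, 5.5, 75, 3 * 365))
```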

  19. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  20. Earthquakes, September-October 1993

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    The fatalities in the United States were caused by two earthquakes in southern Oregon on September 21. These earthquakes, both with magnitude 6.0 and separated in time by about 2 hrs, led to the deaths of two people. One of these deaths was apparently due to a heart attack induced by the earthquake.

  1. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes are natural disasters and are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced accordingly. In particular, good training about earthquakes received in primary schools is considered…

  2. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    NASA Astrophysics Data System (ADS)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
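
    A minimal version of the spreadsheet exercise described above can be written in Python. The recurrence intervals below are placeholders chosen for illustration, not the paleoseismic values used in the study, and the time-independent (Poisson) model is only one of the choices the students explore.

```python
import numpy as np

def poisson_prob(mean_recurrence_yrs, window_yrs):
    """Time-independent (Poisson) probability of at least one event in the window."""
    return 1.0 - np.exp(-window_yrs / mean_recurrence_yrs)

# Illustrative recurrence intervals (not the values from the paleoseismic record):
full_record = 500.0   # mean spacing assumed over the whole record
in_cluster = 300.0    # shorter mean spacing assumed if we are inside a cluster

for label, mu in [("full record", full_record), ("within cluster", in_cluster)]:
    print(f"{label}: {poisson_prob(mu, 50):.0%} chance in 50 years")
```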

  3. Accounting for orphaned aftershocks in the earthquake background rate

    USGS Publications Warehouse

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.

  4. Accounting for orphaned aftershocks in the earthquake background rate

    NASA Astrophysics Data System (ADS)

    van der Elst, Nicholas J.

    2017-11-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
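
    The heavy tail of Omori's law that makes orphaned aftershocks plausible can be illustrated directly: for a power-law decay with exponent p > 1, the fraction of a mainshock's direct aftershocks that occur later than a time T has a closed form. The sketch below uses assumed values of c and p, not the ETAS parameters estimated in the study.

```python
def late_fraction(T_days, c=0.01, p=1.1):
    """Fraction of a mainshock's direct aftershocks (Omori-Utsu decay, p > 1)
    that occur later than T_days after the mainshock: ((T + c) / c) ** (1 - p)."""
    return ((T_days + c) / c) ** (1.0 - p)

# With p only slightly above 1, a sizeable share of aftershocks arrives years
# later and can masquerade as background seismicity:
for T in (365, 10 * 365, 100 * 365):
    print(T, "days:", f"{late_fraction(T):.1%}")
```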

  5. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all of the point-source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model covering the whole of Taiwan is developed in this study. We have improved SEM mesh quality by introducing a thin, high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground-motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth-science outreach and to realize seismic ground-motion prediction in real time.

  6. Local earthquake tomography of Scotland

    NASA Astrophysics Data System (ADS)

    Luckett, Richard; Baptie, Brian

    2015-03-01

    Scotland is a relatively aseismic region for the use of local earthquake tomography, but 40 yr of earthquakes recorded by a good and growing network make it possible. A careful selection is made from the earthquakes located by the British Geological Survey (BGS) over the last four decades to provide a data set maximising arrival time accuracy and ray path coverage of Scotland. A large number of 1-D velocity models with different layer geometries are considered and differentiated by employing quarry blasts as ground-truth events. Then, SIMULPS14 is used to produce a robust 3-D tomographic P-wave velocity model for Scotland. In areas of high resolution the model shows good agreement with previously published interpretations of seismic refraction and reflection experiments. However, the model shows relatively little lateral variation in seismic velocity except at shallow depths, where sedimentary basins such as the Midland Valley are apparent. At greater depths, higher velocities in the northwest parts of the model suggest that the thickness of crust increases towards the south and east. This observation is also in agreement with previous studies. Quarry blasts used as ground truth events and relocated with the preferred 3-D model are shown to be markedly more accurate than when located with the existing BGS 1-D velocity model.

  7. Aftershock communication during the Canterbury Earthquakes, New Zealand: implications for response and recovery in the built environment

    USGS Publications Warehouse

    Julia Becker,; Wein, Anne; Sally Potter,; Emma Doyle,; Ratliff, Jamie L.

    2015-01-01

    On 4 September 2010, a Mw7.1 earthquake occurred in Canterbury, New Zealand. Following the initial earthquake, an aftershock sequence was initiated, with the most significant aftershock being a Mw6.3 earthquake occurring on 22 February 2011. This aftershock caused severe damage to the city of Christchurch and building failures that killed 185 people. During the aftershock sequence it became evident that effective communication of aftershock information (e.g., history and forecasts) was imperative to assist with decision making during the response and recovery phases of the disaster, as well as preparedness for future aftershock events. As a consequence, a joint JCDR-USGS research project was initiated to investigate: • How aftershock information was communicated to organisations and to the public; • How people interpreted that information; • What people did in response to receiving that information; • What information people did and did not need; and • What decision-making challenges were encountered relating to aftershocks. Research was conducted by undertaking focus group meetings and interviews with a range of information providers and users, including scientists and science advisors, emergency managers and responders, engineers, communication officers, businesses, critical infrastructure operators, elected officials, and the public. The interviews and focus group meetings were recorded and transcribed, and key themes were identified. This paper focuses on the aftershock information needs for decision-making about the built environment post-earthquake, including those involved in response (e.g., for building assessment and management), recovery/reduction (e.g., the development of new building standards), and readiness (e.g. between aftershocks). The research has found that the communication of aftershock information varies with time, is contextual, and is affected by interactions among roles, by other information, and by decision objectives. A number

  8. Human casualties in earthquakes: Modelling and mitigation

    USGS Publications Warehouse

    Spence, R.J.S.; So, E.K.M.

    2011-01-01

    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  9. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing for all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Some pre-1932 earthquakes of M 4-5 are also included, dating from before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932, are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1906) have been used to help determine magnitudes.

  10. Global Review of Induced and Triggered Earthquakes

    NASA Astrophysics Data System (ADS)

    Foulger, G. R.; Wilson, M.; Gluyas, J.; Julian, B. R.; Davies, R. J.

    2016-12-01

    Natural processes associated with very small incremental stress changes can modulate the spatial and temporal occurrence of earthquakes. These processes include tectonic stress changes, the migration of fluids in the crust, Earth tides, surface ice and snow loading, heavy rain, atmospheric pressure, sediment unloading and groundwater loss. It is thus unsurprising that large anthropogenic projects which may induce stress changes of a similar size also modulate seismicity. As human development accelerates and industrial projects become larger in scale and more numerous, the number of such cases is increasing. That mining and water-reservoir impoundment can induce earthquakes has been accepted for several decades. Now, concern is growing about earthquakes induced by activities such as hydraulic fracturing for shale-gas extraction and waste-water disposal via injection into boreholes. As hydrocarbon reservoirs enter their tertiary phases of production, seismicity may also increase there. The full extent of human activities thought to induce earthquakes is, however, much wider than generally appreciated. We have assembled as near complete a catalog as possible of cases of earthquakes postulated to have been induced by human activity. Our database contains a total of 705 cases and is probably the largest compilation made to date. We include all cases where reasonable arguments have been made for anthropogenic induction, even where these have been challenged in later publications. Our database presents the results of our search but leaves judgment about the merits of individual cases to the user. We divide anthropogenic earthquake-induction processes into: a) Surface operations, b) Extraction of mass from the subsurface, c) Introduction of mass into the subsurface, and d) Explosions. Each of these categories is divided into sub-categories. In some cases, categorization of a particular case is tentative because more than one anthropogenic activity may have preceded or been

  11. Hidden Earthquake Potential in Plate Boundary Transition Zones

    NASA Astrophysics Data System (ADS)

    Furlong, Kevin P.; Herman, Matthew; Govers, Rob

    2017-04-01

    Plate boundaries can exhibit spatially abrupt changes in their long-term tectonic deformation (and associated kinematics) at triple junctions and other sites of changes in plate boundary structure. How earthquake behavior responds to these abrupt tectonic changes is unclear. The situation may be additionally obscured by the effects of superimposed deformational signals - juxtaposed short-term (earthquake cycle) kinematics may combine to produce a net deformational signal that does not reflect intuition about the actual strain accumulation in the region. Two examples of this effect are in the vicinity of the Mendocino triple junction (MTJ) along the west coast of North America, and at the southern end of the Hikurangi subduction zone, New Zealand. In the region immediately north of the MTJ, GPS-based observed crustal displacements (relative to North America (NAm)) are intermediate between Pacific and Juan de Fuca (JdF) motions. With distance north, these displacements rotate to become more aligned with JdF - NAm displacements, i.e. to motions expected along a coupled subduction interface. The deviation of GPS motions from the coupled subduction interface signal near the MTJ has been previously interpreted to reflect clock-wise rotation of a coastal, crustal block and/or reduced coupling at the southern Cascadia margin. The geologic record of crustal deformation near the MTJ reflects the combined effects of northward crustal shortening (on geologic time scales) associated with the MTJ Crustal Conveyor (Furlong and Govers, 1999) overprinted onto the subduction earthquake cycle signal. With this interpretation, the Cascadia subduction margin appears to be well-coupled along its entire length, consistent with paleo-seismic records of large earthquake ruptures extending to its southern limit. At the Hikurangi to Alpine Fault transition in New Zealand, plate interactions switch from subduction to oblique translation as a consequence of changes in lithospheric structure of

  12. Earthquakes, September-October 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The months of September and October were somewhat quiet, seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

  13. Earthquakes, March-April 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    Two major earthquakes (7.0-7.9) occurred during this reporting period: a magnitude 7.6 in Costa Rica on April 22 and a magnitude 7.0 in the USSR on April 29. Destructive earthquakes hit northern Peru on April 4 and 5. There were no destructive earthquakes in the United States during this period.

  14. Can We Predict Earthquakes?

    ScienceCinema

    Johnson, Paul

    2018-01-16

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  15. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquake, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  16. Earthquakes, September-October 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0-7.9) during this reporting period. The first was in the Solomon Islands on October 14 and the second was in India on October 19. Earthquake-related deaths were reported in Guatemala and India. There were no significant earthquakes in the United States during the period covered in this report.

  17. Accuracy and Resolution in Micro-earthquake Tomographic Inversion Studies

    NASA Astrophysics Data System (ADS)

    Hutchings, L. J.; Ryan, J.

    2010-12-01

    Accuracy and resolution are complementary properties necessary to interpret the results of earthquake location and tomography studies. Accuracy is how close an answer is to the "real world", and resolution is how small a node spacing or earthquake error ellipse one can achieve. We have modified SimulPS (Thurber, 1986) in several ways to provide a tool for evaluating the accuracy and resolution of potential micro-earthquake networks. First, we provide synthetic travel times from synthetic three-dimensional geologic models and earthquake locations. We use this to calculate errors in earthquake location and velocity inversion results when we perturb these models and try to invert to recover them. We can create as many stations as desired and a synthetic velocity model with any desired node spacing. We apply this study to SimulPS and TomoDD inversion studies. "Real" travel times are perturbed with noise and hypocenters are perturbed to replicate a starting location away from the "true" location, and inversion is performed by each program. We establish travel times with the pseudo-bending ray tracer and use the same ray tracer in the inversion codes. This, of course, limits our ability to test the accuracy of the ray tracer. We developed relationships for the accuracy and resolution expected as a function of the number of earthquakes and recording stations for typical tomographic inversion studies. Velocity grid spacing started at 1 km, then was decreased to 500 m, 100 m, 50 m and finally 10 m to see if resolution with decent accuracy at that scale was possible. We considered accuracy to be good when we could invert a velocity model perturbed by 50% back to within 5% of the original model, and resolution to be the size of the grid spacing. We found that 100 m resolution could be obtained by using 120 stations with 500 events, but this is our current limit. The limiting factors are the size of computers needed for the large arrays in the inversion and a
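
    The test scaffolding described above (noisy synthetic travel times, a strongly perturbed starting model, and a 5% recovery criterion) might be set up along the following lines; the layer velocities, travel times and noise level are hypothetical, and the actual inversion is still delegated to SimulPS or TomoDD.

```python
import numpy as np

rng = np.random.default_rng(42)

def perturbed_start_model(true_velocities, fraction=0.5):
    """Starting model for a recovery test: the true model perturbed by +/- 50%."""
    signs = rng.choice([-1.0, 1.0], size=len(true_velocities))
    return true_velocities * (1.0 + fraction * signs)

def noisy_travel_times(travel_times, sigma=0.05):
    """Synthetic 'observed' arrivals: true travel times plus Gaussian picking noise (s)."""
    return travel_times + rng.normal(0.0, sigma, size=travel_times.shape)

def recovered_within(recovered, true_velocities, tol=0.05):
    """Accuracy criterion from the study: recovered model within 5% of the truth."""
    return bool(np.all(np.abs(recovered - true_velocities) / true_velocities <= tol))

true_v = np.array([3.5, 4.5, 5.8, 6.4])          # km/s, hypothetical layer velocities
start_v = perturbed_start_model(true_v)          # hand this to the inversion code
synthetic_tt = np.array([3.12, 4.28, 5.95])      # s, from the true model (hypothetical)
obs_tt = noisy_travel_times(synthetic_tt)        # "observed" picks fed to the inversion
```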

  18. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. The antennae have mobility and observations were noted for

  19. Influence of the Saros Fault on the Periodicity of Earthquake Activity (Gelibolu Peninsula, NW Turkey)

    NASA Astrophysics Data System (ADS)

    İpek Gültekin, Derya; Karakoç, Okan; Şahin, Murat; Elitez, İrem; Yaltırak, Cenk

    2017-04-01

    Active faults are critically important to the settlement and socio-economic development of a region. For this reason, it is important to determine the characteristics and impact areas of active faults correctly. The Marmara region is a tectonically active region located in northwestern Anatolia. The northern part of the North Anatolian Fault, named the Saros Fault, passes through the westernmost part of this region. The Saros Fault is a 52-km-long, NE-SW-trending right-lateral strike-slip fault. In this study, the seismicity of the Gelibolu Peninsula has been examined in the light of historical records. According to these records, the earthquakes of 545, 986, 1354 and 1756 damaged settlements close to the Saros Fault. The dates of the historical earthquakes were calculated by integrating previously published empirical formulas, the time intervals between events and the velocities of GPS vectors. Acceleration (PGA) maps of the region have been produced by taking into account these earthquake magnitudes, the fault geometry and the geology of the region, and these maps overlap quite well with the damage records of the historical earthquakes. Considering the periodicity of the Saros Fault, which largely controls the seismicity in the region, we aim to answer the question "how does a recent earthquake affect the region?" with the help of the historical earthquake records and PGA modelling. In conclusion, our data show that PGA values are highest on the northern side of the Gelibolu Peninsula and that this region may be affected by a magnitude 7.3 earthquake.

  20. Digital radiography of crush thoracic trauma in the Sichuan earthquake

    PubMed Central

    Dong, Zhi-Hui; Shao, Heng; Chen, Tian-Wu; Chu, Zhi-Gang; Deng, Wen; Tang, Si-Shi; Chen, Jing; Yang, Zhi-Gang

    2011-01-01

    AIM: To investigate the features of crush thoracic trauma in Sichuan earthquake victims using chest digital radiography (CDR). METHODS: We retrospectively reviewed 772 CDR of 417 females and 355 males who had suffered crush thoracic trauma in the Sichuan earthquake. Patient age ranged from 0.5 to 103 years. CDR was performed between May 12, 2008 and June 7, 2008. We looked for injury to the thoracic cage, pulmonary parenchyma and the pleura. RESULTS: Antero-posterior (AP) and lateral CDR were obtained in 349 patients, the remaining 423 patients underwent only AP CDR. Thoracic cage fractures, pulmonary contusion and pleural injuries were noted in 331 (42.9%; 95% CI: 39.4%-46.4%), 67 and 135 patients, respectively. Of the 256 patients with rib fractures, the mean number of fractured ribs per patient was 3. Rib fractures were mostly distributed from the 3rd through to the 8th ribs and the vast majority involved posterior and lateral locations along the rib. Rib fractures had a significant positive association with non-rib thoracic fractures, pulmonary contusion and pleural injuries (P < 0.001). The number of rib fractures and pulmonary contusions were significant factors associated with patient death. CONCLUSION: Earthquake-related crush thoracic trauma has the potential for multiple fractures. The high number of fractured ribs and pulmonary contusions were significant factors which needed appropriate medical treatment. PMID:22132298

  1. Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at evaluating quantitatively earthquake prediction and forecast models through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequence (ETAS) model and the Short-Term Earthquake Probabilities (STEP) model. Here, we report the results from OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence that occurred in the Central Apennines (Italy).
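
    The ensemble-modeling step can be pictured as a weighted average of gridded expected-rate forecasts from the individual models. The sketch below is a generic illustration with made-up 2x2 grids and equal weights, not the CPS implementation or its actual weighting scheme.

```python
import numpy as np

def ensemble_forecast(rate_grids, weights=None):
    """Weighted average of gridded 24-hour expected-rate forecasts from several
    models (e.g. an ETAS-type and a STEP-type model)."""
    rates = np.asarray(rate_grids, dtype=float)
    if weights is None:
        weights = np.full(len(rates), 1.0 / len(rates))   # equal weights by default
    weights = np.asarray(weights, dtype=float)
    return np.tensordot(weights, rates, axes=1)

# Two hypothetical 2x2 rate grids (events per cell per day)
etas_like = [[0.02, 0.05], [0.01, 0.30]]
step_like = [[0.03, 0.04], [0.02, 0.20]]
print(ensemble_forecast([etas_like, step_like]))
```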

  2. The 2008 earthquakes in the Bavarian Molasse Basin - possible relation to deep geothermics?

    NASA Astrophysics Data System (ADS)

    Kraft, T.; Wassermann, J.; Deichmann, N.; Stange, S.

    2009-04-01

    We discuss several microearthquakes of magnitude up to Ml=2.3 that occurred in the Bavarian Molasse Basin (ByM), south of Munich, Germany, in February and July 2008. The strongest event was felt by local residents. The Bavarian earthquake catalog, which dates back to the year 1000, does list a small number of isolated earthquakes in the western part of the ByM as well as a cluster of mining-induced earthquakes (Peißenberg 1962-1970, I0(MSK)=5.5). The eastern part of the ByM, including the wider surroundings of Munich, was so far considered aseismic. Due to the spatio-temporal clustering of the microearthquakes in February and July 2008, the University of Munich (LMU) and the Swiss Seismological Service installed a temporary network of seismological stations south of Munich to investigate the newly arising seismicity. First analysis of the recorded data indicates shallow source depths (~5 km) for the July events. This result is supported by the fact that one of these very small earthquakes was felt by local residents. The earthquake hypocenters are located close to a number of deep geothermal wells of 3-4.5 km depth that were either in production or running productivity tests in late 2007 and early 2008. Therefore, the 2008 seismicity might represent a case of induced seismicity related to the injection or withdrawal of water from the hydrothermal aquifer. Due to the lack of high-quality recordings from a denser seismic monitoring network in the source area, it is not possible to resolve details of the processes behind the 2008 seismicity. Therefore, a definite answer to the question of whether the earthquakes are related to the deep geothermal projects cannot be given at present. However, a number of recent well-studied cases have proved that earthquakes can also happen at depths much shallower than 5 km, and that small changes of the hydrological conditions at depth are sufficient to trigger seismicity. Therefore, a detailed understanding of the causative processes

  3. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    NASA Astrophysics Data System (ADS)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  4. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural-resources exploitation and economic growth versus geomechanical pollution and the risk of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the Earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes with a seismic moment magnitude of 4.5 or larger have been documented worldwide, and the number of such earthquakes has increased rapidly. An example of a human-triggered earthquake is the 1989 Newcastle event in Australia, which was a result of almost 200 years of coal mining and water over-exploitation. This earthquake, an Mw=5.6 event, caused more than 3.5 billion U.S. dollars in damage (1989 value) and was responsible for Australia's first and, to date, only earthquake fatalities. It is therefore argued that the Newcastle region has developed unsustainably when economic growth due to mining is compared with the financial losses from triggered earthquakes. A hazard assessment, based on a geomechanical crust model, shows that only four deep coal mines were responsible for triggering this severe earthquake. A small-scale economic risk assessment identifies that the financial loss due to earthquake damage has reduced mining profits that had been re-invested in the Newcastle region for over two centuries, beginning in 1801. Furthermore, a large-scale economic risk assessment reveals that the financial loss is equivalent to 26% of the Australian Gross Domestic Product (GDP) growth in 1988/89. These costs account for 13% of the total costs of all natural disasters (e.g., flooding, drought, wild fires) and 94% of the costs of all

  5. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. On the basis of the estimated cost of damage, one is most suitable for domestic events; the other, on the basis of estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
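
    The thresholds quoted in the abstract translate directly into a simple alert-level lookup. The function below is an illustrative sketch; combining the two criteria by taking the more severe alert is a simplification of how a PAGER-style system would report them.

```python
def impact_alert(estimated_fatalities=None, estimated_loss_usd=None):
    """Map impact estimates to an alert colour using the fatality and
    economic-loss thresholds quoted in the abstract."""
    order = ["green", "yellow", "orange", "red"]

    def level(value, yellow, orange, red):
        if value is None:
            return "green"
        if value >= red:
            return "red"
        if value >= orange:
            return "orange"
        if value >= yellow:
            return "yellow"
        return "green"

    fatality_alert = level(estimated_fatalities, 1, 100, 1000)
    loss_alert = level(estimated_loss_usd, 1e6, 1e8, 1e9)
    # Report the more severe of the two criteria (a simplification for this sketch).
    return max(fatality_alert, loss_alert, key=order.index)

print(impact_alert(estimated_fatalities=250, estimated_loss_usd=5e7))  # -> 'orange'
```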

  6. A post-Tohoku earthquake review of earthquake probabilities in the Southern Kanto District, Japan

    NASA Astrophysics Data System (ADS)

    Somerville, Paul G.

    2014-12-01

    The 2011 Mw 9.0 Tohoku earthquake generated an aftershock sequence that affected a large part of northern Honshu, and has given rise to widely divergent forecasts of changes in earthquake occurrence probabilities in northern Honshu. The objective of this review is to assess these forecasts as they relate to potential changes in the occurrence probabilities of damaging earthquakes in the Kanto Region. It is generally agreed that the 2011 Mw 9.0 Tohoku earthquake increased the stress on faults in the southern Kanto district. Toda and Stein (Geophys Res Lett 686, 40: doi:10.1002, 2013) further conclude that the probability of earthquakes in the Kanto Corridor has increased by a factor of 2.5 for the time period 11 March 2013 to 10 March 2018. Estimates of earthquake probabilities in a wider region of the Southern Kanto District by Nanjo et al. (Geophys J Int, doi:10.1093, 2013) indicate that any increase in the probability of earthquakes is insignificant in this larger region. Uchida et al. (Earth Planet Sci Lett 374: 81-91, 2013) conclude that the Philippine Sea plate extends well north of the northern margin of Tokyo Bay, inconsistent with the Kanto Fragment hypothesis of Toda et al. (Nat Geosci, 1:1-6, 2008), which attributes deep earthquakes in this region, which they term the Kanto Corridor, to a broken fragment of the Pacific plate. The results of Uchida and Matsuzawa (J Geophys Res 115:B07309, 2013) support the conclusion that fault creep in southern Kanto may be slowly relaxing the stress increase caused by the Tohoku earthquake without causing more large earthquakes. Stress transfer calculations indicate a large stress transfer to the Off Boso Segment as a result of the 2011 Tohoku earthquake. However, Ozawa et al. (J Geophys Res 117:B07404, 2012) used onshore GPS measurements to infer large post-Tohoku creep on the plate interface in the Off-Boso region, and Uchida and Matsuzawa (ibid.) measured similar large creep off the Boso

  7. After Fukushima: managing the consequences of a radiological release.

    PubMed

    Fitzgerald, Joe; Wollner, Samuel B; Adalja, Amesh A; Morhard, Ryan; Cicero, Anita; Inglesby, Thomas V

    2012-06-01

    Even amidst the devastation following the earthquake and tsunami in Japan that killed more than 20,000 people, it was the accident at the Fukushima Daiichi nuclear power plant that led the country's prime minister, Naoto Kan, to fear for "the very existence of the Japanese nation." While accidents that result in mass radiological releases have been rare throughout the operating histories of existing nuclear power plants, the growing number of plants worldwide increases the likelihood that such releases will occur again in the future. Nuclear power is an important source of energy in the U.S. and will be for the foreseeable future. Accidents far smaller in scale than the one in Fukushima could have major societal consequences. Given the extensive, ongoing Nuclear Regulatory Commission (NRC) and industry assessment of nuclear power plant safety and preparedness issues, the Center for Biosecurity of UPMC focused on offsite policies and plans intended to reduce radiation exposure to the public in the aftermath of an accident. This report provides an assessment of Japan's efforts at nuclear consequence management; identifies concerns with current U.S. policies and practices for "outside the fence" management of such an event in the U.S.; and makes recommendations for steps that can be taken to strengthen U.S. government, industry, and community response to large-scale accidents at nuclear power plants.

  8. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should
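
    The 'low-probability environment' point can be made concrete with a one-line calculation: even a large probability gain applied to a small background probability yields only a few percent. The numbers below are illustrative assumptions, not values from any specific forecast.

```python
def short_term_probability(long_term_weekly_prob, probability_gain):
    """Short-term probability implied by a nominal probability gain over the
    long-term (background) weekly probability, capped at 1."""
    return min(1.0, long_term_weekly_prob * probability_gain)

# e.g. a background weekly probability of 1e-4 raised by a gain of 100
print(short_term_probability(1e-4, 100))   # 0.01, i.e. about 1%
```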

  9. Potentially induced earthquakes during the early twentieth century in the Los Angeles Basin

    USGS Publications Warehouse

    Hough, Susan E.; Page, Morgan T.

    2016-01-01

    Recent studies have presented evidence that early to mid‐twentieth‐century earthquakes in Oklahoma and Texas were likely induced by fossil fuel production and/or injection of wastewater (Hough and Page, 2015; Frohlich et al., 2016). Considering seismicity from 1935 onward, Hauksson et al. (2015) concluded that there is no evidence for significant induced activity in the greater Los Angeles region between 1935 and the present. To explore a possible association between earthquakes prior to 1935 and oil and gas production, we first revisit the historical catalog and then review contemporary oil industry activities. Although early industry activities did not induce large numbers of earthquakes, we present evidence for an association between the initial oil boom in the greater Los Angeles area and earthquakes between 1915 and 1932, including the damaging 22 June 1920 Inglewood and 8 July 1929 Whittier earthquakes. We further consider whether the 1933 Mw 6.4 Long Beach earthquake might have been induced, and show some evidence that points to a causative relationship between the earthquake and activities in the Huntington Beach oil field. The hypothesis that the Long Beach earthquake was either induced or triggered by a foreshock cannot be ruled out. Our results suggest that significant earthquakes in southern California during the early twentieth century might have been associated with industry practices that are no longer employed (i.e., production without water reinjection), and do not necessarily imply a high likelihood of induced earthquakes at the present time.

  10. Mass aeromedical evacuation of patients in an emergency: experience following the 2010 Yushu earthquake.

    PubMed

    Liu, Xu; Liu, Yuan; Zhang, Lulu; Liang, Wannian; Zhu, Zenghong; Shen, Yan; Kang, Peng; Liu, Zhipeng

    2013-12-01

    On April 14, 2010, a catastrophic earthquake hit Yushu, China, causing 2,698 deaths and 12,135 injuries. A large number of patients were evacuated by air to hospitals in unaffected areas for specialty treatment. The aim of this study was to investigate the overall process and details of patients' aeromedical evacuation (AE) after the Yushu earthquake. The study was an observational, retrospective investigation conducted in December 2010 in Qinghai province. Information was gathered from Yushu Batang airport, the Ministry of Health, the Health Department of Qinghai Province, and rear echelon hospitals in five provinces. A total of 2796 patients were evacuated by 152 separate flights from Yushu. The number of AE patients reached a peak (55.8%) within 72 h after the earthquake. Of the total 2796 patients, 2533 were admitted to rear echelon hospitals. This number included 2111 (83.3%) with earthquake-related trauma, 422 (16.7%) with non-traumatic diseases, and 166 (6.6%) with acute mountain sickness. No accident or medical error was reported during the evacuation process. The aircraft used for AE included IL-76 transport aircraft from the Air Force, Airbus A-319s from civil aviation, and MI-17 helicopters from Army aviation. According to our investigation, the need for professional AE training was great (83.7%). In addition, almost all participants (99.3%) agreed that the aircraft needed to be improved for the purpose of AE. Aeromedical evacuation of a large number of patients after major disasters in remote areas can be done safely and effectively; however, problems such as a lack of suitable AE aircraft and medical equipment, as well as insufficient professional medical training in AE, were revealed after the Yushu earthquake. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Case studies of Induced Earthquakes in Ohio for 2016 and 2017

    NASA Astrophysics Data System (ADS)

    Friberg, P. A.; Brudzinski, M.; Kozlowska, M.; Loughner, E.; Langenkamp, T.; Dricker, I.

    2017-12-01

    Over the last four years, unconventional oil and gas production activity in the Utica shale play in Ohio has induced over 20 earthquake sequences (Friberg et al., 2014; Skoumal et al., 2016; Friberg et al., 2016; Kozlowska et al., in submission), including a few new ones in 2017. The majority of the induced events have been attributed to optimally oriented faults located in crystalline basement rocks, which are closer to the Utica formation than to the Marcellus shale, a shallower formation more typically targeted in Pennsylvania and West Virginia. A number of earthquake sequences in 2016 and 2017 are examined using multi-station cross-correlation template matching techniques. We examine the Gutenberg-Richter b-values and, where possible, the b-value evolution of the earthquake sequences to help determine the seismogenesis of the events. Refined earthquake locations are determined with HypoDD using data from stations operated by the USGS, IRIS, ODNR, Miami University, and PASEIS.
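
    For readers unfamiliar with the matched-filter approach mentioned above, the following single-station sketch shows the core operation (normalized cross-correlation of a template against continuous data, with a MAD-based detection threshold). It is a simplified stand-in, not the authors' multi-station workflow, and the synthetic data, window length, and threshold are assumptions:

        import numpy as np

        def normalized_xcorr(template, data):
            """Sliding normalized cross-correlation (Pearson r) of template against data."""
            n = len(template)
            t = (template - template.mean()) / (template.std() * n)
            cc = np.empty(len(data) - n + 1)
            for i in range(len(cc)):
                w = data[i:i + n]
                s = w.std()
                cc[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
            return cc

        rng = np.random.default_rng(0)
        data = rng.normal(size=20_000)            # synthetic continuous record
        template = rng.normal(size=200)           # synthetic template event
        data[5_000:5_200] += 3.0 * template       # bury a scaled copy of the template in the noise

        cc = normalized_xcorr(template, data)
        mad = np.median(np.abs(cc - np.median(cc)))
        detections = np.where(cc > 9 * mad)[0]    # common rule of thumb: ~8-10 x MAD
        print(detections)                         # the planted copy at index 5000 should be flagged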

  12. Spatial organization of foreshocks as a tool to forecast large earthquakes.

    PubMed

    Lippiello, E; Marzocchi, W; de Arcangelis, L; Godano, C

    2012-01-01

    An increase in the number of smaller-magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from that induced by aftershocks, and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the last 19 years of the Southern California catalog shows that the daily occurrence probability presents isolated peaks located close in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04° × 0.04°), with significant probability gains with respect to standard models.

  13. Spatial organization of foreshocks as a tool to forecast large earthquakes

    PubMed Central

    Lippiello, E.; Marzocchi, W.; de Arcangelis, L.; Godano, C.

    2012-01-01

    An increase in the number of smaller-magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from that induced by aftershocks, and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the last 19 years of the Southern California catalog shows that the daily occurrence probability presents isolated peaks located close in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04° × 0.04°), with significant probability gains with respect to standard models. PMID:23152938

  14. Shallow megathrust earthquake ruptures betrayed by their outer-trench aftershocks signature

    NASA Astrophysics Data System (ADS)

    Sladen, Anthony; Trevisan, Jenny

    2018-02-01

    For some megathrust earthquakes, the rupture extends to the solid Earth's surface, at the ocean floor. This unexpected behaviour holds strong implications for the tsunami potential of subduction zones and for the physical conditions governing earthquakes, but such ruptures occur in underwater areas which are hard to observe, even with current instrumentation and imaging techniques. Here, we show that aftershocks occurring oceanward of the trench are conditioned by near-surface rupture of the megathrust fault. Comparison to well-constrained earthquake slip models further reveals that, for each event, the number of aftershocks is proportional to the amount of shallow slip, a link likely related to static stress transfer. Hence, the spatial distribution of these specific aftershock sequences could provide independent constraints on the coseismic shallow slip of future events. It also offers the prospect of reassessing the ruptures of many large subduction earthquakes back to the beginning of the instrumental era.

  15. Source discrimination between Mining blasts and Earthquakes in Tianshan orogenic belt, NW China

    NASA Astrophysics Data System (ADS)

    Tang, L.; Zhang, M.; Wen, L.

    2017-12-01

    In recent years, a large number of quarry blasts have been detonated in the Tianshan Mountains of China. It is necessary to discriminate these non-earthquake records from the earthquake catalogs in order to determine the real seismicity of the region. In this study, we investigated spectral ratios and amplitude ratios as discriminants for regional seismic-event identification using explosions and earthquakes recorded by the Xinjiang Seismic Network (XJSN) of China. We used a training data set of 1071 earthquakes and 2881 non-earthquakes recorded by the XJSN between 2009 and 2016, with both types of events in a comparable local magnitude range (1.5 to 2.9). The non-earthquake and earthquake groups were well separated by Pg/Sg amplitude ratios, with the separation increasing with frequency when averaged over three stations. The 8- to 15-Hz Pg/Sg ratio proved to be the most precise and accurate discriminant, correctly classifying more than 90% of the events. In contrast, the P spectral ratio performed considerably worse, with a significant overlap (about 60%) between the earthquake and explosion populations. The comparison shows that amplitude ratios between compressional and shear waves discriminate better than low-frequency to high-frequency spectral ratios for individual phases. Neither discriminant was able to completely separate the two populations of events. However, a joint discrimination scheme employing simple majority voting reduces misclassifications to 10%. In the study region, 44% of the examined seismic events were determined to be non-earthquakes and 55% to be earthquakes. The earthquakes occurring on land are related to small faults, while the blasts are concentrated in large quarries.
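
    A minimal sketch of a Pg/Sg amplitude-ratio discriminant of the kind described above (not the authors' implementation; the frequency band, window times, decision threshold, and synthetic trace are assumptions):

        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        def pg_sg_ratio(trace, fs, pg_window, sg_window, band=(8.0, 15.0)):
            """Peak Pg/Sg amplitude ratio in a frequency band; windows are (start_s, end_s)."""
            sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
            x = sosfiltfilt(sos, trace)
            def peak(w):
                i0, i1 = int(w[0] * fs), int(w[1] * fs)
                return np.max(np.abs(x[i0:i1]))
            return peak(pg_window) / peak(sg_window)

        # Synthetic stand-in for a real record; on real data the windows would be set
        # from picked Pg and Sg arrival times.
        fs = 100.0
        rng = np.random.default_rng(1)
        trace = rng.normal(size=int(120 * fs))
        ratio = pg_sg_ratio(trace, fs, pg_window=(20.0, 30.0), sg_window=(40.0, 55.0))
        # Explosions tend to have relatively higher P/S ratios than earthquakes; the
        # threshold (arbitrarily 1.0 here) would be calibrated on labeled training events.
        label = "explosion-like" if ratio > 1.0 else "earthquake-like"
        print(f"8-15 Hz Pg/Sg = {ratio:.2f} -> {label}")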

  16. Continuing Megathrust Earthquake Potential in northern Chile after the 2014 Iquique Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Hayes, G. P.; Herman, M. W.; Barnhart, W. D.; Furlong, K. P.; Riquelme, S.; Benz, H.; Bergman, E.; Barrientos, S. E.; Earle, P. S.; Samsonov, S. V.

    2014-12-01

    The seismic gap theory, which identifies regions of elevated hazard based on a lack of recent seismicity in comparison to other portions of a fault, has successfully explained past earthquakes and is useful for qualitatively describing where future large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which until recently had not ruptured in a megathrust earthquake since an M~8.8 event in 1877. On April 1, 2014, an M 8.2 earthquake occurred within this northern Chile seismic gap, offshore of the city of Iquique; the size and spatial extent of the rupture indicate it was not the earthquake that had been anticipated. Here, we present a rapid assessment of the seismotectonics of the March-April 2014 seismic sequence offshore northern Chile, including analyses of earthquake (fore- and aftershock) relocations, moment tensors, finite fault models, moment deficit calculations, and cumulative Coulomb stress transfer calculations over the duration of the sequence. This ensemble of information allows us to place the current sequence within the context of historic seismicity in the region, and to assess areas of remaining and/or elevated hazard. Our results indicate that while accumulated strain has been released for a portion of the northern Chile seismic gap, significant sections have not ruptured in almost 150 years. These observations suggest that large-to-great sized megathrust earthquakes will occur north and south of the 2014 Iquique sequence sooner than might be expected had the 2014 events ruptured the entire seismic gap.

  17. Unusually large earthquakes inferred from tsunami deposits along the Kuril trench

    USGS Publications Warehouse

    Nanayama, F.; Satake, K.; Furukawa, R.; Shimokawa, K.; Atwater, B.F.; Shigeno, K.; Yamaki, S.

    2003-01-01

    The Pacific plate converges with northeastern Eurasia at a rate of 8-9 m per century along the Kamchatka, Kuril and Japan trenches. Along the southern Kuril trench, which faces the Japanese island of Hokkaido, this fast subduction has recurrently generated earthquakes with magnitudes of up to ~8 over the past two centuries. These historical events, on rupture segments 100-200 km long, have been considered characteristic of Hokkaido's plate-boundary earthquakes. But here we use deposits of prehistoric tsunamis to infer the infrequent occurrence of larger earthquakes generated from longer ruptures. Many of these tsunami deposits form sheets of sand that extend kilometres inland from the deposits of historical tsunamis. Stratigraphic series of extensive sand sheets, intercalated with dated volcanic-ash layers, show that such unusually large tsunamis occurred about every 500 years on average over the past 2,000-7,000 years, most recently ~350 years ago. Numerical simulations of these tsunamis are best explained by earthquakes that individually rupture multiple segments along the southern Kuril trench. We infer that such multi-segment earthquakes persistently recur among a larger number of single-segment events.

  18. Long-Period Ground Motion due to Near-Shear Earthquake Ruptures

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Yokota, Y.; Hikima, K.

    2010-12-01

    Long-period ground motion has become an increasingly important consideration because of the recent rapid increase in the number of large-scale structures, such as high-rise buildings and large oil storage tanks. Large subduction-zone earthquakes and moderate to large crustal earthquakes can generate far-source long-period ground motions in distant sedimentary basins with the help of path effects. Near-fault long-period ground motions are generated, for the most part, by the source effects of forward rupture directivity (Koketsu and Miyake, 2008). This rupture directivity effect is maximized in the direction of fault rupture when the rupture velocity is nearly equal to the shear-wave velocity around the source fault (Dunham and Archuleta, 2005). Near-shear rupture was found to occur during the 2008 Mw 7.9 Wenchuan earthquake at the eastern edge of the Tibetan plateau (Koketsu et al., 2010). The variance of waveform residuals in a joint inversion of teleseismic and strong-motion data was minimized when we adopted a rupture velocity of 2.8 km/s, which is close to the shear-wave velocity of 2.6 km/s around the hypocenter. We also found near-shear rupture during the 2010 Mw 6.9 Yushu earthquake (Yokota et al., 2010). The optimum rupture velocity for an inversion of teleseismic data is 3.5 km/s, which is almost equal to the shear-wave velocity around the hypocenter. Since, in addition, supershear rupture was found during the 2001 Mw 7.8 Central Kunlun earthquake (Bouchon and Vallee, 2003), such fast earthquake rupture may be characteristic of the eastern Tibetan plateau. Huge damage in Yingxiu and Beichuan from the 2008 Wenchuan earthquake, and damage heavier than expected in the county seat of Yushu from the medium-sized Yushu earthquake, can be attributed to the maximum rupture directivity effect in the rupture direction due to near-shear earthquake ruptures.
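
    The amplification invoked above can be summarized by the classical unilateral-rupture directivity factor, a textbook result that is not stated in the abstract itself:

        C_d(\theta) \;=\; \frac{1}{1 - (v_r/\beta)\cos\theta}

    where v_r is the rupture velocity, \beta the local shear-wave speed, and \theta the angle from the rupture direction; C_d becomes very large in the forward direction (\theta \to 0) as v_r approaches \beta, which is why ruptures propagating at nearly the shear-wave speed focus unusually strong long-period motion along strike.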

  19. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India, on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10^-9/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. All these observations together may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than stresses accumulated by plate-tectonic loading. The latter model generally underlies basic assumptions made in earthquake hazard assessment, that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model therefore may require re-examining the basic assumptions of hazard assessment.

  20. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur over a large area around the epicenters. The directions of the gravity-change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect an increase in crustal stress, as well as significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes over larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the areas where the gravity changes are greatest. (3) The spatial-temporal gravity changes are very useful in determining the epicenters of coming large earthquakes. Large-scale gravity networks are useful for determining the general areas of coming large earthquakes, whereas local gravity networks with high spatial-temporal resolution are better suited to determining the locations of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the locations of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  1. Four Examples of Short-Term and Imminent Prediction of Earthquakes

    NASA Astrophysics Data System (ADS)

    Zeng, Zuoxun; Liu, Genshen; Wu, Dabin; Sibgatulin, Victor

    2014-05-01

    channel. In this way, the Minxian earthquake is successfully predicted. Short-Term and Medium-Term Prediction of the Nantou Earthquake (Ms6.7): According to the analysis of solar activity taking the sunspot number as the main factor, the two-time method of magnetic storms related to lunar phase, the analysis of earthquake clouds, and the intersection points of magnetic anomalies, we predicted that an earthquake with a magnitude of ML6.2 ± 0.4 would occur in Taiwan (24 ± 0.2°N, 121 ± 0.2°E) before mid-June 2013. On May 27th, 2013, Jianwen Huang raised the alarm that stresses were concentrating in Nantou county of Taiwan. On May 29th, he further raised the special alarm that intensive stresses were continuously concentrating. At 12:34 on June 2nd, 2013, an Ms6.7 (ML6.3) earthquake occurred in Nantou, Taiwan, with its epicenter at (23.87°N, 121.00°E). Short-Term Prediction of the Dujiangyan Earthquake (Ms4.1): At 17:16 on June 3rd, 2013, on the basis of a comprehensive analysis of the correspondence between cloud and ground observations by Dabing Wu, the strip degasification along the northern part of the Longmenshan fracture zone, and the satellite gravity anomalies of the area, Zuoxun Zeng predicted that the epicenter would be at (31°N, 104°E), the magnitude would be Ms5.5 ± 0.5, and the earthquake would occur within 2 months. At 7:39 on July 8th, 2013, an earthquake of magnitude Ms4.1 occurred at the border between Dujiangyan city and Wenchuan county (31.3°N, 103.6°E). We refer to it as the Dujiangyan earthquake in this article.

  2. Some more earthquakes from medieval Kashmir

    NASA Astrophysics Data System (ADS)

    Ahmad, Bashir; Shafi, Muzamil

    2014-07-01

    Kashmir has the peculiarity of a written history spanning almost 5,000 years. However, the description of earthquakes in the archival sources is patchy prior to 1500 a.d. Moreover, recent searches show that there are certain time gaps in the catalogs presently in use, especially for the medieval period (1128-1586 a.d.). The presence of different ruling elites, in association with socioeconomic and political conditions, has in many ways confused the historical context of the medieval sources. However, by a meticulous review of the Sanskrit sources (between the twelfth and sixteenth centuries), it has been possible to identify a fair number (eight) of previously unspecified seismic events that do not appear in the published catalogs of Kashmir or whose dates have been very difficult to establish. Moreover, the historical sources reveal that, except for events which occurred during Sultan Sikander's rule (1389-1413) and during the reign of King Zain-ul-Abidin (1420-1470), all the rediscovered seismic events fell into oblivion, mainly because the available sources devoted their attention to military events, which tended to overshadow and even conceal natural events like earthquakes; the resulting accounts are fragmentary and of little value for the macroseismic intensity evaluation needed for more reliable seismic hazard assessment.

  3. Unrevealing the History of Earthquakes and Tsunamis of the Mexican Subduction Zone

    NASA Astrophysics Data System (ADS)

    Ramirez-Herrera, M. T.; Castillo-Aja, M. D. R.; Cruz, S.; Corona, N.; Rangel Velarde, V.; Lagos, M.

    2014-12-01

    The great earthquakes and tsunamis of recent decades in Sumatra, Chile, and Japan remind us of the need to expand the historical record of such catastrophic events. Even countries with extensive historical documents and tsunami sand deposits still have unsolved questions about the frequency of these events and the variables that control them along subduction zones. We present here preliminary results of a combined approach using historical archives and multiple proxies of the sedimentary record to unravel the history of possible great earthquakes and their tsunamis on the Mexican subduction zone. The Mexican subduction zone extends for over 1000 km, and little is known about whether the entire subduction zone along the Middle American Trench behaves as one enormous unit or ruptures in segments at different frequencies and with different strengths (as the short instrumental record suggests). We searched historical archives and earthquake databases to distinguish tsunamigenic events registered from the 16th century to now along the Jalisco-Colima and Guerrero-Oaxaca coastal stretches. The historical data are mostly from the 19th century onward, since the coastal population was sparse before then. We found 21 earthquakes with tsunamigenic potential on the Jalisco-Colima coast, of which 16 had a doubtful to definitive accompanying tsunami, and 31 tsunamigenic earthquakes on the Oaxaca-Guerrero coast. Evidence of great earthquakes and their tsunamis in the sedimentary record is scarce, perhaps due to poor preservation of tsunami deposits in this tropical environment. Nevertheless, we have found evidence for a number of tsunamigenic events, both historical and prehistorical: 1932 AD and 1400 AD in Jalisco, and 3400 BP, 1789 AD, 1979 AD, and 1985 AD in Guerrero-Oaxaca. Work continues, and a number of events remain to be dated. This work will aid in elucidating the history of earthquakes and tsunamis on the Mexican subduction zone.

  4. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    NASA Astrophysics Data System (ADS)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large-magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), the 1934 Bihar-Nepal earthquake (Mw 8.2), the 1950 Assam earthquake (Mw 8.4), the 2005 Kashmir earthquake (Mw 7.6), and the 2015 Gorkha earthquake (Mw 7.8), is testimony to the ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large-magnitude earthquakes produced surface rupture, while others remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is difficult to precisely determine the extent of the surface ruptures of these earthquakes, and of those that occurred during historic times. In this paper, we have compiled the paleoseismological data and recalibrated the radiocarbon ages from trenches excavated by previous workers along the entire Himalaya, and compared the resulting earthquake scenario with earlier interpretations. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting magnitudes to Mw 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. The large-magnitude Himalayan earthquakes of 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam all occurred within a time frame of 45 years. If such events were dated paleoseismically, with uncertainties of the order of ±50 years, they could be mistaken for the remnants of one giant earthquake rupturing the entire Himalayan arc, thereby leading to an overestimation of the seismic hazard in the Himalaya.

  5. On the feedback between forearc morphotectonics and megathrust earthquakes in subduction zones

    NASA Astrophysics Data System (ADS)

    Rosenau, M.; Oncken, O.

    2008-12-01

    An increasing number of observations suggest an intrinsic relationship between short- and long-term deformation processes in subduction zones. These include the global correlation of megathrust earthquake slip patterns with morphotectonic forearc features, the historical predominance of giant earthquakes (M > 9) along accretionary margins and the occurrence of (slow and shallow) tsunami earthquakes along erosive margins. To gain insight into the interplay between seismogenesis and tectonics in subduction settings we have developed a new modeling technique which joins analog and elastic dislocation approaches. Using elastoplastic wedges overlying a rate- and state-dependent interface, we demonstrate how analog earthquakes drive permanent wedge deformation consistent with the dynamic Coulomb wedge theory and how wedge deformation in turn controls basal "seismicity". During an experimental run, elastoplastic wedges evolve from those comparable to accretionary margins, characterized by plastic wedge shortening, to those mimicking erosive margins, characterized by minor plastic deformation. Permanent shortening localizes at the periphery of the "seismogenic" zone leading to a "morphotectonic" segmentation of the upper plate. Along with the evolving segmentation of the wedge, the magnitude-frequency relationship and recurrence distribution of analog earthquakes develop towards more periodic events of similar size (i.e. characteristic earthquakes). From the experiments we infer a positive feedback between short- and long-term deformation processes which tends to stabilize the spatiotemporal patterns of elastoplastic deformation in subduction settings. We suggest (1) that forearc anatomy reflects the distribution of seismic and aseismic slip at depth, (2) that morphotectonic segmentation assists the occurrence of more characteristic earthquakes, (3) that postseismic near-trench shortening relaxes coseismic compression by megathrust earthquakes and thus reduces

  6. Earthquakes triggered by silent slip events on Kīlauea volcano, Hawaii

    USGS Publications Warehouse

    Segall, Paul; Desmarais, Emily K.; Shelly, David; Miklius, Asta; Cervelli, Peter F.

    2006-01-01

    Slow-slip events, or ‘silent earthquakes’, have recently been discovered in a number of subduction zones including the Nankai trough in Japan, Cascadia, and Guerrero in Mexico, but the depths of these events have been difficult to determine from surface deformation measurements. Although it is assumed that these silent earthquakes are located along the plate megathrust, this has not been proved. Slow slip in some subduction zones is associated with non-volcanic tremor, but tremor is difficult to locate and may be distributed over a broad depth range. Except for some events on the San Andreas fault, slow-slip events have not yet been associated with high-frequency earthquakes, which are easily located. Here we report on swarms of high-frequency earthquakes that accompany otherwise silent slips on Kīlauea volcano, Hawaii. For the most energetic event, in January 2005, the slow slip began before the increase in seismicity. The temporal evolution of earthquakes is well explained by increased stressing caused by slow slip, implying that the earthquakes are triggered. The earthquakes, located at depths of 7–8 km, constrain the slow slip to be at comparable depths, because they must fall in zones of positive Coulomb stress change. Triggered earthquakes accompanying slow-slip events elsewhere might go undetected if background seismicity rates are low. Detection of such events would help constrain the depth of slow slip, and could lead to a method for quantifying the increased hazard during slow-slip events, because triggered events have the potential to grow into destructive earthquakes.

  7. Laboratory constraints on models of earthquake recurrence

    NASA Astrophysics Data System (ADS)

    Beeler, N. M.; Tullis, Terry; Junger, Jenni; Kilgore, Brian; Goldsby, David

    2014-12-01

    In this study, rock friction "stick-slip" experiments are used to develop constraints on models of earthquake recurrence. Constant rate loading of bare rock surfaces in high-quality experiments produces stick-slip recurrence that is periodic at least to second order. When the loading rate is varied, recurrence is approximately inversely proportional to loading rate. These laboratory events initiate due to a slip-rate-dependent process that also determines the size of the stress drop and, as a consequence, stress drop varies weakly but systematically with loading rate. This is especially evident in experiments where the loading rate is changed by orders of magnitude, as is thought to be the loading condition of naturally occurring, small repeating earthquakes driven by afterslip, or low-frequency earthquakes loaded by episodic slip. The experimentally observed stress drops are well described by a logarithmic dependence on recurrence interval that can be cast as a nonlinear slip predictable model. The fault's rate dependence of strength is the key physical parameter. Additionally, even at constant loading rate the most reproducible laboratory recurrence is not exactly periodic, unlike existing friction recurrence models. We present example laboratory catalogs that document the variance and show that in large catalogs, even at constant loading rate, stress drop and recurrence covary systematically. The origin of this covariance is largely consistent with variability of the dependence of fault strength on slip rate. Laboratory catalogs show aspects of both slip and time predictability, and successive stress drops are strongly correlated indicating a "memory" of prior slip history that extends over at least one recurrence cycle.
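
    One compact way to write the logarithmic dependence described above (this particular parametrization is an assumption for illustration, not taken from the paper):

        \Delta\tau(t_r) \;\approx\; \Delta\tau_0 + A\,\ln\!\left(\frac{t_r}{t_0}\right)

    where t_r is the recurrence interval, \Delta\tau_0 is the stress drop at a reference recurrence time t_0, and A reflects the fault's rate dependence of strength; in this form, slower loading (longer recurrence) gives a modestly larger stress drop, which is the nonlinear slip-predictable behavior the authors describe.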

  8. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  9. Intermediate-term earthquake prediction

    USGS Publications Warehouse

    Knopoff, L.

    1990-01-01

    The problems in predicting earthquakes have been attacked by phenomenological methods from pre-historic times to the present. The associations of presumed precursors with large earthquakes often have been remarked upon. The difficulty in identifying whether such correlations are due to some chance coincidence or are real precursors is that usually one notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is to be found in the rather long intervals that follow large earthquakes, or in fact is absent in these post-earthquake intervals. If there are enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes

  10. Earthquakes on Your Dinner Table

    NASA Astrophysics Data System (ADS)

    Alexeev, N. A.; Tape, C.; Alexeev, V. A.

    2016-12-01

    Earthquakes have interesting physics applicable to other phenomena, like the propagation of waves; they also affect human lives. This study focused on three questions: how depth, distance from the epicenter, and ground hardness affect earthquake strength. The experimental setup consisted of a gelatin slab simulating the crust. The slab was hit with a weight and the earthquake amplitude was measured. It was found that earthquake amplitude was larger when the epicenter was deeper, which contradicts observations and probably was an artifact of the design. Earthquake strength was inversely proportional to the distance from the epicenter, which generally follows reality. Soft and medium jello were implanted into hard jello. It was found that earthquakes are stronger in softer jello, which was a result of resonant amplification in soft ground. Similar results are found in Minto Flats, where earthquakes are stronger and last longer than in the nearby hills. Earthquake waveforms from Minto Flats showed that the oscillations there have longer periods compared to the nearby hills with harder soil. Two gelatin pieces with identical shapes and different hardness were vibrated on a platform at varying frequencies in order to demonstrate that their resonant frequencies are statistically different. This phenomenon also occurs in Yukon Flats.

  11. Theatre as Therapy, Therapy as Theatre Transforming the Memories and Trauma of the 21 September 1999 Earthquake in Taiwan

    ERIC Educational Resources Information Center

    Chang, Ivy I-Chu

    2005-01-01

    On 21 September 1999, a 7.3 magnitude earthquake in Taiwan destroyed more than 100,000 houses, causing 2,294 deaths and 8,737 injuries. In the aftermath of the earthquake, a great number of social workers and cultural workers were thrust into Nantou County and Taichung County of central Taiwan, the epicentre of the earthquake, to assist the…

  12. Damaging earthquakes: A scientific laboratory

    USGS Publications Warehouse

    Hays, Walter W.; ,

    1996-01-01

    This paper reviews the principal lessons learned from multidisciplinary postearthquake investigations of damaging earthquakes throughout the world during the past 15 years. The unique laboratory provided by a damaging earthquake in culturally different but tectonically similar regions of the world has increased fundamental understanding of earthquake processes, added perishable scientific, technical, and socioeconomic data to the knowledge base, and led to changes in public policies and professional practices for earthquake loss reduction.

  13. Seismic Regime in the Vicinity of the 2011 Tohoku Mega Earthquake (Japan, Mw = 9)

    NASA Astrophysics Data System (ADS)

    Rodkin, M. V.; Tikhonov, I. N.

    2014-12-01

    The 2011 Tohoku mega earthquake (Mw = 9) is unique due to a combination of its large magnitude and the high level of detail of the regional seismic data. The authors analyzed the seismic regime in the vicinity of this event using data from the Japan Meteorological Agency catalog and world databases. It was shown that a regional decrease in the b-value and in the number of main shocks took place during the 6-7 years prior to the Tohoku mega earthquake. The space-time area of these changes coincided with the development of precursor effects in this area, as revealed by Lyubushin (Geofiz Prots Biosfera 10:9-35, 2011) from the analysis of microseisms recorded by the broadband seismic network F-net in Japan. The combination of episodes of growth in the number of earthquakes, accompanied by a corresponding decrease in the b-value and in the average depth of the earthquakes, was observed for the foreshock and aftershock sequences of the 2011 Tohoku earthquake. Some of these anomalies were similar to those observed (also post factum) by Katsumata (Earth Planets Space 63:709-712, 2011), Nanjo et al. (Geophys Res Lett 39, 2012), and Huang and Ding (Bull Seismol Soc Am 102:1878-1883, 2012), whereas others had not been described before. The correlation of the periods of growth in seismic activity with the decrease in the average depth of earthquakes can be explained by increased fluid activity and the tendency of low-density fluids to penetrate the upper horizons of the lithosphere. The unexpectedly strong Tohoku mega earthquake, with a rather small rupture area, caused an unexpectedly high tsunami wave. From this it seems plausible that M9+ earthquakes with large tsunamis could occur in other subduction zones where such events were previously suggested to be impossible.

  14. Sensing the earthquake

    NASA Astrophysics Data System (ADS)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results in a choreographic model with the aim of converting earthquake sound into a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and the perception of movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  15. Discrepancy between earthquake rates implied by historic earthquakes and a consensus geologic source model for California

    USGS Publications Warehouse

    Petersen, M.D.; Cramer, C.H.; Reichle, M.S.; Frankel, A.D.; Hanks, T.C.

    2000-01-01

    We examine the difference between the expected earthquake rates inferred from the historical earthquake catalog and from the geologic data that were used to develop the consensus seismic source characterization for the state of California [California Department of Conservation, Division of Mines and Geology (CDMG) and U.S. Geological Survey (USGS); Petersen et al., 1996; Frankel et al., 1996]. On average, the historic earthquake catalog and the seismic source model both indicate about one M 6 or greater earthquake per year in the state of California. However, the rates of earthquakes with magnitudes (M) between 6 and 7 in this seismic source model are higher, by at least a factor of 2, than the mean historic earthquake rates for both southern and northern California. The earthquake rate discrepancy results from a seismic source model that includes earthquakes with characteristic (maximum) magnitudes that are primarily between M 6.4 and 7.1. Many of these faults are interpreted to accommodate high strain rates from geologic and geodetic data but have not ruptured in large earthquakes during historic time. Our sensitivity study indicates that the rate differences between magnitudes 6 and 7 can be reduced by adjusting the magnitude-frequency distribution of the source model to reflect more characteristic behavior, by decreasing the moment rate available for seismogenic slip along faults, by increasing the maximum magnitude of the earthquake on a fault, or by decreasing the maximum magnitude of the background seismicity. However, no single parameter can be adjusted, consistent with scientific consensus, to eliminate the earthquake rate discrepancy. Applying a combination of these parametric adjustments yields an alternative earthquake source model that is more compatible with the historic data. The 475-year return period hazard for peak ground and 1-sec spectral acceleration resulting from this alternative source model differs from the hazard resulting from the
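
    The kind of budget comparison described above can be sketched as follows; the fault geometry, slip rate, characteristic magnitude, and catalog rate are illustrative assumptions, not the CDMG/USGS model values:

        MU = 3.0e10                              # assumed shear modulus, Pa

        def magnitude_to_moment(mw):
            """Hanks & Kanamori (1979): seismic moment in N*m."""
            return 10 ** (1.5 * mw + 9.05)

        def characteristic_rate(fault_area_m2, slip_rate_m_per_yr, m_char):
            """Events per year if all seismic moment is released in magnitude m_char earthquakes."""
            moment_rate = MU * fault_area_m2 * slip_rate_m_per_yr   # N*m per year
            return moment_rate / magnitude_to_moment(m_char)

        # Hypothetical fault: 100 km x 15 km, 5 mm/yr of seismogenic slip, M 6.8 characteristic events.
        rate_model = characteristic_rate(100e3 * 15e3, 5e-3, 6.8)
        rate_catalog = 1.0 / 400.0               # assumed historical rate: one such event per ~400 yr
        print(f"model: one per {1 / rate_model:.0f} yr, catalog: one per {1 / rate_catalog:.0f} yr")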

  16. GPS Technologies as a Tool to Detect the Pre-Earthquake Signals Associated with Strong Earthquakes

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.; Krankowski, A.; Hernandez-Pajares, M.; Liu, J. Y. G.; Hattori, K.; Davidenko, D.; Ouzounov, D.

    2015-12-01

    The existence of ionospheric anomalies before earthquakes is now widely accepted. These phenomena have started to be considered by the GPS community as a means to mitigate GPS signal degradation over territories of earthquake preparation. The question is still open as to whether they could be useful for seismology and for short-term earthquake forecasting. More than a decade of intensive studies has shown that the ionospheric anomalies registered before earthquakes are initiated by processes in the boundary layer of the atmosphere over the earthquake preparation zone and are induced in the ionosphere by electromagnetic coupling through the Global Electric Circuit. A multiparameter approach based on the Lithosphere-Atmosphere-Ionosphere Coupling model demonstrated that earthquake forecasting is possible only if we consider the final stage of earthquake preparation in a multidimensional space where every dimension is one of many precursors in an ensemble, and they are synergistically connected. We demonstrate approaches developed in different countries (Russia, Taiwan, Japan, Spain, and Poland, within the framework of the ISSI and ESA projects) to identify the ionospheric precursors. They are also useful for determining all three parameters necessary for an earthquake forecast: the impending earthquake's epicenter position, expectation time, and magnitude. These parameters are calculated using different technologies of GPS signal processing: time series, correlation, spectral analysis, ionospheric tomography, wave propagation, etc. The results obtained by the different teams demonstrate a high level of statistical significance and physical justification, which gives us reason to suggest these methodologies for practical validation.

  17. Lessons learned from the total evacuation of a hospital after the 2016 Kumamoto Earthquake.

    PubMed

    Yanagawa, Youichi; Kondo, Hisayoshi; Okawa, Takashi; Ochi, Fumio

    The 2016 Kumamoto Earthquakes were a series of earthquakes that included a foreshock earthquake (magnitude 6.2) on April 14 and a main shock (magnitude 7.0) on April 16, 2016. A number of hospitals in Kumamoto were severely damaged by the two major earthquakes and required total evacuation. The authors retrospectively analyzed the activity data of the Disaster Medical Assistance Teams using the Emergency Medical Information System records to investigate the cases in which the total evacuation of a hospital was attempted following the 2016 Kumamoto Earthquake. Total evacuation was attempted at 17 hospitals. The evacuation of one of these hospitals was canceled. Most of the hospital buildings were more than 20 years old. The danger of collapse was the most frequent reason for evacuation. Various transportation methods were employed, some of which involved the Japan Ground Self-Defense Force; no preventable deaths occurred during transportation. The hospitals must now be renovated to improve their earthquake resistance. The coordinated and combined use of military and civilian resources is beneficial and can significantly reduce human suffering in large-scale disasters.

  18. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2016-12-01

    New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should have mutually dissimilar fingerprints to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.
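
    A toy version of the fingerprint-and-hash idea described above (not the FAST implementation; the window sizes, band count, and synthetic data are assumptions) is sketched below. Colliding windows are only candidates and would be verified by direct waveform comparison in a real pipeline:

        import numpy as np
        from collections import defaultdict

        def fingerprint(window):
            """Binary fingerprint: 1 where the windowed spectral magnitude exceeds its median."""
            mag = np.abs(np.fft.rfft(window * np.hanning(len(window))))
            return (mag > np.median(mag)).astype(np.uint8)

        def candidate_pairs(data, win=256, hop=64, n_bands=8):
            """Index overlapping windows by hashed fingerprint bands; return colliding window pairs."""
            buckets = defaultdict(list)
            for start in range(0, len(data) - win + 1, hop):
                fp = fingerprint(data[start:start + win])
                for b, band in enumerate(np.array_split(fp, n_bands)):
                    buckets[(b, band.tobytes())].append(start)
            pairs = set()
            for members in buckets.values():
                for i in members:
                    for j in members:
                        if j - i >= win:          # keep ordered, non-overlapping pairs
                            pairs.add((i, j))
            return pairs

        rng = np.random.default_rng(2)
        data = 0.1 * rng.normal(size=10_000)      # background noise
        event = rng.normal(size=256)              # a repeating "event" waveform
        data[1_024:1_280] += event
        data[7_040:7_296] += event

        print(sorted(candidate_pairs(data)))      # the planted pair (1024, 7040) should typically appear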

  19. Earthquake-related versus non-earthquake-related injuries in spinal injury patients: differentiation with multidetector computed tomography

    PubMed Central

    2010-01-01

    Introduction: In recent years, several massive earthquakes have occurred across the globe. Multidetector computed tomography (MDCT) is reliable in detecting spinal injuries. The purpose of this study was to compare the features of spinal injuries resulting from the Sichuan earthquake with those of non-earthquake-related spinal trauma using MDCT. Methods: Features of spinal injuries of 223 Sichuan earthquake-exposed patients and 223 non-earthquake-related spinal injury patients were retrospectively compared using MDCT. The data for non-earthquake-related spinal injury patients were collected from 1 May 2009 to 22 July 2009 to avoid the confounding effects of seasonal activity and clothing. We focused on anatomic sites, injury types and neurologic deficits related to spinal injuries. Major injuries were classified according to the grid 3-3-3 scheme of the Magerl (AO) classification system. Results: A total of 185 patients (82.96%) in the earthquake-exposed cohort experienced crush injuries. In the earthquake and control groups, 65 and 92 patients, respectively, had neurologic deficits. The anatomic distribution of these two cohorts was significantly different (P < 0.001). Cervical spinal injuries were more common in the control group (risk ratio (RR) = 2.12, P < 0.001), whereas lumbar spinal injuries were more common in the earthquake-related spinal injuries group (277 of 501 injured vertebrae; 55.29%). The major types of injuries were significantly different between these cohorts (P = 0.002). Magerl AO type A lesions composed most of the lesions seen in both of these cohorts. Type B lesions were more frequently seen in earthquake-related spinal injuries (RR = 1.27), while we observed type C lesions more frequently in subjects with non-earthquake-related spinal injuries (RR = 1.98, P = 0.0029). Conclusions: Spinal injuries sustained in the Sichuan earthquake were located mainly in the lumbar spine, with a peak prevalence of type A lesions and a high occurrence of

  20. Earthquake Emergency Education in Dushanbe, Tajikistan

    ERIC Educational Resources Information Center

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  1. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, undertaken with insufficient knowledge of the peculiarities of the regional seismicity and of the seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake, with a higher probability of occurrence, that could affect the city with intensity less than or equal to VIII

  2. Post-Traumatic Stress Disorder and other mental disorders in the general population after Lorca's earthquakes, 2011 (Murcia, Spain): A cross-sectional study.

    PubMed

    Navarro-Mateu, Fernando; Salmerón, Diego; Vilagut, Gemma; Tormo, Mª José; Ruíz-Merino, Guadalupe; Escámez, Teresa; Júdez, Javier; Martínez, Salvador; Koenen, Karestan C; Navarro, Carmen; Alonso, Jordi; Kessler, Ronald C

    2017-01-01

    The aim was to describe the prevalence and severity of mental disorders and to examine differences in risk among those with and without a lifetime history prior to a moderate-magnitude earthquake that took place in Lorca (Murcia, Spain) at roughly the mid-point (on May 11, 2011) of the time interval in which a regional epidemiological survey was already being carried out (June 2010 - May 2012). The PEGASUS-Murcia project is a cross-sectional face-to-face interview survey of a representative sample of non-institutionalized adults in Murcia. The main outcome measures are the prevalence and severity of anxiety, mood, impulse and substance disorders in the 12 months prior to the survey, assessed using the Composite International Diagnostic Interview (CIDI 3.0). Sociodemographic variables, prior history of any mental disorder and earthquake-related stressors were entered as independent variables in a logistic regression analysis. A total of 412 participants (response rate: 71%) were interviewed. Significant differences in 12-month prevalence of mental disorders were found in Lorca compared to the rest of Murcia for any disorder (12.8% vs 16.8%), PTSD (3.6% vs 0.5%) and other anxiety disorders (5.3% vs 9.2%) (p ≤ 0.05 for all). No differences were found for 12-month prevalence of any mood or any substance disorder. The two major predictors for developing a 12-month post-earthquake mental disorder were a prior mental disorder and the level of exposure. Other risk factors included female sex and low-average income. PTSD and other mental disorders are commonly associated with earthquake disasters. Prior mental disorders and the level of exposure to the earthquakes are the most important factors in the development of a subsequent mental disorder, and this recognition may help to identify those individuals who may benefit most from specific therapeutic intervention.

  3. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    USGS Publications Warehouse

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have for over two decades stoked seismologists' hopes of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yielded a substantial change in b-values.
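
    The forecast rate quoted above rests on the Gutenberg-Richter relation, log10 N(M) = a - b M, fitted to microseismicity and extrapolated to M6. A minimal sketch of that extrapolation, assuming an Aki-Utsu maximum-likelihood b-value estimator and a synthetic catalogue rather than the actual Parkfield data:

```python
import numpy as np

def b_value_aki(mags, mc, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value for magnitudes >= mc, with bin-width correction dm."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def rate_above(mags, years, mc, m_target, dm=0.1):
    """Fit log10 N(M) = a - b*M to events with M >= mc and extrapolate the
    annual rate of events with magnitude >= m_target."""
    m = np.asarray(mags, dtype=float)
    n_mc = (m >= mc).sum() / years            # annual rate of M >= mc
    b = b_value_aki(mags, mc, dm)
    a = np.log10(n_mc) + b * mc               # a-value referenced to M = 0
    return 10.0 ** (a - b * m_target), b

# Hypothetical example: a synthetic Gutenberg-Richter catalogue with b ~ 1,
# standing in for 30 years of microseismicity above a completeness magnitude of 2.0.
rng = np.random.default_rng(0)
mags = 2.0 + rng.exponential(scale=1.0 / np.log(10.0), size=5000)
rate_m6, b = rate_above(mags, years=30.0, mc=2.0, m_target=6.0)
print(f"b = {b:.2f}, expected annual rate of M >= 6 events: {rate_m6:.4f}")
```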

  4. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better-known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation-type versus quiescence-type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation-type triggering, while others are better characterized by quiescence-type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California Norte, Mexico.
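
    An alert-based metric of the kind referenced above can be summarised by a point on a Molchan-type error diagram (fraction of time under alarm versus fraction of missed targets) and a probability gain relative to an unskilled forecast. A minimal sketch, with an invented alarm rule and synthetic data, not the authors' metric:

```python
import numpy as np

def molchan_point(alarm, targets):
    """Return (fraction of time under alarm, miss rate) for a boolean alarm series
    and a boolean series marking time bins that contain target earthquakes."""
    alarm = np.asarray(alarm, dtype=bool)
    targets = np.asarray(targets, dtype=bool)
    tau = alarm.mean()
    hits = np.logical_and(alarm, targets).sum()
    nu = 1.0 - hits / max(int(targets.sum()), 1)
    return tau, nu

def probability_gain(tau, nu):
    """Gain over a random (unskilled) forecast, whose expected hit fraction equals tau."""
    return (1.0 - nu) / tau if tau > 0 else float("inf")

# Hypothetical example: raise an alarm whenever the recent small-event rate is elevated.
rng = np.random.default_rng(1)
weekly_rate = rng.poisson(3, size=2000)                    # synthetic counts of small events
alarm = weekly_rate > 5                                    # activation-style alarm rule
p_target = 0.005 * weekly_rate / weekly_rate.mean()        # targets more likely when rate is high
targets = rng.random(2000) < p_target
tau, nu = molchan_point(alarm, targets)
print(f"alarm fraction = {tau:.2f}, miss rate = {nu:.2f}, gain = {probability_gain(tau, nu):.2f}")
```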

  5. Landward vergence in accretionary prism, evidence for frontal propagation of earthquakes?

    NASA Astrophysics Data System (ADS)

    Cubas, Nadaya; Souloumiac, Pauline

    2016-04-01

    Landward vergence in accretionary wedges is rare and has been described at very few places: along the Cascadia subduction zone and, more recently, along Sumatra, where the 2004 Mw 9.1 Sumatra-Andaman event and the 2011 tsunami earthquake occurred. Recent studies have suggested a relation between landward thrust faults and frontal propagation of earthquakes for the Sumatra subduction zone. The Cascadia subduction zone is also known to have produced a Mw 9 earthquake in 1700, with a large tsunami across the Pacific. Based on mechanical analysis, we propose to investigate whether specific frictional properties could lead to a landward sequence of thrusting. We show that landward thrusting requires very low effective friction along the megathrust together with a rather high internal effective friction. We also show that landward thrusting appears close to the extensional critical limit. For Cascadia and Sumatra, we show that to obtain landward vergence, the effective basal friction has to be lower than 0.08. This very low effective friction is most likely due to high pore pressure. This high pore pressure could either be a long-term property or due to dynamic effects such as thermal pressurization. The fact that landward vergence appears far from the compressional critical limit favors a dynamic effect. Landward vergence would then highlight thermal pressurization due to occasional or systematic propagation of earthquakes to the trench. As a consequence, the vergence of thrusts in accretionary prisms could be used to improve seismic and tsunamigenic risk assessment.

  6. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    NASA Technical Reports Server (NTRS)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change detection technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.

  7. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The different methods have been categorised into time-independent, time-dependent and hybrid methods, where the last group comprises methods that use additional data beyond historical earthquake statistics. This categorization distinguishes purely statistical approaches, for which historical earthquake data are the only direct data source, from algorithms that incorporate further information, e.g. spatial data on fault distributions, or physical models such as static triggering, to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods which can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used or not. Target temporal scales are identified, as well as the publication history. All these different aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and to give an overview of the state of the art.

  8. Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data

    NASA Astrophysics Data System (ADS)

    Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.

    2017-12-01

    While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated RM 100 million (about US$23 million) of damage to buildings, roads, and infrastructure from shaking, as well as flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know about the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic nonlinear earthquake location program to locate the earthquakes and then refine their relative locations using a double-difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, as well as aiding in the delivery of outreach activities regarding seismic hazard

  9. Rupture, waves and earthquakes.

    PubMed

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much former investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  10. Rupture, waves and earthquakes

    PubMed Central

    UENISHI, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much former investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable. PMID:28077808

  11. Frictional heating processes during laboratory earthquakes

    NASA Astrophysics Data System (ADS)

    Aubry, J.; Passelegue, F. X.; Deldicque, D.; Lahfid, A.; Girault, F.; Pinquier, Y.; Escartin, J.; Schubnel, A.

    2017-12-01

    Frictional heating during seismic slip plays a crucial role in the dynamics of earthquakes because it controls fault weakening. This study proposes (i) to image frictional heating by combining an in-situ carbon thermometer with Raman microspectrometric mapping, (ii) to combine these observations with fault surface roughness and heat production, and (iii) to estimate the mechanical energy dissipated during laboratory earthquakes. Laboratory earthquakes were performed in a triaxial oil-loading press at 45, 90 and 180 MPa of confining pressure, using saw-cut samples of Westerly granite. The initial topography of the fault surface was +/- 30 microns. We use a carbon layer as a local temperature tracer on the fault plane and a type K thermocouple to measure temperature approximately 6 mm away from the fault surface. The thermocouple measures the bulk temperature of the fault plane, while the in-situ carbon thermometer images the heterogeneity of heat production at the micro-scale. Raman microspectrometry on the amorphous carbon patch allowed mapping of the temperature heterogeneities on the fault surface after sliding, overlaid to within a few micrometres on the final fault roughness. The maximum temperature achieved during laboratory earthquakes remains high for all experiments but generally increases with confining pressure. In addition, the melted surface of the fault during seismic slip increases drastically with confining pressure. While melting is systematically observed, the strength drop increases with confining pressure. These results suggest that the dynamic friction coefficient is a function of the area of the fault melted during stick-slip. Using the thermocouple, we inverted the heat dissipated during each event. We show that for rough faults under low confining pressure, less than 20% of the total mechanical work is dissipated into heat. The ratio of frictional heating to total mechanical work decreases with cumulated slip (i.e. number of events), and decreases with
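
    The heat fraction quoted above compares the heat inverted from the thermocouple record with the total mechanical work per unit fault area, W = integral of shear stress over slip. A minimal sketch of that comparison, with invented stress, slip and heat values rather than the experimental data:

```python
import numpy as np

def mechanical_work(shear_stress, slip):
    """Mechanical work per unit fault area, W = integral of tau d(slip), in J/m^2."""
    return np.trapz(shear_stress, slip)

def heat_fraction(q_heat, shear_stress, slip):
    """Fraction of the mechanical work dissipated as frictional heat."""
    return q_heat / mechanical_work(shear_stress, slip)

# Hypothetical stick-slip event: shear stress drops linearly from 60 to 20 MPa over 1 mm of slip,
# and the heat inverted from the thermocouple record is assumed to be 6 kJ/m^2.
slip = np.linspace(0.0, 1.0e-3, 200)                 # m
tau = 60.0e6 - 40.0e6 * slip / slip[-1]              # Pa
q_heat = 6.0e3                                       # J/m^2 (assumed, not a measured value)
print(f"W = {mechanical_work(tau, slip):.0f} J/m^2, heat fraction = {heat_fraction(q_heat, tau, slip):.2f}")
```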

  12. Earthquake Loss Estimates in Near Real-Time

    NASA Astrophysics Data System (ADS)

    Wyss, Max; Wang, Rongjiang; Zschau, Jochen; Xia, Ye

    2006-10-01

    The usefulness to rescue teams of near-real-time loss estimates after major earthquakes is advancing rapidly. The difference in the quality of data available in highly developed compared with developing countries dictates that different approaches be used to maximize mitigation efforts. In developed countries, extensive information from tax and insurance records, together with accurate census figures, furnishes detailed data on the fragility of buildings and on the number of people at risk. For example, these data are exploited by the loss-estimation method used in the Hazards U.S. Multi-Hazard (HAZUS-MH) software program (http://www.fema.gov/plan/prevent/hazus/). However, in developing countries, the population at risk is estimated from inferior data sources and the fragility of the building stock often is derived empirically, using past disastrous earthquakes for calibration [Wyss, 2004].

  13. [Bioethics in catastrophe situations such as earthquakes].

    PubMed

    León, C Francisco Javier

    2012-01-01

    A catastrophe of the magnitude of the earthquake and tsunami that hit Chile not long ago forces us to raise some questions that we will try to answer from philosophical, ethical and responsibility viewpoints. An analysis of the basic principles of bioethics is also justified. A natural catastrophe is not, by itself, moral or immoral, fair or unfair. However, its consequences could certainly be regarded as such, depending on whether they could have been prevented or mitigated. We identify those individuals who have the ethical responsibility to attend to the victims, and the ethical principles that must guide the tasks of healthcare and psychological support teams. The minimal indispensable actions to obtain adequate social and legal protection of vulnerable people must be defined according to international guidelines. These reflections are intended to improve the responsibility of the State and the whole community to efficiently prevent and repair the material and psychological consequences of such a catastrophe.

  14. Bayesian historical earthquake relocation: an example from the 1909 Taipei earthquake

    USGS Publications Warehouse

    Minson, Sarah E.; Lee, William H.K.

    2014-01-01

    Locating earthquakes from the beginning of the modern instrumental period is complicated by the fact that there are few good-quality seismograms and what traveltimes do exist may be corrupted by both large phase-pick errors and clock errors. Here, we outline a Bayesian approach to simultaneous inference of not only the hypocentre location but also the clock errors at each station and the origin time of the earthquake. This methodology improves the solution for the source location and also provides an uncertainty analysis on all of the parameters included in the inversion. As an example, we applied this Bayesian approach to the well-studied 1909 Mw 7 Taipei earthquake. While our epicentre location and origin time for the 1909 Taipei earthquake are consistent with earlier studies, our focal depth is significantly shallower suggesting a higher seismic hazard to the populous Taipei metropolitan area than previously supposed.
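
    A minimal sketch of the kind of joint inference described above, assuming a homogeneous half-space P velocity, Gaussian pick errors, a zero-mean Gaussian prior on per-station clock errors, and a plain Metropolis sampler; the station geometry and picks are invented, and this is not the authors' implementation:

```python
import numpy as np

V_P = 6.0                                              # km/s, assumed constant velocity
stations = np.array([[50.0, 0.0, 0.0], [0.0, 60.0, 0.0],
                     [-40.0, -30.0, 0.0], [30.0, -50.0, 0.0]])   # x, y, z in km
picks = np.array([12.3, 13.1, 11.8, 12.9])             # observed P arrivals, s (synthetic)
sigma_pick, sigma_clock = 0.5, 1.0                     # s

def log_posterior(theta):
    """theta = (x, y, z, origin time, clock error per station)."""
    hypo, t0, clocks = theta[:3], theta[3], theta[4:]
    pred = t0 + np.linalg.norm(stations - hypo, axis=1) / V_P + clocks
    log_like = -0.5 * np.sum(((picks - pred) / sigma_pick) ** 2)
    log_prior = -0.5 * np.sum((clocks / sigma_clock) ** 2)
    return log_like + log_prior

rng = np.random.default_rng(0)
theta = np.concatenate([[0.0, 0.0, 10.0, 0.0], np.zeros(len(stations))])
logp = log_posterior(theta)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(scale=0.5, size=theta.size)
    logp_prop = log_posterior(proposal)
    if np.log(rng.random()) < logp_prop - logp:        # Metropolis acceptance
        theta, logp = proposal, logp_prop
    samples.append(theta.copy())
post = np.array(samples[5000:])                        # discard burn-in
print("posterior mean hypocentre (km):", np.round(post[:, :3].mean(axis=0), 1))
print("posterior std of clock errors (s):", np.round(post[:, 4:].std(axis=0), 2))
```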

  15. Earthquakes Magnitude Predication Using Artificial Neural Network in Northern Red Sea Area

    NASA Astrophysics Data System (ADS)

    Alarifi, A. S.; Alarifi, N. S.

    2009-12-01

    Earthquakes are natural hazards that do not happen very often, but they may cause huge losses of life and property. Early preparation for these hazards is a key factor in reducing their damage and consequences. Since early ages, people have tried to predict earthquakes using simple observations such as strange or atypical animal behavior. In this paper, we study data collected from an existing earthquake catalogue to give better forecasts of future earthquakes. The 16,000 events cover a time span of 1970 to 2009; the magnitudes range from greater than 0 to less than 7.2, while the depths range from greater than 0 to less than 100 km. We propose a new artificial-intelligence prediction system based on an artificial neural network, which can be used to predict the magnitude of future earthquakes in the northern Red Sea area, including the Sinai Peninsula, the Gulf of Aqaba, and the Gulf of Suez. We propose a new feed-forward neural network model with multiple hidden layers to predict earthquake occurrences and magnitudes in the northern Red Sea area. Although similar models have been published before for other areas, to the best of our knowledge this is the first neural network model to predict earthquakes in the northern Red Sea area. Furthermore, we present other forecasting methods such as moving averages over different intervals, a normally distributed random predictor, and a uniformly distributed random predictor. In addition, we present different statistical methods and data fitting such as linear, quadratic, and cubic regression. We present a detailed performance analysis of the proposed methods for different evaluation metrics. The results show that the neural network model provides higher forecast accuracy than the other proposed methods: the neural network achieves an average absolute error of 2.6%, compared with 3.8%, 7.3% and 6.17% for the moving average, linear regression and cubic regression, respectively. In this work, we show an analysis
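
    A minimal sketch of the kind of comparison described above (a feed-forward network against linear-regression and moving-average baselines), using scikit-learn and a synthetic feature table; the feature definitions and data are illustrative assumptions, not the study's catalogue:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for a catalogue-derived feature table: each row holds recent-seismicity
# features (e.g. event counts and mean magnitudes over preceding windows) and the target is
# the magnitude of the next event.
rng = np.random.default_rng(42)
n = 2000
X = rng.normal(size=(n, 4))
y = 3.0 + 0.5 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(scale=0.3, size=n)

split = int(0.8 * n)
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

mlp = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0).fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)
moving_avg = np.full_like(y_te, y_tr[-50:].mean())        # naive moving-average baseline

for name, pred in [("feed-forward NN", mlp.predict(X_te)),
                   ("linear regression", lin.predict(X_te)),
                   ("moving average", moving_avg)]:
    print(f"{name:18s} MAE = {mean_absolute_error(y_te, pred):.3f}")
```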

  16. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography.

    PubMed

    Dong, Zhi-Hui; Yang, Zhi-Gang; Chen, Tian-Wu; Chu, Zhi-Gang; Deng, Wen; Shao, Heng

    2011-01-01

    Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). We retrospectively compared an earthquake-exposed cohort of 215 thoracic crush-trauma victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using multi-detector CT. The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, risk ratio (RR) = 2.2; p < 0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients, RR = 1.2; p < 0.05) or flail chest (45/143 vs. 11/66 patients, RR = 1.9; p < 0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients, RR = 1.7; p < 0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients, RR = 1.4; p < 0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR = 1.5 for parenchymal and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p < 0.001). Non-rib fractures and pulmonary parenchymal and pleural injuries had a significant positive correlation with rib fractures in these two cohorts. Thoracic crush traumas resulting from the earthquake were life-threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures and pulmonary parenchymal and pleural injuries.
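
    The reported risk ratios follow directly from the cohort counts. A minimal sketch of the calculation, with an approximate log-scale confidence interval added as a standard (assumed, not reported) companion statistic:

```python
import numpy as np

def risk_ratio(exposed_cases, exposed_total, control_cases, control_total, z=1.96):
    """Risk ratio with an approximate 95% confidence interval on the log scale."""
    rr = (exposed_cases / exposed_total) / (control_cases / control_total)
    se = np.sqrt(1 / exposed_cases - 1 / exposed_total + 1 / control_cases - 1 / control_total)
    lo, hi = np.exp(np.log(rr) + np.array([-z, z]) * se)
    return rr, lo, hi

# Rib fractures: 143/215 in the earthquake cohort versus 66/215 in the comparison cohort
rr, lo, hi = risk_ratio(143, 215, 66, 215)
print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")   # ~2.2, matching the reported value
```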

  17. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  18. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    USGS Publications Warehouse

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents

  19. Earthquakes, May-June, 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of May and June were very active in terms of earthquake occurrence. Six major earthquakes (7.0-7.9) occurred during this period. These earthquakes included a magnitude 7.1 in Papua New Guinea on May 15, a magnitude 7.1 followed by a magnitude 7.5 in the Philippine Islands on May 17, a magnitude 7.0 in the Cuba region on May 25, and a magnitude 7.3 in the Santa Cruz Islands of the Pacific on May 27. In the United States, a magnitude 7.6 earthquake struck in southern California on June 28, followed by a magnitude 6.7 quake about three hours later.

  20. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis will consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-UE project is used. The building vulnerability/fragility relationships to be used can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in the Level 2 analysis are estimated from the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be applied if necessary. The ELER Level 2 analysis will include calculation of direct monetary losses as a result of building damage, which will allow for repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons against different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte Carlo-type simulations and earthquake insurance applications.
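
    The casualty estimate described above is, in essence, a sum over building types and damage states of the building count multiplied by a casualty rate. A minimal sketch for a single geo-cell, with invented inventory numbers and rates rather than RISK-UE or ELER coefficients:

```python
import numpy as np

# Hypothetical geo-cell inventory: number of buildings of each type in four damage states
# (slight, moderate, extensive, complete) and casualty rates per building for each type/state.
buildings = {
    "masonry":  np.array([120, 80, 40, 10]),
    "rc_frame": np.array([200, 90, 30,  5]),
}
casualty_rates = {
    "masonry":  np.array([0.000, 0.010, 0.050, 0.300]),
    "rc_frame": np.array([0.000, 0.005, 0.030, 0.200]),
}

# Casualties = sum over building types and damage states of (buildings x casualty rate)
total = sum(float(np.dot(buildings[t], casualty_rates[t])) for t in buildings)
print(f"estimated casualties in this geo-cell: {total:.1f}")
```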

  1. Normal Fault Type Earthquakes Off Fukushima Region - Comparison of the 1938 Events and Recent Earthquakes -

    NASA Astrophysics Data System (ADS)

    Murotani, S.; Satake, K.

    2017-12-01

    Off the Fukushima region, Mjma 7.4 (event A) and 6.9 (event B) events occurred on November 6, 1938, following the thrust-fault-type earthquakes of Mjma 7.5 and 7.3 on the previous day. These earthquakes were estimated to be normal-fault earthquakes by Abe (1977, Tectonophysics). An Mjma 7.0 earthquake occurred on July 12, 2014 near event B, and an Mjma 7.4 earthquake occurred on November 22, 2016 near event A. These recent events are the only M 7 class earthquakes to have occurred off Fukushima since 1938. Except for the two 1938 events, no normal-fault earthquakes occurred there until the many aftershocks of the 2011 Tohoku earthquake. We compared the observed tsunami and seismic waveforms of the 1938, 2014, and 2016 earthquakes to examine the normal-fault earthquakes that occurred off the Fukushima region. It is difficult to compare the tsunami waveforms of the 1938, 2014 and 2016 events because there were only a few observations at the same station. The teleseismic body-wave inversion of the 2016 earthquake yielded a focal mechanism with strike 42°, dip 35°, and rake -94°. Other source parameters were as follows: source area 70 km x 40 km, average slip 0.2 m, maximum slip 1.2 m, seismic moment 2.2 x 10^19 Nm, and Mw 6.8. A large slip area is located near the hypocenter, and it is compatible with the tsunami source area estimated from tsunami travel times. The 2016 tsunami source area is smaller than that of the 1938 event, consistent with the difference in Mw: 7.7 for event A estimated by Abe (1977) and 6.8 for the 2016 event. Although the 2014 epicenter is very close to that of event B, the teleseismic waveforms of the 2014 event are similar to those of event A and the 2016 event. While Abe (1977) assumed that the mechanism of event B was the same as that of event A, the initial motions at some stations are opposite, indicating that the focal mechanisms of events A and B are different and that more detailed examination is needed. The normal fault type earthquake seems to occur following the
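
    The quoted Mw follows from the seismic moment via the standard Hanks-Kanamori relation, Mw = (2/3)(log10 M0 - 9.1) with M0 in Nm. A one-line check:

```python
import math

def moment_magnitude(m0_newton_metres):
    """Moment magnitude from seismic moment (Hanks and Kanamori, 1979)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

print(f"Mw = {moment_magnitude(2.2e19):.1f}")   # ~6.8, as quoted for the 2016 event
```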

  2. Comparing methods for Earthquake Location

    NASA Astrophysics Data System (ADS)

    Turkaya, Semih; Bodin, Thomas; Sylvander, Matthieu; Parroucau, Pierre; Manchuel, Kevin

    2017-04-01

    There are plenty of methods available for locating small-magnitude point-source earthquakes. However, it is known that these different approaches produce different results. For each approach, results also depend on a number of parameters, which can be separated into two main branches: (1) parameters related to the observations (e.g. their number and distribution) and (2) parameters related to the inversion process (velocity model, weighting parameters, initial location etc.). Currently, the results obtained from most of the location methods do not systematically include quantitative uncertainties. The effect of the selected parameters on location uncertainties is also poorly known. Understanding the importance of these different parameters and their effect on uncertainties is clearly required to better constrain knowledge of fault geometry and seismotectonic processes, and ultimately to improve seismic hazard assessment. In this work, carried out in the framework of the SINAPS@ research program (http://www.institut-seism.fr/projets/sinaps/), we analyse the effect of different parameters on earthquake location (e.g. type of phase, maximum hypocentral separation etc.). We compare several available codes (Hypo71, HypoDD, NonLinLoc etc.) and determine their strengths and weaknesses in different cases by means of synthetic tests. The work, performed for the moment on synthetic data, is planned to be applied, in a second step, to data collected by the Midi-Pyrénées Observatory (OMP).

  3. What Can Sounds Tell Us About Earthquake Interactions?

    NASA Astrophysics Data System (ADS)

    Aiken, C.; Peng, Z.

    2012-12-01

    It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influences earthquakes can have on each other. Recent studies using auditory display [e.g. Kilb et al., 2012; Peng et al., 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are central to understanding the underlying physics of earthquakes and other seismic phenomena such as tremors, in addition to their source characteristics (e.g. frequency content, amplitudes). Earthquake interactions can include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e. aftershocks), or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw 5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g. piano keys) as well as audifying seismic data using time compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to the understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these
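
    The audification step mentioned above amounts to writing the recorded samples to an audio file at a much higher playback rate than the recording rate, shifting ground motion into the audible band. A minimal Python sketch with a synthetic trace (the speed-up factor and signal are illustrative assumptions, not the authors' settings):

```python
import numpy as np
from scipy.io import wavfile

def audify(trace, fs, speedup=200, out="quake.wav"):
    """Audify a seismogram by time compression: samples recorded at fs Hz are written
    to a WAV file with a playback rate of fs * speedup Hz, so low-frequency ground
    motion is shifted into the audible band."""
    x = np.asarray(trace, dtype=float)
    x = x / (np.abs(x).max() + 1e-12)                       # normalise to [-1, 1]
    wavfile.write(out, int(fs * speedup), (x * 32767).astype(np.int16))

# Hypothetical example with a synthetic 100 Hz "seismogram": a slowly decaying wave train.
fs = 100.0
t = np.arange(0.0, 600.0, 1.0 / fs)                         # ten minutes of data
trace = np.exp(-t / 120.0) * np.sin(2.0 * np.pi * 0.5 * t)
audify(trace, fs, speedup=200)                              # writes a ~3 second audio clip
```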

  4. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  5. Earthquakes in the United States

    USGS Publications Warehouse

    Stover, C.

    1977-01-01

    To supplement data in the report Preliminary Determination of Epicenters (PDE), the National Earthquake Information Service (NEIS) also publishes a quarterly circular, Earthquakes in the United States. This provides information on the felt area of U.S. earthquakes and their intensity. The main purpose is to describe the larger effects of these earthquakes so that they can be used in seismic risk studies, site evaluations for nuclear power plants, and answering inquiries by the general public.

  6. Medical experience of a university hospital in Turkey after the 1999 Marmara earthquake

    PubMed Central

    Bulut, M; Fedakar, R; Akkose, S; Akgoz, S; Ozguc, H; Tokyay, R

    2005-01-01

    Objectives: This study aimed to provide an overview of morbidity and mortality among patients admitted to the Hospital of the Medicine Faculty of Uludag University, Bursa, Turkey, after the 1999 Marmara earthquake. Methods: Retrospective analysis of the medical records of 645 earthquake victims. Patients' demographic data, diagnosis, dispositions, and prognosis were reviewed. Results: A total of 330 patients with earthquake related injuries and illness admitted to our hospital were included and divided into three main groups: crush syndrome (n = 110), vital organ injuries (n = 57), and non-traumatic but earthquake related illness (n = 55). Seventy seven per cent of patients were hospitalised during the first three days after the earthquake. The rate of mortality associated with the crush syndrome, vital organ injury, and non-traumatic medical problems was 21% (23/110), 17.5% (10/57), and 9% (5/55), respectively. The overall mortality rate was 8% (50/645). Conclusions: In the first 24–48 hours after a major earthquake, hospital emergency departments are flooded with large numbers of patients. Among this patient load, those patients with crush syndrome or vital organ injuries are particularly at risk. Proper triage and prompt treatment of these seriously injured earthquake victims may decrease morbidity and mortality. It is hoped that this review of the challenges met after the Marmara earthquake and the lessons learned will be of use to emergency department physicians as well as hospital emergency planners in preparing for future natural disasters. PMID:15983085

  7. Nowcasting Earthquakes and Tsunamis

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  8. Introduction to the special issue on the 2004 Parkfield earthquake and the Parkfield earthquake prediction experiment

    USGS Publications Warehouse

    Harris, R.A.; Arrowsmith, J.R.

    2006-01-01

    The 28 September 2004 M 6.0 Parkfield earthquake, a long-anticipated event on the San Andreas fault, is the world's best recorded earthquake to date, with state-of-the-art data obtained from geologic, geodetic, seismic, magnetic, and electrical field networks. This has allowed the preearthquake and postearthquake states of the San Andreas fault in this region to be analyzed in detail. Analyses of these data provide views into the San Andreas fault that show a complex geologic history, fault geometry, rheology, and response of the nearby region to the earthquake-induced ground movement. Although aspects of San Andreas fault zone behavior in the Parkfield region can be modeled simply over geological time frames, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake indicate that predicting the fine details of future earthquakes is still a challenge. Instead of a deterministic approach, forecasting future damaging behavior, such as that caused by strong ground motions, will likely continue to require probabilistic methods. However, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake have provided ample data to understand most of what did occur in 2004, culminating in significant scientific advances.

  9. Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Tsuruoka, H.; Yokoi, S.

    2011-12-01

    The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for a sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, have been submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity has changed dramatically since the 2011 event, model performance has been strongly affected. In addition, because of problems with the authorized catalogue related to the completeness magnitude, most models did not pass the CSEP consistency tests. We will also discuss the retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.

  10. Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Tsuruoka, H.; Yokoi, S.

    2013-12-01

    The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for a sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, have been submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity has changed dramatically since the 2011 event, model performance has been strongly affected. In addition, because of problems with the authorized catalogue related to the completeness magnitude, most models did not pass the CSEP consistency tests. We will also discuss the retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.

  11. Earthquakes, November-December 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were three major earthquakes (7.0-7.9) during the last two months of the year: a magnitude 7.0 on November 19 in Colombia, a magnitude 7.4 in the Kuril Islands on December 22, and a magnitude 7.1 in the South Sandwich Islands on December 27. Earthquake-related deaths were reported in Colombia, Yemen, and Iran. There were no significant earthquakes in the United States during this reporting period.

  12. Earthquakes, September-October 1980

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    There were two major (magnitudes 7.0-7.9) earthquakes during this reporting period; a magnitude (M) 7.3 in Algeria where many people were killed or injured and extensive damage occurred, and an M=7.2 in the Loyalty Islands region of the South Pacific. Japan was struck by a damaging earthquake on September 24, killing two people and causing injuries. There were no damaging earthquakes in the United States. 

  13. Management of limb fractures in a teaching hospital: comparison between Wenchuan and Yushu earthquakes.

    PubMed

    Min, Li; Tu, Chong-qi; Liu, Lei; Zhang, Wen-li; Yi, Min; Song, Yue-ming; Huang, Fu-guo; Yang, Tian-fu; Pei, Fu-xing

    2013-01-01

    To comparatively analyze the medical records of patients with limb fractures, as well as the rescue strategy, in the Wenchuan and Yushu earthquakes so as to provide references for post-earthquake rescue. We retrospectively investigated 944 patients sustaining limb fractures, including 891 in the Wenchuan earthquake and 53 in the Yushu earthquake, who were admitted to West China Hospital (WCH) of Sichuan University. In the Wenchuan earthquake, WCH met three peaks of limb-fracture patient influx, on post-earthquake days (PED) 2, 8 and 14 respectively. Between PED 3-14, 585 patients were transferred from WCH to other hospitals outside Sichuan Province. In the Yushu earthquake, the maximum influx of limb-fracture patients happened on PED 3, and no one was shifted to other hospitals. In both the Wenchuan and Yushu earthquakes, most limb fractures were caused by blunt strike and crush/burying. In the Wenchuan earthquake, there were 396 (396/942, 42.0%) open limb fractures, including 28 Gustilo I, 201 Gustilo II and 167 Gustilo III injuries. In the Yushu earthquake, the incidence of open limb fracture was much lower (6/61, 9.8%). The percentage of patients with acute complications in the Wenchuan earthquake (167/891, 18.7%) was much higher than that in the Yushu earthquake (5/53, 3.8%). In the Wenchuan earthquake rescue, 1,018 surgeries were performed, comprising debridement in 376, internal fixation in 283, external fixation in 119, and vacuum sealing drainage in 117, etc. Among the 64 surgeries in the Yushu earthquake rescue, internal fixation for limb fracture was the most commonly adopted. All patients received proper treatment and survived, except one who died of multiple organ failure in the Wenchuan earthquake. Provision of suitable and sufficient medical care in a catastrophe can only be achieved through the construction of a sophisticated national disaster medical system, prediction of the injury types and number of injuries, and confirmation of participating hospitals' exact role. Based on the valuable rescue experiences

  14. Retrospective stress-forecasting of earthquakes

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where two transform zones uniquely run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, which are the only places worldwide where SWS can be routinely monitored. Elsewhere SWS must be monitored above temporally active, occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect ratios, allowing stress accumulation and stress relaxation to be identified. While monitoring SWS in SW Iceland in 1988, stress accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  15. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response held in San Francisco, called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, and representatives of the news media.

  16. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As 2009 marked the 200th anniversary of Darwin's birth, it also marked 170 years since the publication of his book Journal of Researches. During the voyage, Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of '…the rending of strata, at a point not very deep below the surface of the earth…' and '…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: '...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  17. Instability model for recurring large and great earthquakes in southern California

    USGS Publications Warehouse

    Stuart, W.D.

    1985-01-01

    The locked section of the San Andreas fault in southern California has experienced a number of large and great earthquakes in the past, and thus is expected to have more in the future. To estimate the location, time, and slip of the next few earthquakes, an earthquake instability model is formulated. The model is similar to one recently developed for moderate earthquakes on the San Andreas fault near Parkfield, California. In both models, unstable faulting (the earthquake analog) is caused by failure of all or part of a patch of brittle, strain-softening fault zone. In the present model the patch extends downward from the ground surface to about 12 km depth, and extends 500 km along strike from Parkfield to the Salton Sea. The variation of patch strength along strike is adjusted by trial until the computed sequence of instabilities matches the sequence of large and great earthquakes since A.D. 1080 reported by Sieh and others. The last earthquake was the M=8.3 Ft. Tejon event in 1857. The resulting strength variation has five contiguous sections of alternately low and high strength. From north to south, the approximate locations of the sections are: (1) Parkfield to Bitterwater Valley, (2) Bitterwater Valley to Lake Hughes, (3) Lake Hughes to San Bernardino, (4) San Bernardino to Palm Springs, and (5) Palm Springs to the Salton Sea. Sections 1, 3, and 5 have strengths between 53 and 88 bars; sections 2 and 4 have strengths between 164 and 193 bars. Patch section ends and unstable rupture ends usually coincide, although one or more adjacent patch sections may fail unstably at once. The model predicts that the next sections of the fault to slip unstably will be 1, 3, and 5; the order and dates depend on the assumed length of an earthquake rupture in about 1700. © 1985 Birkhäuser Verlag.

  18. Rapid Estimates of Rupture Extent for Large Earthquakes Using Aftershocks

    NASA Astrophysics Data System (ADS)

    Polet, J.; Thio, H. K.; Kremer, M.

    2009-12-01

    The spatial distribution of aftershocks is closely linked to the rupture extent of the mainshock that preceded them, and a rapid analysis of aftershock patterns therefore has potential for use in near-real-time estimates of earthquake impact. The correlation between aftershocks and slip distribution has frequently been used to estimate the fault dimensions of large historic earthquakes for which no, or insufficient, waveform data are available. With the advent of earthquake inversions that use seismic waveforms and geodetic data to constrain the slip distribution, the study of aftershocks has recently been largely focused on enhancing our understanding of the underlying mechanisms in a broader earthquake mechanics/dynamics framework. However, in a near-real-time earthquake monitoring environment, in which aftershocks of large earthquakes are routinely detected and located, these data may also be effective in determining a fast estimate of the mainshock rupture area, which would aid in the rapid assessment of the impact of the earthquake. We have analyzed a considerable number of large recent earthquakes and their aftershock sequences and have developed an effective algorithm that determines the rupture extent of a mainshock from its aftershock distribution in a fully automatic manner. The algorithm automatically removes outliers by spatial binning, and subsequently determines the best-fitting "strike" of the rupture and its length by projecting the aftershock epicenters onto a set of lines that cross the mainshock epicenter with incremental azimuths. For strike-slip or large dip-slip events, for which the surface projection of the rupture is rectilinear, the calculated strike correlates well with the strike of the fault, and the corresponding length, determined from the distribution of aftershocks projected onto the line, agrees well with the rupture length. In the case of a smaller dip-slip rupture with an aspect ratio closer to 1, the procedure gives a measure
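
    A minimal sketch of the projection idea described above, on a synthetic aftershock cloud; the quantile trim stands in for the spatial-binning outlier removal mentioned in the abstract, and all coordinates and parameters are illustrative assumptions:

```python
import numpy as np

def rupture_strike_and_length(epicentres, mainshock, n_azimuths=180, trim=0.05):
    """Estimate a rupture 'strike' and length from aftershock epicentres by projecting them
    onto lines of incremental azimuth through the mainshock epicentre and keeping the azimuth
    that maximises the along-line extent. Coordinates are local Cartesian km (x east, y north)."""
    xy = np.asarray(epicentres, dtype=float) - np.asarray(mainshock, dtype=float)
    best_strike, best_length = None, -np.inf
    for az in np.linspace(0.0, np.pi, n_azimuths, endpoint=False):
        u = np.array([np.sin(az), np.cos(az)])        # unit vector at this azimuth
        s = xy @ u                                     # along-line coordinates
        lo, hi = np.quantile(s, [trim, 1.0 - trim])    # trimmed extent along the line
        if hi - lo > best_length:
            best_strike, best_length = np.degrees(az), hi - lo
    return best_strike, best_length

# Hypothetical aftershock cloud along a ~60 km rupture striking ~40 degrees.
rng = np.random.default_rng(7)
along, across = rng.uniform(-30, 30, 500), rng.normal(0, 3, 500)
az = np.radians(40.0)
strike_dir = np.array([np.sin(az), np.cos(az)])
normal_dir = np.array([np.cos(az), -np.sin(az)])
cloud = np.outer(along, strike_dir) + np.outer(across, normal_dir)
strike, length = rupture_strike_and_length(cloud, mainshock=(0.0, 0.0))
print(f"estimated strike ~ {strike:.0f} deg, length ~ {length:.0f} km")
```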

  19. Education and awareness regarding earthquakes and their consequences within the Cluj-Napoca SEISMOLAB, Romania

    NASA Astrophysics Data System (ADS)

    Brisan, Nicoleta; Stefanescu, Lucrina; Zaharia, Bogdan; Tataru, Dragos; Costin, Dan; Stefanie, Horatiu

    2014-05-01

    Education and awareness are efficient methods to mitigate the effects of natural disasters on communities. In this regard, the most receptive target group is the youth, who have the potential to become vectors of information dissemination in their families and communities. In a country with significant seismic potential like Romania, the development of a Seismolab by means of an educational project is welcome. The Seismolab operates within the Faculty of Environmental Science and Engineering at "Babeş-Bolyai" University, Cluj-Napoca, and it hosts activities conducted with the students of the faculty and with pupils from Cluj and other schools involved in the RoEduSeis project. The RoEduSeis Project is a research and education project meant to develop the practical skills of primary, secondary and high school students in the field of Earth Sciences. A major objective of the project is the development and validation of new practical training methods for both teachers and students in the field of Earth Sciences. In this context, the Seismolab serves this particular aim through activities such as: training of students and teachers in analysing and processing seismological data obtained from the educational seismographs of the Romanian educational seismic network; hands-on activities for pupils using educational resources developed through the project; documentary 2D and 3D movies and round tables on the topic of earthquakes and other natural risks. The students of the faculty use the databases within curriculum subjects such as: Management of natural risks and disasters, Natural hazards and risks, Management of emergency situations, etc. The seismometer used within the Seismolab will be connected to the above-mentioned educational network, and the interaction between all the schools involved in the project will be conducted by means of an e-learning platform. The results of this cooperation will contribute to a better education and awareness

  20. A moment in time: emergency nurses and the Canterbury earthquakes.

    PubMed

    Richardson, S; Ardagh, M; Grainger, P; Robinson, V

    2013-06-01

    To outline the impact of the Canterbury, New Zealand (NZ) earthquakes on Christchurch Hospital, and the experiences of emergency nurses during this time. NZ has experienced earthquakes and aftershocks centred in the Canterbury region of the South Island. The location of these, around and within the major city of Christchurch, was unexpected and associated with previously unknown fault lines. While the highest magnitude quake occurred in September 2010, registering 7.1 on the Richter scale, it was the magnitude 6.3 event on 22 February 2011 which was associated with the greatest injury burden and loss of life. Staff working in the only emergency department in the city were faced with an external emergency while also being directly affected as part of the disaster. SOURCES OF EVIDENCE: This paper developed following interviews with nurses who worked during this period, and draws on literature related to healthcare responses to earthquakes and natural disasters. The establishment of an injury database allowed an accurate picture of the injury burden to emerge, and each of the authors was present and worked in a clinical capacity during the earthquake. Nurses played a significant role in the response to the earthquakes and their aftermath. However, little is known regarding the impact of this, either in personal or professional terms. This paper presents an overview of the earthquakes and the experiences of nurses working during this time, identifying a range of issues that will benefit from further exploration and research. It seeks to provide a sense of the experiences and the potential meanings that were derived from being part of this 'moment in time'. Examples of innovations in practice emerged during the earthquake response, and a number of recommendations for nursing practice are identified. © 2013 The Authors. International Nursing Review © 2013 International Council of Nurses.

  1. Testimonies to the L'Aquila earthquake (2009) and to the L'Aquila process

    NASA Astrophysics Data System (ADS)

    Kalenda, Pavel; Nemec, Vaclav

    2014-05-01

    members with manslaughter and negligence for failing to warn the public of the impending risk. Many international organizations falsely interpreted the accusation, and the sentence at the first stage, as a matter of the impossibility of predicting earthquakes. The same situation arose when the first-instance verdict was pronounced in October 2012. This verdict, however, is based exclusively on the personal behaviour of the sentenced persons in the course of a one-hour session of the Great Risk Board in L'Aquila on March 31, 2009, and on the fact that two of them presented the results of the session to the media and the local population immediately afterwards. The terrible consequences of this irresponsible behaviour prompted the final accusation, shared by a relatively small but intellectually advanced group of families of victims of the earthquake. They all had deep confidence in the top Italian seismologists who attended the meeting of the Commission. A special INGV web site, founded by the "decreto INGV n. 641 del 19/12/2012" to ask for support letters, contains the trial documentation (http://processoaquila.wordpress.com/), including the Italian version of the verdict, unfortunately with incomplete, incorrect, or mostly missing English translations.

  2. Special issue: Terrestrial fluids, earthquakes and volcanoes: The Hiroshi Wakita volume I

    USGS Publications Warehouse

    Perez, Nemesio M.; King, Chi-Yu; Gurrieri, Sergio; McGee, Kenneth A.

    2006-01-01

    Terrestrial Fluids, Earthquakes and Volcanoes: The Hiroshi Wakita Volume I is a special publication to honor Professor Hiroshi Wakita for his scientific contributions. This volume consists of 17 original papers dealing with various aspects of the role of terrestrial fluids in earthquake and volcanic processes, which reflect Prof. Wakita’s wide scope of research interests. Professor Wakita co-founded the Laboratory for Earthquake Chemistry in 1978 and served as its director from 1988 until his retirement from the university in 1997. He has made the laboratory a leading world center for studying earthquakes and volcanic activities by means of geochemical and hydrological methods. Together with his research team and a number of foreign guest researchers that he attracted, he has made many significant contributions in the above-mentioned scientific fields of interest. This achievement is a testimony not only to his scientific talent, but also to his enthusiasm, his open-mindedness, and his drive in obtaining both human and financial support.

  3. Focal mechanisms of earthquakes in Mongolia

    NASA Astrophysics Data System (ADS)

    Sodnomsambuu, D.; Natalia, R.; Gangaadorj, B.; Munkhuu, U.; Davaasuren, G.; Danzansan, E.; Yan, R.; Valentina, M.; Battsetseg, B.

    2011-12-01

    Focal mechanism data provide information on the relative magnitudes of the principal stresses, so that a tectonic regime can be assigned. Such information is especially useful for studying seismically active intraplate regions. A study of earthquake focal mechanisms in the territory of Mongolia, a landlocked and intraplate region, was conducted. We present a map of focal mechanisms of earthquakes with M ≥ 4.5 which occurred in Mongolia and neighboring regions. Focal mechanism solutions were constrained by first-motion solutions, as well as by waveform modeling, particularly CMT solutions. Four earthquakes with magnitude of about 8 or more were recorded in Mongolia in the XX century: the 1905 M7.9 Tsetserleg and M8.4 Bolnai earthquakes, the 1931 M8.0 Fu Yun earthquake, and the 1957 M8.1 Gobi-Altai earthquake. The map of focal mechanisms of earthquakes in Mongolia, however, reveals all of the seismically active structures: Gobi Altay, Mongolian Altay, the active fringe of the Hangay dome, the Hentii range, etc. Earthquakes in most of the Mongolian territory and the neighboring regions of China are characterized by strike-slip and reverse movements. Strike-slip movements are also typical for earthquakes in the Altay Range in Russia. The north of Mongolia and the southern part of the Baikal area form a region where earthquakes with different focal mechanisms have occurred. This region is a zone of transition between the compressive regime associated with the India-Eurasia collision and the extensional structures localized in the north of the country, such as the Huvsgul area and the Baikal rift. Earthquakes in the Baikal basin itself are characterized by normal movements. Earthquakes in the Trans-Baikal zone and NW Mongolia are characterized dominantly by strike-slip movements. Based on the analysis of stress-axis orientations, the tectonic stress tensor is presented. The map of focal mechanisms of earthquakes in Mongolia could be a useful tool for researchers studying the geodynamics of Central Asia, particularly of the Mongolian and Baikal regions.

  4. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  5. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
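
    A hedged sketch of the counting idea behind such a forecast: treat the number of small earthquakes recorded in a region since the last large one as the forecast "clock" and evaluate a Weibull law in that count. The parameter values and this exact parameterization are illustrative assumptions, not the published NTW calibration.

      # Hedged sketch: conditional probability of a large event within the next dn
      # small earthquakes, given n_since_last small earthquakes since the last large one.
      import math

      def weibull_cdf(n, tau, beta):
          return 1.0 - math.exp(-((n / tau) ** beta))

      def prob_next(n_since_last, dn, tau=800.0, beta=1.4):
          """P(large event within the next dn small events | n_since_last so far)."""
          F = lambda n: weibull_cdf(n, tau, beta)
          return (F(n_since_last + dn) - F(n_since_last)) / (1.0 - F(n_since_last))

      print(prob_next(n_since_last=600, dn=100))   # illustrative numbers only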

  6. Near-Field Deformation Associated with the South Napa Earthquake (M 6.0) Using Differential Airborne LiDAR

    NASA Astrophysics Data System (ADS)

    Hudnut, K. W.; Glennie, C. L.; Brooks, B. A.; Hauser, D. L.; Ericksen, T.; Boatwright, J.; Rosinski, A.; Dawson, T. E.; Mccrink, T. P.; Mardock, D. K.; Hoirup, D. F., Jr.; Bray, J.

    2014-12-01

    Pre-earthquake airborne LiDAR coverage exists for the area impacted by the M 6.0 South Napa earthquake. The Napa watershed data set was acquired in 2003, and data sets were acquired in other portions of the impacted area in 2007, 2010 and 2014. The pre-earthquake data are being assessed and are of variable quality and point density. Following the earthquake, a coalition was formed to enable rapid acquisition of post-earthquake LiDAR. Coordination of this coalition took place through the California Earthquake Clearinghouse; consequently, a commercial contract was organized by Department of Water Resources that allowed for the main fault rupture and damaged Browns Valley area to be covered 16 days after the earthquake at a density of 20 points per square meter over a 20 square kilometer area. Along with the airborne LiDAR, aerial imagery was acquired and will be processed to form an orthomosaic using the LiDAR-derived DEM. The 'Phase I' airborne data were acquired using an Optech Orion M300 scanner, an Applanix 200 GPS-IMU, and a DiMac ultralight medium format camera by Towill. These new data, once delivered, will be differenced against the pre-earthquake data sets using a newly developed algorithm for point cloud matching, which is improved over prior methods by accounting for scan geometry error sources. Proposed additional 'Phase II' coverage would allow repeat-pass, post-earthquake coverage of the same area of interest as in Phase I, as well as an addition of up to 4,150 square kilometers that would potentially allow for differential LiDAR assessment of levee and bridge impacts at a greater distance from the earthquake source. Levee damage was reported up to 30 km away from the epicenter, and proposed LiDAR coverage would extend up to 50 km away and cover important critical lifeline infrastructure in the western Sacramento River delta, as well as providing full post-earthquake repeat-pass coverage of the Napa watershed to study transient deformation.
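
    The point-cloud matching algorithm mentioned above is not reproduced here; as a much simpler baseline, the sketch below grids pre- and post-earthquake LiDAR returns into mean-elevation surfaces and differences them, which illustrates the basic idea of mapping vertical change. The grid spacing and the mean-gridding scheme are assumptions.

      # Hedged baseline sketch: grid LiDAR returns to a DEM and difference two epochs.
      import numpy as np

      def grid_mean_dem(x, y, z, xmin, ymin, nx, ny, cell=1.0):
          """Bin points (x, y, z in metres) into a grid of mean elevations."""
          dem_sum = np.zeros((ny, nx))
          count = np.zeros((ny, nx))
          ix = ((np.asarray(x) - xmin) / cell).astype(int)
          iy = ((np.asarray(y) - ymin) / cell).astype(int)
          ok = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)
          np.add.at(dem_sum, (iy[ok], ix[ok]), np.asarray(z)[ok])
          np.add.at(count, (iy[ok], ix[ok]), 1)
          with np.errstate(divide="ignore", invalid="ignore"):
              return dem_sum / count          # NaN where a cell has no returns

      # Vertical change map (positive where the surface moved up), e.g.:
      # dz = grid_mean_dem(x_post, y_post, z_post, ...) - grid_mean_dem(x_pre, y_pre, z_pre, ...)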

  7. Earthquake predictions using seismic velocity ratios

    USGS Publications Warehouse

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions for the reduction of human and economic losses and the value of long-range earthquake prediction to planning are obvious. Not as clear are the long-range economic and social impacts of an earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  8. Is there a basis for preferring characteristic earthquakes over a Gutenberg–Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, Thomas E.; Geist, Eric L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
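
    A minimal illustration of the rate calculation implied by a Gutenberg-Richter extrapolation, assuming a b-value of 1.0 and placeholder catalogue numbers rather than values from the paper.

      # Hedged sketch: extrapolate a Gutenberg-Richter law fitted to the catalogued
      # small/moderate events on a fault zone to a large-earthquake recurrence rate.
      def gr_rate(n_obs, m_complete, m_target, b=1.0):
          """Annual rate of events >= m_target, given n_obs events/yr >= m_complete."""
          # Gutenberg-Richter: log10 N(>=M) = a - b*M, so rates scale as 10**(-b*dM).
          return n_obs * 10.0 ** (-b * (m_target - m_complete))

      rate_m7 = gr_rate(n_obs=2.0, m_complete=4.0, m_target=7.0)      # events/yr >= M7
      print(rate_m7, "events/yr ->", 1.0 / rate_m7, "yr mean recurrence")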

  9. Smoking prevalence increases following Canterbury earthquakes.

    PubMed

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch. Semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased reported the earthquake and subsequent lifestyle changes as a reason for smoking more. 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence. Tobacco consumption levels increased in around one-third of current smokers.

  10. Defining "Acceptable Risk" for Earthquakes Worldwide

    NASA Astrophysics Data System (ADS)

    Tucker, B.

    2001-05-01

    The greatest and most rapidly growing earthquake risk for mortality is in developing countries. Further, earthquake risk management actions of the last 50 years have reduced the average lethality of earthquakes in earthquake-threatened industrialized countries. (This is separate from the trend of the increasing fiscal cost of earthquakes there.) Despite these clear trends, every new earthquake in developing countries is described in the media as a "wake-up" call, announcing the risk these countries face. GeoHazards International (GHI) works at both the community and the policy levels to try to reduce earthquake risk. GHI reduces death and injury by helping vulnerable communities recognize their risk and the methods to manage it: raising awareness of that risk, building local institutions to manage it, and strengthening schools to protect and train the community's future generations. At the policy level, GHI, in collaboration with research partners, is examining whether "acceptance" of these large risks by people in these countries and by international aid and development organizations explains the lack of activity in reducing these risks. The goal of this pilot project - the Global Earthquake Safety Initiative (GESI) - is to develop and evaluate a means of measuring the risk and the effectiveness of risk mitigation actions in the world's largest, most vulnerable cities: in short, to develop an earthquake risk index. One application of this index is to compare the risk and the risk mitigation efforts of "comparable" cities. By this means, Lima, for example, can compare the risk of its citizens dying due to earthquakes with the risk of citizens in Santiago and Guayaquil. The authorities of Delhi and Islamabad can compare the relative risk from earthquakes of their school children. This index can be used to measure the effectiveness of alternate mitigation projects, to set goals for mitigation projects, and to plot progress toward meeting those goals. The preliminary

  11. Minimum of the order parameter fluctuations of seismicity before major earthquakes in Japan.

    PubMed

    Sarlis, Nicholas V; Skordas, Efthimios S; Varotsos, Panayiotis A; Nagao, Toshiyasu; Kamogawa, Masashi; Tanaka, Haruo; Uyeda, Seiya

    2013-08-20

    It has been shown that some dynamic features hidden in the time series of complex systems can be uncovered if we analyze them in a time domain called natural time χ. The order parameter of seismicity introduced in this time domain is the variance of χ weighted for normalized energy of each earthquake. Here, we analyze the Japan seismic catalog in natural time from January 1, 1984 to March 11, 2011, the day of the M9 Tohoku earthquake, by considering a sliding natural time window of fixed length comprised of the number of events that would occur in a few months. We find that the fluctuations of the order parameter of seismicity exhibit distinct minima a few months before all of the shallow earthquakes of magnitude 7.6 or larger that occurred during this 27-y period in the Japanese area. Among the minima, the minimum before the M9 Tohoku earthquake was the deepest. It appears that there are two kinds of minima, namely precursory and nonprecursory, to large earthquakes.
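
    A minimal sketch of the order parameter described above: with N events in a window, the natural time of the k-th event is chi_k = k/N and each event is weighted by its normalized energy, so the order parameter is the energy-weighted variance of chi. The energy-from-magnitude conversion and the window length below are assumptions for illustration only.

      # Hedged sketch: order parameter of seismicity in natural time and its fluctuation.
      import numpy as np

      def kappa1(magnitudes):
          m = np.asarray(magnitudes, dtype=float)
          energy = 10.0 ** (1.5 * m)                # relative radiated energy (assumption)
          p = energy / energy.sum()                 # normalized energy weights
          chi = np.arange(1, len(m) + 1) / len(m)   # natural time of each event
          return np.sum(p * chi**2) - np.sum(p * chi) ** 2

      def kappa1_fluctuation(magnitudes, window=300):
          """Variability of kappa1 over sliding windows of `window` consecutive events."""
          k = [kappa1(magnitudes[i:i + window]) for i in range(len(magnitudes) - window + 1)]
          return np.std(k) / np.mean(k)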

  12. Cyclic migration of weak earthquakes between Lunigiana earthquake of October 10, 1995 and Reggio Emilia earthquake of October 15, 1996 (Northern Italy)

    NASA Astrophysics Data System (ADS)

    di Giovambattista, R.; Tyupkin, Yu

    The cyclic migration of weak earthquakes (M ≥ 2.2) which occurred during the year prior to the October 15, 1996 (M = 4.9) Reggio Emilia earthquake is discussed in this paper. The onset of this migration was associated with the occurrence of the October 10, 1995 (M = 4.8) Lunigiana earthquake about 90 km southwest from the epicenter of the Reggio Emilia earthquake. At least three series of earthquakes migrating from the epicentral area of the Lunigiana earthquake in the northeast direction were observed. The migration of earthquakes of the first series terminated at a distance of about 30 km from the epicenter of the Reggio Emilia earthquake. The earthquake migration of the other two series halted at about 10 km from the Reggio Emilia epicenter. The average rate of earthquake migration was about 200-300 km/year, while the time of recurrence of the observed cycles varied from 68 to 178 days. Weak earthquakes migrated along the transversal fault zones and sometimes jumped from one fault to another. A correlation between the migrating earthquakes and tidal variations is analysed. We discuss the hypothesis that the analyzed area is in a state of stress approaching the limit of the long-term durability of crustal rocks and that the observed cyclic migration is a result of a combination of a more or less regular evolution of tectonic and tidal variations.

  13. Re-examination of the original questionnaire documents for the 1944 Tonankai, 1945 Mikawa, and 1946 Nankai earthquakes

    NASA Astrophysics Data System (ADS)

    Harada, Tomoya; Satake, Kenji; Furumura, Takashi

    2016-04-01

    With the objective of estimating seismic intensity, the Earthquake Research Institute (ERI) of the University of Tokyo performed questionnaire surveys for the significant (destructive or large/great) earthquakes from 1943 to 1988 (Kayano, 1990, BERI). In these surveys, Kawasumi (1943)'s 12-class seismic intensity scale, similar to the Modified Mercalli scale (MM scale), was used. Survey results for earthquakes after 1950 were well investigated and published (e.g. Kayano and Komaki, 1977, BERI; Kayano and Sato, 1975, BERI), but the survey results for earthquakes in the 1940s have not been published and the original documents of the surveys were missing. Recently, the original sheets of the surveys for the five earthquakes in the 1940s with more than 1,000 casualties were discovered in the ERI warehouse, although they are incomplete (Tsumura et al., 2010). They are from the 1943 Tottori (M 7.2), 1944 Tonankai (M 7.9), 1945 Mikawa (M 6.8), 1946 Nankai (M 8.0), and 1948 Fukui (M 7.1) earthquakes. In this study, we examined the original questionnaire and summary sheets for the 1944 Tonankai, 1945 Mikawa, and 1946 Nankai earthquakes, and estimated the distributions of seismic intensity, various kinds of damage, and human behaviors in detail. The numbers of survey points for the 1944, 1945, and 1946 events are 287, 145, and 1,014, respectively. The numbers for the 1944 and 1945 earthquakes are much smaller than that for the 1946 event, because they occurred during the last years of World War II. The 1944 seismic intensities in the prefectures near the source region (Aichi, Mie, Shizuoka, and Gifu Pref.) tend to be high. However, the 1944 intensities are also high and the damage is serious at the Suwa Lake shore in Nagano Pref., which is about 240 km from the source region, because seismic waves are amplified dramatically in the thick sediment of the Suwa Basin. Seismic intensities of the 1945 Mikawa earthquake near the source region in Aichi Pref. were very high (X-XI). However, the

  14. Post-Traumatic Stress Disorder and other mental disorders in the general population after Lorca’s earthquakes, 2011 (Murcia, Spain): A cross-sectional study

    PubMed Central

    Salmerón, Diego; Vilagut, Gemma; Tormo, Mª José; Ruíz-Merino, Guadalupe; Escámez, Teresa; Júdez, Javier; Martínez, Salvador; Koenen, Karestan C.; Navarro, Carmen; Alonso, Jordi; Kessler, Ronald C.

    2017-01-01

    Aims To describe the prevalence and severity of mental disorders and to examine differences in risk among those with and without a lifetime history prior to a moderate-magnitude earthquake that took place in Lorca (Murcia, Spain) at roughly the mid-point (on May 11, 2011) of the time interval in which a regional epidemiological survey was already being carried out (June 2010 – May 2012). Methods The PEGASUS-Murcia project is a cross-sectional face-to-face interview survey of a representative sample of non-institutionalized adults in Murcia. Main outcome measures are prevalence and severity of anxiety, mood, impulse and substance disorders in the 12 months prior to the survey, assessed using the Composite International Diagnostic Interview (CIDI 3.0). Sociodemographic variables, prior history of any mental disorder and earthquake-related stressors were entered as independent variables in a logistic regression analysis. Findings A total of 412 participants (response rate: 71%) were interviewed. Significant differences in 12-month prevalence of mental disorders were found in Lorca compared to the rest of Murcia for any disorder (12.8% vs 16.8%), PTSD (3.6% vs 0.5%) and other anxiety disorders (5.3% vs 9.2%) (p ≤ 0.05 for all). No differences were found for 12-month prevalence of any mood or any substance disorder. The two major predictors of developing a 12-month post-earthquake mental disorder were a prior mental disorder and the level of exposure. Other risk factors included female sex and low-average income. Conclusions PTSD and other mental disorders are commonly associated with earthquake disasters. Prior mental disorders and the level of exposure to the earthquakes are the most important factors in the development of a consequent mental disorder, and this recognition may help to identify those individuals who may benefit most from specific therapeutic intervention. PMID:28723949

  15. Quantified sensitivity of lakes to record historic earthquakes: Implications for paleoseismology

    NASA Astrophysics Data System (ADS)

    Wilhelm, Bruno; Nomade, Jerome; Crouzet, Christian; Litty, Camille; Belle, Simon; Rolland, Yann; Revel, Marie; Courboulex, Françoise; Arnaud, Fabien; Anselmetti, Flavio S.

    2015-04-01

    Seismic hazard assessment is a challenging issue for modern societies. A key parameter to be estimated is the recurrence interval of damaging earthquakes. In moderately active seismo-tectonic regions, this requires the establishment of earthquake records long enough to be relevant, i.e. far longer than historical observations. Here we investigate how lake sediments can be used for this purpose and quantify the conditions that enable earthquake recording. For this purpose, (i) we studied nine lake-sediment sequences to reconstruct mass-movement chronicles in different settings of the French Alpine range and (ii) we compared the chronicles to the well-documented earthquake history of the last five centuries. The studied lakes are all small alpine-type lakes resting directly on bedrock. All lake sequences have been studied following the same methodology: (i) a multi-core approach to understand the sedimentary processes within the lake basins, (ii) a high-resolution lithological and grain-size characterization and (iii) dating based on short-lived radionuclide measurements, lead contamination and radiocarbon ages. We identified 40 deposits related to 26 mass-movement (MM) occurrences. 46% (12 of 26) of the MMs are synchronous in neighbouring lakes, strongly supporting an earthquake origin. In addition, the good agreement between MM ages and historical earthquake dates suggests an earthquake trigger for 88% (23 of 26) of them. Related epicenters are always located at distances of less than 100 km from the lakes and their epicentral MSK intensity ranges between VII and IX. However, the number of earthquake-triggered MMs varies between lakes of the same region, suggesting a graded sensitivity of the lake sequences to earthquake shaking, i.e. distinct lake-sediment slope stabilities. The quantification of this earthquake sensitivity and the comparison to the lake system and sediment characteristics suggest that the primary factor explaining this variability is

  16. Application and analysis of debris-flow early warning system in Wenchuan earthquake-affected area

    NASA Astrophysics Data System (ADS)

    Liu, D. L.; Zhang, S. J.; Yang, H. J.; Zhao, L. Q.; Jiang, Y. H.; Tang, D.; Leng, X. P.

    2016-02-01

    The activity of debris flows (DFs) in the Wenchuan earthquake-affected area increased significantly after the earthquake of 12 May 2008. The safety of the lives and property of local people is threatened by DFs. A physics-based early warning system (EWS) for DF forecasting was developed and applied in this earthquake area. This paper introduces an application of the system in the Wenchuan earthquake-affected area and analyzes the prediction results via a comparison to the DF events triggered by the strong rainfall events reported by the local government. The prediction accuracy and efficiency were first compared with those of a contribution-factor-based system currently used by the weather bureau of Sichuan province. The storm of 17 August 2012 was used as a case study for this comparison. The comparison shows that the false negative rate and the false positive rate of the new system are, respectively, 19% and 21% lower than those of the contribution-factor-based system. Consequently, the prediction accuracy is clearly higher than that of the contribution-factor-based system, with a higher operational efficiency as well. At the invitation of the weather bureau of Sichuan province, the authors upgraded the DF prediction system to this new system before the 2013 monsoon season in the Wenchuan earthquake-affected area. Two prediction cases, on 9 July 2013 and 10 July 2014, were chosen to further demonstrate that the new EWS has high stability, efficiency, and prediction accuracy.

  17. Urban Policies and Earthquake Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Sarlo, Antonella

    2008-07-01

    The paper proposes some considerations arising from recent research on earthquake risk mitigation that combines mitigation policies and actions with urban development strategies. The objective was to go beyond the classical methodological approach, which aims at a "technical" evaluation of earthquake risk through a procedure correlating the three components of hazard, exposure and vulnerability. These studies experiment, in terms of methodology and application, with a new category of interpretation and strategy: the so-called Struttura Urbana Minima (minimum urban structure). The introduction of the Struttura Urbana Minima establishes a different approach to the theme of safety in the field of earthquake risk, since it leads to a wider viewpoint, combining the building aspect of the issue with the purely urban one and involving not only town planning but also social and managerial implications. In this sense the constituent logic of these studies rests on two fundamental issues: - the social awareness of earthquakes; - the inclusion of mitigation policies in the ordinary strategies for town and territory management. Three main aspects of the first point, the "social awareness of earthquakes", characterize this issue and must be considered within a prevention policy: - the central role of risk as a social production; - the central role of local community consent; - the central role of the local community's capability to plan. Therefore consent, considered not only as acceptance but above all as participation in the elaboration and implementation of choices, plays a crucial role in the wider issue of prevention policies. As far as the second point is concerned, the inclusion of preventive mitigation policies in ordinary strategies for town and territory management demands the identification of criteria of choice and priorities of intervention and

  18. A Sensitivity Analysis of Tsunami Inversions on the Number of Stations

    NASA Astrophysics Data System (ADS)

    An, Chao; Liu, Philip L.-F.; Meng, Lingsen

    2018-05-01

    Current finite-fault inversions of tsunami recordings generally adopt as many tsunami stations as possible to better constrain earthquake source parameters. In this study, inversions are evaluated by the waveform residual that measures the difference between model predictions and recordings, and the dependence of the quality of inversions on the number of tsunami stations is derived. Results for the 2011 Tohoku event show that, if the tsunami stations are optimally located, the waveform residual decreases significantly with the number of stations when the number is 1 to 4 and remains almost constant when the number is larger than 4, indicating that 2 to 4 stations are able to recover the main characteristics of the earthquake source. The optimal location of tsunami stations is explained in the text. A similar analysis is applied to the Manila Trench in the South China Sea using artificially generated earthquakes and virtual tsunami stations. Results confirm that 2 to 4 stations are necessary and sufficient to constrain the earthquake source parameters, and the optimal sites for stations are recommended in the text. The conclusion is useful for the design of new tsunami warning systems. Current strategies of tsunameter network design mainly focus on the early detection of tsunami waves from potential sources to coastal regions. We therefore recommend that, in addition to the current strategies, the waveform residual also be taken into consideration so as to minimize the error of tsunami wave prediction for warning purposes.
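
    One common way to define such a waveform residual is a normalized L2 misfit summed over stations, sketched below; the exact normalization used in the paper may differ.

      # Hedged sketch: normalized L2 waveform residual between recorded and predicted
      # tsunami waveforms at a set of stations.
      import numpy as np

      def waveform_residual(observed, predicted):
          """observed, predicted: lists of equal-length 1-D arrays, one per station."""
          num = sum(np.sum((o - p) ** 2) for o, p in zip(observed, predicted))
          den = sum(np.sum(o ** 2) for o in observed)
          return np.sqrt(num / den)

      # Station-count experiment: recompute the residual as stations are added one at a
      # time (1, 2, ..., n) and look for where it stops improving, as in the study above.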

  19. Liquefaction-induced lateral spreading in Oceano, California, during the 2003 San Simeon Earthquake

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.; Di Alessandro, Carola; Boatwright, John; Tinsley, John C.; Sell, Russell W.; Rosenberg, Lewis I.

    2004-01-01

    the 2003 damage was caused by lateral spreading in two separate areas, one near Norswing Drive and the other near Juanita Avenue. The areas coincided with areas with the highest liquefaction potential found in Oceano. Areas with site amplification conditions similar to those in Oceano are particularly vulnerable to earthquakes. Site amplification may cause shaking from distant earthquakes, which normally would not cause damage, to increase locally to damaging levels. The vulnerability in Oceano is compounded by the widespread distribution of highly liquefiable soils that will reliquefy when ground shaking is amplified as it was during the San Simeon earthquake. The experience in Oceano can be expected to repeat because the region has many active faults capable of generating large earthquakes. In addition, liquefaction and lateral spreading will be more extensive for moderate-size earthquakes that are closer to Oceano than was the 2003 San Simeon earthquake. Site amplification and liquefaction can be mitigated. Shaking is typically mitigated in California by adopting and enforcing up-to-date building codes. Although not a guarantee of safety, application of these codes ensures that the best practice is used in construction. Building codes, however, do not always require the upgrading of older structures to new code requirements. Consequently, many older structures may not be as resistant to earthquake shaking as new ones. For older structures, retrofitting is required to bring them up to code. Seismic provisions in codes also generally do not apply to nonstructural elements such as drywall, heating systems, and shelving. Frequently, nonstructural damage dominates the earthquake loss. Mitigation of potential liquefaction in Oceano presently is voluntary for existing buildings, but required by San Luis Obispo County for new construction. Multiple mitigation procedures are available to individual property owners. These procedures typically involve either

  20. Earthquake-triggered landslides along the Hyblean-Malta Escarpment (off Augusta, eastern Sicily, Italy) - assessment of the related tsunamigenic potential

    NASA Astrophysics Data System (ADS)

    Ausilia Paparo, Maria; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Tinti, Stefano

    2017-02-01

    Eastern Sicily is affected by earthquakes and tsunamis of local and remote origin, as is known from numerous historical chronicles. Recent studies have put emphasis on the role of submarine landslides as the direct cause of the main local tsunamis, envisaging that the earthquakes (in 1693 and 1908) did produce a tsunami, but also that they triggered mass failures that were able to generate an even larger tsunami. The debate is still open, and though no general consensus has been reached among scientists so far, this research has had the merit of attracting attention to the possible generation of tsunamis by landslides off Sicily. In this paper we investigate the tsunami potential of mass failures along one sector of the Hyblean-Malta Escarpment (HME) facing Augusta. The HME is the main offshore geological structure of the region, running almost parallel to the coast off eastern Sicily. Here, bottom morphology and slope steepness favour soil failures. In our work we study slope stability under seismic load along a number of HME transects by using the Minimum Lithostatic Deviation (MLD) method, which is based on limit-equilibrium theory. The main goal is to identify sectors of the HME that could be unstable under the effect of realistic earthquakes. We estimate the possible landslide volume and use it as input for numerical codes to simulate the landslide motion and the consequent tsunami. This is an important step for the assessment of the tsunami hazard in eastern Sicily and for local tsunami mitigation policies. It is also important in view of tsunami warning systems, since it can help to identify the minimum earthquake magnitude capable of triggering destructive landslide-induced tsunamis, and therefore to set up appropriate knowledge-based criteria for launching alerts to the population.
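
    The MLD method itself is not reproduced here; as a stand-in for the same limit-equilibrium idea, the sketch below computes the classic pseudo-static factor of safety for a dry infinite slope under a horizontal seismic coefficient, with placeholder material values.

      # Hedged sketch: pseudo-static limit-equilibrium factor of safety (not the MLD method).
      import math

      def pseudo_static_fs(slope_deg, depth_m, unit_weight=19e3, cohesion=20e3,
                           phi_deg=28.0, k_h=0.15):
          """Factor of safety for a dry infinite slope with horizontal seismic load k_h."""
          b = math.radians(slope_deg)
          phi = math.radians(phi_deg)
          sigma_v = unit_weight * depth_m                                  # vertical stress (Pa)
          normal = sigma_v * (math.cos(b) ** 2 - k_h * math.sin(b) * math.cos(b))
          driving = sigma_v * (math.sin(b) * math.cos(b) + k_h * math.cos(b) ** 2)
          return (cohesion + normal * math.tan(phi)) / driving

      print(pseudo_static_fs(slope_deg=20.0, depth_m=30.0))   # FS < 1 suggests failure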

  1. Recurrent slow slip event likely hastened by the 2011 Tohoku earthquake

    PubMed Central

    Hirose, Hitoshi; Kimura, Hisanori; Enescu, Bogdan; Aoi, Shin

    2012-01-01

    Slow slip events (SSEs) are a mode of fault deformation distinct from the fast faulting of regular earthquakes. Such transient episodes have been observed at plate boundaries in a number of subduction zones around the globe. The SSEs near the Boso Peninsula, central Japan, are among the best documented SSEs, with the longest repeating history, of almost 30 y, and a recurrence interval of 5 to 7 y. A remarkable characteristic of the slow slip episodes is the accompanying earthquake swarm activity. Our stable, long-term seismic observations enable us to detect SSEs using the recorded earthquake catalog, by considering an earthquake swarm as a proxy for a slow slip episode. Six recurrent episodes are identified in this way since 1982. The average duration of the SSE interoccurrence interval is 68 mo; however, there are significant fluctuations about this mean. While a regular cycle can be explained using a simple physical model, the mechanisms that are responsible for the observed fluctuations are poorly known. Here we show that the latest SSE in the Boso Peninsula was likely hastened by the stress transfer from the March 11, 2011 great Tohoku earthquake. Moreover, a similar mechanism accounts for the delay of an SSE in 1990 by a nearby earthquake. The low stress buildups and drops during the SSE cycle can explain the strong sensitivity of these SSEs to stress transfer from external sources. PMID:22949688

  2. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    PubMed Central

    Dong, Zhi-hui; Yang, Zhi-gang; Chen, Tian-wu; Chu, Zhi-gang; Deng, Wen; Shao, Heng

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, Risk Ratio (RR) = 2.2; p<0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients, RR = 1.2; p<0.05) or flail chest (45/143 vs. 11/66 patients, RR = 1.9; p<0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients, RR = 1.7; p<0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients, RR = 1.4; p<0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR = 1.5 for parenchymal and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p<0.001). Non-rib fractures and pulmonary parenchymal and pleural injuries had a significant positive correlation with rib fractures in these two cohorts. CONCLUSIONS: Thoracic crush traumas resulting from the earthquake were life threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures and pulmonary parenchymal and pleural injuries. PMID:21789386
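
    A small check of the risk-ratio arithmetic reported above (143 of 215 exposed versus 66 of 215 unexposed rib-fracture cases giving RR of about 2.2):

      # Risk ratio = incidence in the exposed cohort / incidence in the unexposed cohort.
      def risk_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
          return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

      print(round(risk_ratio(143, 215, 66, 215), 1))   # -> 2.2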

  3. A New Correlation of Large Earthquakes Along the Southern San Andreas Fault

    NASA Astrophysics Data System (ADS)

    Scharer, K. M.; Weldon, R. J.; Biasi, G. P.

    2010-12-01

    There are now three sites on the southern San Andreas fault (SSAF) with records of 10 or more dated ground-rupturing earthquakes (Frazier Mountain, Wrightwood and Pallett Creek) and at least seven other sites with 3-5 dated events. Numerous sites have related information, including geomorphic offsets caused by one to a few earthquakes, a known amount of slip spanning a specific interval of time or number of earthquakes, or the number (but not necessarily the exact ages) of earthquakes in an interval of time. We use this information to construct a record of recent large earthquakes on the SSAF. Strongly overlapping C-14 age ranges, especially between closely spaced sites like Pallett Creek and Wrightwood on the Mojave segment and Thousand Palms, Indio, Coachella and Salt Creek on the southernmost 100 km of the fault, and overlap between the more distant Frazier Mountain and Bidart Fan sites on the northernmost part of the fault, suggest that the paleoseismic data are robust and can be explained by a relatively small number of events that span substantial portions of the fault. This is consistent with the extent of rupture of the two historic events (1857 was ~300 km long and 1812 was 100-200 km long); slip-per-event data that average 3-5 m per event at most sites; and the long historical hiatus since 1857. While some sites have smaller offsets for individual events, correlation between sites suggests that many small offsets are near the end of long ruptures. While the long event series on the Mojave segment are quasi-periodic, individual intervals range over about an order of magnitude, from a few decades up to ~200 years. This wide range of intervals and the apparent anti-slip-predictable behavior of ruptures (small intervals are not followed by small events) suggest weak clustering or periods of time spanning multiple intervals when strain release is higher or lower than average. These properties defy the application of simple hazard analysis but need to be understood to

  4. The 2012 Emilia (Northern Italy) earthquake sequence: an attempt of historical reading

    NASA Astrophysics Data System (ADS)

    Graziani, L.; Bernardini, F.; Castellano, C.; Del Mese, S.; Ercolani, E.; Rossi, A.; Tertulliani, A.; Vecchi, M.

    2015-04-01

    In May-June 2012, the Po Valley (Northern Italy) was struck by an earthquake sequence whose strongest event occurred on 20 May (Mw 5.9). The intensity values (Imax 7-8 EMS98) assessed through macroseismic field surveys seemed inadequate to describe the whole range of effects observed, especially those on the monumental heritage, which suffered very heavy damage and destruction. The observed intensities were in fact significantly lower than those we would have expected for a Mw 5.9 event in Italy. As magnitude-intensity regressions are mainly based on historical earthquake data, we address this issue by going back in time and debating the following hypotheses: (a) the 2012 Emilia earthquake sequence shows lower intensity values than expected because the affected urban context is more heterogeneous and much less vulnerable than that of the past; (b) some historical earthquakes, especially those that occurred centuries ago and are documented by little information, could show a tendency to be overestimated in intensity, and consequently in magnitude. In order to give consistency to these hypotheses, we have introduced, as a test, a dual historical reading of the 2012 Emilia earthquake sequence as if it had occurred in the past: the first reading refers to a period prior to the introduction of concrete in buildings, assessing the intensity on traditional masonry buildings only. A further historical reading, assessed by using information on monumental buildings only, was performed; it can be roughly referred to the XVI-XVII centuries. In both cases, the intensity values tend to grow significantly. If confirmed on a larger scale, the results could have a relevant impact when considered for seismic hazard assessments.

  5. Estimating the Maximum Magnitude of Induced Earthquakes With Dynamic Rupture Simulations

    NASA Astrophysics Data System (ADS)

    Gilmour, E.; Daub, E. G.

    2017-12-01

    Seismicity in Oklahoma has been sharply increasing as the result of wastewater injection. The earthquakes, thought to be induced by changes in pore pressure due to fluid injection, nucleate along existing faults. Induced earthquakes currently dominate central and eastern United States seismicity (Keranen et al. 2016). Induced earthquakes have only been occurring in the central US for a short time; therefore, too few induced earthquakes have been observed in this region to know their maximum magnitude. The lack of knowledge regarding the maximum magnitude of induced earthquakes means that large uncertainties exist in the seismic hazard for the central United States. While induced earthquakes follow the Gutenberg-Richter relation (van der Elst et al. 2016), it is unclear if there are limits to their magnitudes. An estimate of the maximum magnitude of the induced earthquakes is crucial for understanding their impact on seismic hazard. While other estimates of the maximum magnitude exist, those estimates are observational or statistical, and cannot take into account the possibility of larger events that have not yet been observed. Here, we take a physical approach to studying the maximum magnitude based on dynamic rupture simulations. We run a suite of two-dimensional rupture simulations to physically determine how ruptures propagate. The simulations use the known parameters of principal stress orientation and rupture locations. We vary the other unknown parameters of the rupture simulations to obtain a large number of rupture simulation results reflecting different possible sets of parameters, and use these results to train a neural network to complete the rupture simulations. Then, using a Markov chain Monte Carlo method to check different combinations of parameters, the trained neural network is used to create synthetic magnitude-frequency distributions for comparison with the real earthquake catalog. This method allows us to find sets of parameters that are
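
    A hedged sketch of the parameter-search step described above: a Metropolis-style walk over rupture-model parameters, scoring each candidate by the misfit between its synthetic magnitude-frequency distribution and an observed one. The surrogate function standing in for the trained neural network, the single free parameter, and the misfit definition are placeholders, not the authors' model.

      # Hedged sketch: Metropolis sampling of a rupture-model parameter against an
      # observed magnitude-frequency distribution, via a placeholder surrogate model.
      import numpy as np

      rng = np.random.default_rng(0)
      bins = np.arange(3.0, 6.1, 0.2)            # magnitude bin edges

      def surrogate_mfd(theta):
          # Placeholder for the neural-network surrogate: counts per bin for a
          # hypothetical parameter theta (here acting like an effective b-value).
          return 1000.0 * 10.0 ** (-theta * (bins[:-1] - 3.0))

      def misfit(theta, observed_counts):
          synthetic = surrogate_mfd(theta)
          return np.sum((np.log10(synthetic + 1) - np.log10(observed_counts + 1)) ** 2)

      observed = surrogate_mfd(1.1) + rng.poisson(5, size=len(bins) - 1)  # fake catalogue

      theta, chain = 0.8, []
      for _ in range(5000):
          proposal = theta + rng.normal(0, 0.05)
          # Accept with probability exp(old_misfit - new_misfit) (unit "temperature").
          if np.log(rng.random()) < misfit(theta, observed) - misfit(proposal, observed):
              theta = proposal
          chain.append(theta)
      print("posterior mean of the b-like parameter:", np.mean(chain[1000:]))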

  6. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced the policy of countermeasures against earthquake disasters, including earthquake hazard evaluation, to be changed in Japan. Before March 11, Japanese earthquake hazard evaluation was based on the history of repeatedly occurring earthquakes and the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using the renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of damages by a hypothetical Mw9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with the current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using the present techniques, based on the diversity of earthquake phenomena. These reports created sensations throughout the country, and local governments are struggling to prepare countermeasures. These reports comment on the large uncertainty in their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including the authors, are involved in
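
    A minimal sketch of the renewal-model calculation mentioned above: the conditional probability of an earthquake in the next dt years given the time elapsed since the last one. A lognormal recurrence distribution and the example numbers are assumptions; Japanese evaluations typically use a Brownian passage time model, which behaves similarly.

      # Hedged sketch: conditional probability from a renewal model,
      # P = [F(t + dt) - F(t)] / [1 - F(t)], with a lognormal recurrence distribution.
      import math

      def lognormal_cdf(t, median, sigma):
          return 0.5 * (1.0 + math.erf(math.log(t / median) / (sigma * math.sqrt(2.0))))

      def conditional_probability(t_elapsed, dt, median_recurrence, sigma=0.25):
          F = lambda t: lognormal_cdf(t, median_recurrence, sigma)
          return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

      # e.g. 30-yr probability, 110 yr after the previous event, 120 yr median recurrence
      print(conditional_probability(t_elapsed=110.0, dt=30.0, median_recurrence=120.0))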

  7. SHERPA: Towards better accessibility of earthquake rupture archives

    NASA Astrophysics Data System (ADS)

    Théo, Yann; Sémo, Emmanuel; Mazet Roux, Gilles; Bossu, Rémy; Kamb, Linus; Frobert, Laurent

    2010-05-01

    Large crustal earthquakes are the subject of extensive field surveys in order to better understand the rupture process and its tectonic consequences. After an earthquake, pictures of the rupture can be viewed quite easily on the web. However, once the event gets old, pictures disappear and can no longer be viewed, a heavy loss for researchers looking for information. Even when available, they are linked to a given survey, and comparison of the same phenomenon between different earthquakes cannot easily be performed. SHERPA, the Sharing of Earthquake Rupture Pictures Archive, a web application developed at EMSC, aims to fill this void. It aims at making available pictures of past earthquakes and sharing resources while strictly protecting the authors' copyright and keeping the authors in charge of the diffusion, to avoid unfair or inappropriate use of the photos. Our application is targeted at scientists and scientists only. Pictures uploaded on SHERPA are marked by a watermark "NOT FOR PUBLICATION" spread all over, and state the author's name. Authors and authors only have the possibility to remove this mark should they want their work to enter the public domain. If a user sees a picture he/she would like to use, he/she can put this picture in his/her cart. After the validation of this cart, a request (stating the name and purposes of the requestor) will be sent to the author(s), asking to share the picture(s). If an author accepts this request, the requestor will be given the authorization to access a protected folder and download the unmarked picture. Without the author's explicit consent, no picture will ever be accessible to anyone. We want to state this point very clearly because ownership and copyright protection are essential to the SHERPA project. Uploading pictures is quick and easy: once registered, you can very simply upload pictures that can then be geolocalized using a Google map embedded in the web site. If the camera is equipped with a GPS, the software

  8. Sense of Community and Depressive Symptoms among Older Earthquake Survivors Following the 2008 Earthquake in Chengdu China

    ERIC Educational Resources Information Center

    Li, Yawen; Sun, Fei; He, Xusong; Chan, Kin Sun

    2011-01-01

    This study examined the impact of an earthquake as well as the role of sense of community as a protective factor against depressive symptoms among older Chinese adults who survived an 8.0 magnitude earthquake in 2008. A household survey of a random sample was conducted 3 months after the earthquake and 298 older earthquake survivors participated…

  9. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  10. Energy Partition and Variability of Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, H.

    2003-12-01

    During an earthquake the potential energy (strain energy + gravitational energy + rotational energy) is released, and the released potential energy (ΔW) is partitioned into radiated energy (ER), fracture energy (EG), and thermal energy (EH). How ΔW is partitioned into these energies controls the behavior of an earthquake. The merit of the slip-weakening concept is that only ER and EG control the dynamics, and EH can be treated separately to discuss the thermal characteristics of an earthquake. In general, if EG/ER is small, the event is "brittle"; if EG/ER is large, the event is "quasi-static" or, in more common terms, a "slow earthquake" or "creep". If EH is very large, the event may well be called a thermal runaway rather than an earthquake. The difference in energy partition has important implications for rupture initiation, rupture evolution and the excitation of long-period ground motions from very large earthquakes. We review the current state of knowledge on this problem in light of seismological observations and the basic physics of fracture. With seismological methods, we can measure only ER and the lower bound of ΔW, ΔW0; estimation of the other energies involves many assumptions. ER: Although ER can be directly measured from the radiated waves, its determination is difficult because a large fraction of the energy radiated at the source is attenuated during propagation. With the commonly used teleseismic and regional methods, we can directly measure more than 10% of the total radiated energy only for events with MW > 7 and MW > 4, respectively. The rest must be estimated after correction for attenuation. Thus, large uncertainties are involved, especially for small earthquakes. ΔW0: To estimate ΔW0, an estimate of the source dimension is required. Again, the source dimension can be estimated reliably only for large earthquakes. With the source dimension, the static stress drop, ΔσS, and ΔW0 can be estimated. EG: Seismologically, EG is the energy
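
    The energy balance described above can be written compactly as a worked equation; the radiation efficiency in the second expression is a standard auxiliary quantity added here only for illustration and is not defined in the abstract itself:

      \[
        \Delta W = E_R + E_G + E_H ,
        \qquad
        \eta_R = \frac{E_R}{E_R + E_G} .
      \]

    Events with E_G/E_R << 1 (eta_R close to 1) correspond to the "brittle" end member, while E_G/E_R >> 1 (eta_R close to 0) corresponds to the "quasi-static" (slow or creep-like) end member.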

  11. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Trugman, Daniel Taylor

    The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the relation between the dynamical source properties of small and large earthquakes obeys self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical features of the historical seismic record, they are inconsistent with the complexity underlying earthquake occurrence and can lead to misleading assessments of seismic hazard when applied in practice. The six principal chapters of this thesis explore the extent to which the behavior of real earthquakes deviates from these simplified models, and the implications that the observed deviations have for our understanding of earthquake rupture processes and seismic hazard. Chapter 1 provides a brief thematic overview and introduction to the scope of this thesis. Chapter 2 examines the complexity of the 2010 M7.2 El Mayor-Cucapah earthquake, focusing on the relation between its unexpected and unprecedented occurrence and anthropogenic stresses from the nearby Cerro Prieto Geothermal Field. Chapter 3 compares long-term changes in seismicity within California's three largest geothermal fields in an effort to characterize the relative influence of natural and anthropogenic stress transients on local seismic hazard. Chapter 4 describes a hybrid, hierarchical clustering algorithm that can be used to relocate earthquakes using waveform cross-correlation, and applies the new algorithm to study the spatiotemporal evolution of two recent seismic swarms in western Nevada. Chapter 5 describes a new spectral decomposition technique that can be used to analyze the dynamic source properties of large datasets of earthquakes, and applies this approach to revisit the question of self-similar scaling of
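
    As a minimal, self-contained illustration (not taken from the thesis) of how the stationary-Poisson assumption mentioned above can be probed, the sketch below computes the variance-to-mean (dispersion) index of binned event counts for synthetic catalogues: for a Poisson process the index is close to 1, while clustering pushes it well above 1. The data and function names are hypothetical.

      # Dispersion-index check on synthetic catalogues: Poisson counts have variance ~ mean.
      import numpy as np

      def dispersion_index(event_times: np.ndarray, bin_width: float) -> float:
          """Variance-to-mean ratio of event counts in fixed-width time bins."""
          bins = np.arange(event_times.min(), event_times.max() + bin_width, bin_width)
          counts, _ = np.histogram(event_times, bins=bins)
          return counts.var(ddof=1) / counts.mean()

      rng = np.random.default_rng(0)
      poisson_times = np.sort(rng.uniform(0.0, 3650.0, size=2000))      # homogeneous in time
      cluster_starts = rng.uniform(0.0, 3650.0, size=100)
      clustered_times = np.sort(np.concatenate(
          [poisson_times] + [s + rng.exponential(2.0, size=20) for s in cluster_starts]))

      print(dispersion_index(poisson_times, bin_width=30.0))    # close to 1
      print(dispersion_index(clustered_times, bin_width=30.0))  # well above 1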

  12. An integrated analysis on source parameters, seismogenic structure and seismic hazard of the 2014 Ms 6.3 Kangding earthquake

    NASA Astrophysics Data System (ADS)

    Zheng, Y.

    2016-12-01

    On November 22, 2014, the Ms 6.3 Kangding earthquake ended a 30-year period without strong earthquakes on the Xianshuihe fault zone. The focal mechanism and centroid depth of the Kangding earthquake are inverted from teleseismic waveforms and regional seismograms with the CAP method. The result shows that the two nodal planes of the focal mechanism are 235°/82°/-173° and 144°/83°/-8°, respectively; the latter nodal plane should be the ruptured fault plane, with a focal depth of 9 km. The rupture process model of the Kangding earthquake is obtained by joint inversion of teleseismic data and regional seismograms. The Kangding earthquake is a bilateral earthquake, and the major rupture zone is within a depth range of 5-15 km, spanning 10 km and 12 km along the dip and strike directions, with a maximum slip of about 0.5 m. Most of the seismic moment was released during the first 5 s, and the magnitude is Mw 6.01, smaller than the model determined from InSAR data. The discrepancy between the co-seismic rupture models of the Kangding earthquake and its Ms 5.8 aftershock and the InSAR model implies that significant afterslip occurred in the two weeks after the mainshock. The afterslip released energy equivalent to an Mw 5.9 earthquake and is mainly concentrated northwest of, and shallower than, the co-seismic rupture zone. The Coulomb failure stress (CFS) near the epicenter of the 2014 Kangding earthquake was increased by the 2008 Wenchuan earthquake, implying that the Kangding earthquake could have been triggered by the Wenchuan earthquake. The CFS on the northwest section of the seismic gap along the Kangding-Daofu segment is increased by the Kangding earthquake, and the rupture slip of the Kangding earthquake sequence is too small to release the accumulated strain in the seismic gap. Consequently, the northwest section of the Kangding-Daofu seismic gap faces high seismic hazard in the future.
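
    As a back-of-the-envelope illustration (not part of the study), the sketch below compares the co-seismic (Mw 6.01) and afterslip-equivalent (Mw 5.9) releases quoted above in terms of seismic moment, using the standard moment-magnitude relation Mw = (2/3)(log10 M0[N·m] - 9.1); interpreting the afterslip equivalence as a moment is an assumption made here only for illustration.

      # Illustrative arithmetic with the standard relation M0 = 10**(1.5*Mw + 9.1) N*m.
      def moment_from_mw(mw: float) -> float:
          """Seismic moment in N*m for a given moment magnitude."""
          return 10.0 ** (1.5 * mw + 9.1)

      m0_coseismic = moment_from_mw(6.01)   # ~1.3e18 N*m
      m0_afterslip = moment_from_mw(5.90)   # ~8.9e17 N*m
      print(f"afterslip / co-seismic moment ~ {m0_afterslip / m0_coseismic:.2f}")  # ~0.68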

  13. Regional earthquake loss estimation in the Autonomous Province of Bolzano - South Tyrol (Italy)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Winter, Benjamin

    2013-04-01

    Besides storm events, geophysical events cause the majority of natural hazard losses on a global scale. However, in alpine regions with a moderate earthquake risk potential, such as the study area, and a correspondingly weak imprint on the collective memory, this source of risk is often neglected in contrast to gravitational and hydrological hazard processes. In this context, the comparative analysis of potential disasters and emergencies on a national level in Switzerland (Katarisk study) has shown that earthquakes are in general the most serious source of risk. In order to estimate the potential losses from earthquake events for different return periods and the loss dimensions of extreme events, the following study was conducted in the Autonomous Province of Bolzano - South Tyrol (Italy). The applied methodology follows the generally accepted risk concept based on the risk components hazard, elements at risk and vulnerability, whereby risk is not defined holistically (direct, indirect, tangible and intangible) but through the risk category of losses to buildings and inventory as a general risk proxy. The hazard analysis is based on a regional macroseismic scenario approach, in which the settlement centre of each of the 116 communities is defined as a potential epicentre. For each epicentre, four epicentral scenarios (return periods of 98, 475, 975 and 2475 years) are calculated based on the simple but well-established and generally accepted attenuation law of Sponheuer (1960). The relevant input parameters for calculating the epicentral scenarios are (i) the macroseismic intensity and (ii) the focal depth. The considered macroseismic intensities are based on a probabilistic seismic hazard analysis (PSHA) of the Italian earthquake catalogue on a community level (Dipartimento della Protezione Civile). The relevant focal depths are taken as a mean within a defined buffer of the focal depths of the harmonized earthquake catalogues of Italy and Switzerland as well as
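
    For orientation only, the sketch below shows how such a per-community epicentral scenario might be evaluated, assuming one commonly cited form of the Kövesligethy-Sponheuer intensity attenuation relation; the choice of formula, the absorption coefficient and all parameter values are illustrative assumptions, not the study's calibration.

      # Hedged sketch of a macroseismic scenario calculation, assuming the attenuation form
      #   I(d) = I0 - 3*log10(r/h) - 3*alpha*(r - h)*log10(e),  r = sqrt(d^2 + h^2),
      # with epicentral intensity I0, focal depth h (km), epicentral distance d (km) and
      # absorption coefficient alpha. All values below are placeholders.
      import math

      def scenario_intensity(i0: float, depth_km: float, dist_km: float,
                             alpha: float = 0.002) -> float:
          r = math.hypot(dist_km, depth_km)
          return (i0 - 3.0 * math.log10(r / depth_km)
                  - 3.0 * alpha * (r - depth_km) * math.log10(math.e))

      # Example: epicentral intensity VIII, focal depth 8 km (both hypothetical).
      for d in (0.0, 5.0, 10.0, 20.0, 40.0):
          print(f"{d:5.1f} km -> intensity {scenario_intensity(8.0, 8.0, d):.1f}")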

  14. Predecessors of the giant 1960 Chile earthquake

    USGS Publications Warehouse

    Cisternas, M.; Atwater, B.F.; Torrejon, F.; Sawai, Y.; Machuca, G.; Lagos, M.; Eipert, A.; Youlton, C.; Salgado, I.; Kamataki, T.; Shishikura, M.; Rajendran, C.P.; Malik, J.K.; Rizal, Y.; Husni, M.

    2005-01-01

    It is commonly thought that the longer the time since the last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended. © 2005 Nature Publishing Group.

  15. Predecessors of the giant 1960 Chile earthquake.

    PubMed

    Cisternas, Marco; Atwater, Brian F; Torrejón, Fernando; Sawai, Yuki; Machuca, Gonzalo; Lagos, Marcelo; Eipert, Annaliese; Youlton, Cristián; Salgado, Ignacio; Kamataki, Takanobu; Shishikura, Masanobu; Rajendran, C P; Malik, Javed K; Rizal, Yan; Husni, Muhammad

    2005-09-15

    It is commonly thought that the longer the time since the last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended.

  16. Earthquake and Tsunami: a movie and a book for seismic and tsunami risk reduction in Italy.

    NASA Astrophysics Data System (ADS)

    Nostro, C.; Baroux, E.; Maramai, A.; Graziani, L.; Tertulliani, A.; Castellano, C.; Arcoraci, L.; Casale, P.; Ciaccio, M. G.; Frepoli, A.

    2009-04-01

    Italy is a country well known for its seismic and volcanic hazards. However, a similarly great hazard, although not as well recognized, is posed by the occurrence of tsunami waves along the Italian coastline. This is attested by a rich catalogue and by field evidence of deposits left by prehistoric and historical tsunamis, even in places considered safe today. This observation is of great importance since many of the areas affected by tsunamis in the past are now tourist destinations. Italian tsunamis can be caused by different sources: 1- off-shore or near-coast inland earthquakes; 2- very large earthquakes on distant sources in the Mediterranean; 3- submarine volcanic explosions in the Tyrrhenian sea; 4- submarine landslides triggered by earthquakes and volcanic activity. The consequence of such a wide spectrum of sources is that an important part of the more than 7000 km long Italian coastline is exposed to tsunami risk, and thousands of inhabitants (with numbers increasing during summer) live near hazardous coasts. The main historical tsunamis are the 1783 and 1908 events that hit the Calabrian and Sicilian coasts; the most recent tsunami was caused by the 2002 Stromboli landslide. In order to reduce this risk, and following the emotional impact of the December 2004 Sumatra earthquake and tsunami, we developed an outreach program consisting of talks given by scientists and of a movie and a book, both exploring the causes of tsunami waves, how they propagate in deep and shallow water, and what their effects are on the coasts. Hints are also given on the most dangerous Italian coasts (as deduced from scientific studies) and on how to behave if a tsunami approaches the coast. These seminars are open to the general public, but special programs are developed with schools of all grades. In this talk we present the book and the movie used during the seminars and scientific expositions, which were produced from a previous 3D version originally

  17. GEOS seismograms for aftershocks of the earthquakes of December 7, 1988, near Spitak, Armenia SSR, during the time period 30 December 1988 14:00 through 2 January 1989 (UTC): Chapter D in Results and data from seismologic and geologic studies following earthquakes of December 7, 1988, near Spitak, Armenia SSR (Open-File Report 89-163)

    USGS Publications Warehouse

    Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward

    1989-01-01

    The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world. The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible. This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but in other regions of the world exposed to high seismic risk.

  18. Next-Day Earthquake Forecasts for California

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Jackson, D. D.; Kagan, Y. Y.

    2008-12-01

    We implemented a daily forecast of m > 4 earthquakes for California in the format suitable for testing in community-based earthquake predictability experiments: Regional Earthquake Likelihood Models (RELM) and the Collaboratory for the Study of Earthquake Predictability (CSEP). The forecast is based on near-real time earthquake reports from the ANSS catalog above magnitude 2 and will be available online. The model used to generate the forecasts is based on the Epidemic-Type Earthquake Sequence (ETES) model, a stochastic model of clustered and triggered seismicity. Our particular implementation is based on the earlier work of Helmstetter et al. (2006, 2007), but we extended the forecast to all of California, use more data to calibrate the model and its parameters, and made some modifications. Our forecasts will compete against the Short-Term Earthquake Probabilities (STEP) forecasts of Gerstenberger et al. (2005) and other models in the next-day testing class of the CSEP experiment in California. We illustrate our forecasts with examples and discuss preliminary results.
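
    To make the idea of a clustered, triggered-seismicity forecast more concrete, the sketch below evaluates a generic ETAS-type conditional intensity (in the standard Ogata form) and a crude next-day expected count; this is not the authors' ETES implementation, and every parameter value is an illustrative placeholder.

      # Generic ETAS-type rate: lambda(t) = mu + sum_i K * 10**(alpha*(m_i - m0)) * (t - t_i + c)**(-p)
      import numpy as np

      def etas_rate(t, past_times, past_mags,
                    mu=0.2, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=2.0):
          """Expected events per day at time t (days), given past events with magnitude >= m0."""
          mask = past_times < t
          dt = t - past_times[mask]
          trig = K * 10.0 ** (alpha * (past_mags[mask] - m0)) * (dt + c) ** (-p)
          return mu + trig.sum()

      def expected_next_day_count(t, past_times, past_mags, n_steps=240):
          """Mean rate over [t, t+1] day; ignores events occurring inside the forecast window."""
          ts = np.linspace(t, t + 1.0, n_steps)
          return float(np.mean([etas_rate(x, past_times, past_mags) for x in ts]))

      # Toy usage with a synthetic 30-day history of m >= 2 events.
      rng = np.random.default_rng(1)
      hist_t = np.sort(rng.uniform(0.0, 30.0, size=50))
      hist_m = 2.0 + rng.exponential(0.4, size=50)
      print(expected_next_day_count(30.0, hist_t, hist_m))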

  19. Towards an Earthquake and Tsunami Early Warning in the Caribbean

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Vanacore, E. A.

    2017-12-01

    The Caribbean region (CR) has a documented history of large damaging earthquakes and tsunamis that have affected coastal areas, including the events of Jamaica in 1692, the Virgin Islands in 1867, Puerto Rico in 1918, the Dominican Republic in 1946 and Haiti in 2010. There is clear evidence that tsunamis have been triggered by large earthquakes that deformed the ocean floor around the Caribbean Plate boundary. The CR is monitored jointly by national, regional and local seismic, geodetic and sea-level networks. All monitoring institutions participate in the UNESCO ICG/Caribe EWS, whose purpose is to minimize loss of life and destruction of property and to mitigate catastrophic economic impacts by promoting local research; real-time (RT) sharing of earthquake, geodetic and sea-level data; improved warning capabilities; and enhanced education and outreach strategies. Currently, more than 100 broadband seismic, 65 sea-level and 50 high-rate GPS stations are available in real or near-real time. These real-time streams are used by local, regional or worldwide detection and warning institutions to provide earthquake source parameters in a timely manner. Currently, any Caribbean event with a detected magnitude greater than 4.5 is evaluated for tsunamigenic potential by the TWC, and sea level is measured. The regional cooperation is motivated both by research interests and by geodetic, seismic and tsunami hazard monitoring and warning. It will allow imaging of the tectonic structure of the Caribbean region at high resolution, which will consequently permit further understanding of the seismic source properties of moderate and large events and the application of this knowledge to civil protection procedures. To reach its goals, the virtual network has been designed following the highest technical standards: BB sensors, 24-bit A/D converters with 140 dB dynamic range, and real-time telemetry. Here we will discuss the state of the PR

  20. A comparison study of 2006 Java earthquake and other Tsunami earthquakes

    NASA Astrophysics Data System (ADS)

    Ji, C.; Shao, G.

    2006-12-01

    We revise the slip process of the July 17, 2006 Java earthquake by jointly inverting teleseismic body waves, long-period surface waves, and the broadband records at Christmas Island (XMIS), which is 220 km from the hypocenter and so far the closest observation of a tsunami earthquake. Compared with previous studies, our approach considers the amplitude variation of surface waves with source depth as well as the contribution of the ScS phase, which usually has amplitudes comparable to those of the direct S phase for such low-angle thrust earthquakes. The fault dip angle is also refined using the Love waves observed along the fault strike direction. Our results indicate that the 2006 event initiated at a depth of around 12 km and ruptured unilaterally to the southeast for 150 s at a speed of 1.0 km/s. The revised fault dip is only about 6 degrees, smaller than the Harvard CMT value (10.5 degrees) but consistent with that of the 1994 Java earthquake. The smaller fault dip results in a larger moment magnitude (Mw = 7.9) for a PREM earth, though this is dependent on the velocity structure used. After verification with 3D SEM forward simulations, we compare the inverted result with the revised slip models of the 1994 Java and 1992 Nicaragua earthquakes, derived using the same wavelet-based finite-fault inversion methodology.
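
    As a rough, illustrative check (not the authors' calculation) of why a smaller dip implies a larger moment here, one can assume the standard trade-off for shallow dip-slip sources, in which long-period wave amplitudes scale approximately with M0·sin(2δ), so that for fixed observations the inferred moment scales as 1/sin(2δ); under that assumption the numbers work out roughly as follows.

      # Rough arithmetic under the assumed M0*sin(2*dip) trade-off for shallow dip-slip
      # sources; indicative only, not the authors' method.
      import math

      dip_cmt, dip_revised = 10.5, 6.0   # degrees
      moment_ratio = (math.sin(math.radians(2 * dip_cmt))
                      / math.sin(math.radians(2 * dip_revised)))
      delta_mw = (2.0 / 3.0) * math.log10(moment_ratio)
      print(f"moment ratio ~ {moment_ratio:.2f}, delta Mw ~ +{delta_mw:.2f}")  # ~1.72, ~+0.16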