Sample records for earthquake prediction experiment

  1. Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Tsuruoka, H.; Yokoi, S.

    2011-12-01

    The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for the sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models had been submitted as of August 2013, and they are currently under the CSEP official suite of tests for evaluating forecast performance. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity has changed dramatically since the 2011 event, model performance has been strongly affected. In addition, because of problems with the completeness magnitude of the authorized catalogue, most models did not pass the CSEP consistency tests. We will also discuss the retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.

  3. Earthquake Prediction in Large-scale Faulting Experiments

    NASA Astrophysics Data System (ADS)

    Junger, J.; Kilgore, B.; Beeler, N.; Dieterich, J.

    2004-12-01

    Nucleation in these experiments is consistent with the observations and theory of Dieterich and Kilgore (1996). Precursory strains can typically be detected after 50% of the total loading time. The Dieterich and Kilgore approach implies an alternative method of earthquake prediction based on comparing real-time strain monitoring with previous precursory strain records or with physically based models of accelerating slip. Near failure, time to failure t is approximately inversely proportional to precursory slip rate V. Based on a least-squares fit to accelerating slip velocity from ten or more events, the standard deviation of the residual between predicted and observed log t is typically 0.14. Scaling these results to natural recurrence suggests that a year prior to an earthquake, failure time can be predicted from measured fault slip rate with a typical error of 140 days, and a day prior to the earthquake with a typical error of 9 hours. However, such predictions require detecting aseismic nucleating strains, which have not yet been found in the field, and distinguishing earthquake precursors from other strain transients. There is some field evidence of precursory seismic strain for large earthquakes (Bufe and Varnes, 1993) which may be related to our observations. In instances where precursory activity is spatially variable during the interseismic period, as in our experiments, distinguishing precursory activity might best be accomplished with deep arrays of near-fault instruments and pattern recognition algorithms such as principal component analysis (Rundle et al., 2000).
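
    The inverse proportionality between time to failure and slip rate suggests a simple extrapolation scheme: since 1/V grows linearly as t approaches the failure time t_f, a least-squares line through (t, 1/V) crosses zero at the predicted failure time. A minimal sketch with synthetic data (the constant C, the failure time, and the noise level are all illustrative, not values from these experiments):

```python
import random

# Hypothetical sketch (not the authors' code): near failure, time-to-failure
# t_f - t is roughly inversely proportional to slip rate V, so
# 1/V = (t_f - t) / C is linear in t and extrapolates to zero at t_f.
rng = random.Random(0)
C, t_f = 2.0, 100.0                      # assumed constant and true failure time
t = [90.0 + 0.9 * i for i in range(11)]  # observation times approaching failure
V = [C / (t_f - ti) * (1.0 + 0.02 * rng.gauss(0.0, 1.0)) for ti in t]  # noisy rates

# Ordinary least-squares line y = a*t + b through (t, 1/V); y = 0 at t = -b/a.
y = [1.0 / v for v in V]
n = len(t)
tbar, ybar = sum(t) / n, sum(y) / n
a = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / sum((ti - tbar) ** 2 for ti in t)
b = ybar - a * tbar
t_f_pred = -b / a
print(f"predicted failure time: {t_f_pred:.2f} (true value: {t_f})")
```

    In practice, as the abstract notes, such a scheme hinges on detecting aseismic nucleating strains and separating them from unrelated transients.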

  4. Introduction to the special issue on the 2004 Parkfield earthquake and the Parkfield earthquake prediction experiment

    USGS Publications Warehouse

    Harris, R.A.; Arrowsmith, J.R.

    2006-01-01

    The 28 September 2004 M 6.0 Parkfield earthquake, a long-anticipated event on the San Andreas fault, is the world's best recorded earthquake to date, with state-of-the-art data obtained from geologic, geodetic, seismic, magnetic, and electrical field networks. This has allowed the preearthquake and postearthquake states of the San Andreas fault in this region to be analyzed in detail. Analyses of these data provide views into the San Andreas fault that show a complex geologic history, fault geometry, rheology, and response of the nearby region to the earthquake-induced ground movement. Although aspects of San Andreas fault zone behavior in the Parkfield region can be modeled simply over geological time frames, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake indicate that predicting the fine details of future earthquakes is still a challenge. Instead of a deterministic approach, forecasting future damaging behavior, such as that caused by strong ground motions, will likely continue to require probabilistic methods. However, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake have provided ample data to understand most of what did occur in 2004, culminating in significant scientific advances.

  5. The earthquake prediction experiment at Parkfield, California

    USGS Publications Warehouse

    Roeloffs, E.; Langbein, J.

    1994-01-01

    Since 1985, a focused earthquake prediction experiment has been in progress along the San Andreas fault near the town of Parkfield in central California. Parkfield has experienced six moderate earthquakes since 1857 at average intervals of 22 years, the most recent a magnitude 6 event in 1966. The probability of another moderate earthquake soon appears high, but studies assigning it a 95% chance of occurring before 1993 now appear to have been oversimplified. The identification of a Parkfield fault "segment" was initially based on geometric features in the surface trace of the San Andreas fault, but more recent microearthquake studies have demonstrated that those features do not extend to seismogenic depths. On the other hand, geodetic measurements are consistent with the existence of a "locked" patch on the fault beneath Parkfield that has presently accumulated a slip deficit equal to the slip in the 1966 earthquake. A magnitude 4.7 earthquake in October 1992 brought the Parkfield experiment to its highest level of alert, with a 72-hour public warning that there was a 37% chance of a magnitude 6 event. However, this warning proved to be a false alarm. Most data collected at Parkfield indicate that strain is accumulating at a constant rate on this part of the San Andreas fault, but some interesting departures from this behavior have been recorded. Here we outline the scientific arguments bearing on when the next Parkfield earthquake is likely to occur and summarize geophysical observations to date.
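
    The quoted 95% probability invites a back-of-the-envelope check. The sketch below uses assumptions that are mine, not the experiment's: a Gaussian recurrence model with a 22-year mean, conditioned on no event between 1966 and 1985, evaluated for several assumed standard deviations. The conditional probability of an event by 1993 turns out to be quite sensitive to the assumed variability, one reason a single 95% figure can look oversimplified.

```python
import math

# Illustrative recurrence arithmetic (assumptions mine, not the USGS
# calculation): Gaussian recurrence with mean 22 yr; last event 1966; the
# conditional probability of the next event before 1993, given none by 1985:
#   P = [F(27) - F(19)] / [1 - F(19)],  F = Gaussian CDF of the interval.
def gauss_cdf(x: float, mu: float, sigma: float) -> float:
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu = 22.0                       # mean recurrence interval, years
for sigma in (3.0, 5.0, 8.0):   # assumed standard deviations (hypothetical)
    p = (gauss_cdf(27.0, mu, sigma) - gauss_cdf(19.0, mu, sigma)) / (1.0 - gauss_cdf(19.0, mu, sigma))
    print(f"sigma = {sigma:.0f} yr: P(event by 1993 | none by 1985) = {p:.0%}")
```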

  6. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, Susan E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: A moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  7. A forecast experiment of earthquake activity in Japan under Collaboratory for the Study of Earthquake Predictability (CSEP)

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Yokoi, S.; Nanjo, K. Z.; Tsuruoka, H.

    2012-04-01

    One major focus of the current Japanese earthquake prediction research program (2009-2013), which is now integrated with the research program for the prediction of volcanic eruptions, is to move toward creating testable earthquake forecast models. For this purpose we started an experiment in forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their model performance, and began the first earthquake forecast testing experiment in Japan within the CSEP framework. We use the earthquake catalogue maintained and provided by the Japan Meteorological Agency (JMA). The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called "All Japan," "Mainland," and "Kanto." A total of 105 models were submitted, and they are currently under the CSEP official suite of tests for evaluating forecast performance. The experiments have completed 92 rounds for the 1-day class, 6 rounds for the 3-month class, and 3 rounds for the 1-year class. For the 1-day testing class, all models passed all of the CSEP evaluation tests in more than 90% of rounds. The results of the 3-month testing class also gave us new knowledge concerning statistical forecasting models. All models showed good performance in magnitude forecasting. On the other hand, the observed spatial distribution is rarely consistent with most models when many earthquakes occur at a single spot. We are now preparing a 3-D forecasting experiment with a depth range of 0 to 100 km in the Kanto region, and the testing center is improving the evaluation system for the 1-day class so that forecasting and testing can be completed within one day. The special issue of 1st part titled Earthquake Forecast

  8. Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure: the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  9. The nature of earthquake prediction

    USGS Publications Warehouse

    Lindh, A.G.

    1991-01-01

    Earthquake prediction is inherently statistical. Although some people continue to think of earthquake prediction as the specification of the time, place, and magnitude of a future earthquake, it has been clear for at least a decade that this is an unrealistic and unreasonable definition. The reality is that earthquake prediction starts from long-term forecasts of place and magnitude, with very approximate time constraints, and progresses, at least in principle, to a gradual narrowing of the time window as data and understanding permit. Primitive long-term forecasts are clearly possible at this time on a few well-characterized fault systems. Tightly focused monitoring experiments aimed at short-term prediction are already underway in Parkfield, California, and in the Tokai region in Japan; only time will tell how much progress will be possible.

  10. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
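
    The histogram comparison at the end can be reproduced with simulated data. A classroom-style sketch (hypothetical catalog, not the AS1 records): for a Poisson process, interevent times are exponentially distributed, so the observed fraction of intervals longer than x should track exp(-rate * x).

```python
import math
import random

# Simulated classroom data (not the AS1 records): draw interevent times from
# a Poisson process and compare the empirical survival fraction P(T > x)
# with the exponential model exp(-rate * x).
rng = random.Random(1)
rate = 0.5                                            # events per day (assumed)
times = [rng.expovariate(rate) for _ in range(5000)]  # interevent times

for x in (1.0, 2.0, 4.0):
    empirical = sum(t > x for t in times) / len(times)
    print(f"P(T > {x}) empirical = {empirical:.3f}, exponential = {math.exp(-rate * x):.3f}")
```

    A clustered (non-Poissonian) catalog would show an excess of both very short and very long intervals relative to the exponential curve, which is what the interevent-time histograms are designed to reveal.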

  11. The U.S. Earthquake Prediction Program

    USGS Publications Warehouse

    Wesson, R.L.; Filson, J.R.

    1981-01-01

    There are two distinct motivations for earthquake prediction. The mechanistic approach aims to understand the processes leading to a large earthquake. The empirical approach is governed by the immediate need to protect lives and property. With our current lack of knowledge about the earthquake process, future progress cannot be made without gathering a large body of measurements. These are required not only for the empirical prediction of earthquakes, but also for the testing and development of hypotheses that further our understanding of the processes at work. The earthquake prediction program is basically a program of scientific inquiry, but one which is motivated by social, political, economic, and scientific reasons. It is a pursuit that cannot rely on empirical observations alone, nor can it be carried out solely on a blackboard or in a laboratory. Experiments must be carried out in the real Earth.

  12. First Results of the Regional Earthquake Likelihood Models Experiment

    USGS Publications Warehouse

    Schorlemmer, D.; Zechar, J.D.; Werner, M.J.; Field, E.H.; Jackson, D.D.; Jordan, T.H.

    2010-01-01

    The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment, a truly prospective earthquake prediction effort, is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment. While these results are preliminary (the forecasts were meant for an application of 5 years), we find interesting results: most of the models are consistent with the observation and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one. © 2010 The Author(s).

  13. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models: 2. Laboratory earthquakes

    NASA Astrophysics Data System (ADS)

    Rubinstein, Justin L.; Ellsworth, William L.; Beeler, Nicholas M.; Kilgore, Brian D.; Lockner, David A.; Savage, Heather M.

    2012-02-01

    The behavior of individual stick-slip events observed in three different laboratory experimental configurations is better explained by a "memoryless" earthquake model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. We make similar findings in the companion manuscript for the behavior of natural repeating earthquakes. Taken together, these results allow us to conclude that the predictions of a characteristic earthquake model that assumes either fixed slip or fixed recurrence interval should be preferred to the predictions of the time- and slip-predictable models for all earthquakes. Given that the fixed slip and recurrence models are the preferred models for all of the experiments we examine, we infer that in an event-to-event sense the elastic rebound model underlying the time- and slip-predictable models does not explain earthquake behavior. This does not indicate that the elastic rebound model should be rejected in a long-term sense, but it should be rejected for short-term predictions. The time- and slip-predictable models likely offer worse predictions of earthquake behavior because they rely on assumptions that are too simple to explain the behavior of earthquakes. Specifically, the time-predictable model assumes a constant failure threshold and the slip-predictable model assumes that there is a constant minimum stress. There is experimental and field evidence that these assumptions are not valid for all earthquakes.

  14. First Results of the Regional Earthquake Likelihood Models Experiment

    NASA Astrophysics Data System (ADS)

    Schorlemmer, Danijel; Zechar, J. Douglas; Werner, Maximilian J.; Field, Edward H.; Jackson, David D.; Jordan, Thomas H.

    2010-08-01

    The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment—a truly prospective earthquake prediction effort—is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment. While these results are preliminary—the forecasts were meant for an application of 5 years—we find interesting results: most of the models are consistent with the observation and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one.

  15. A prospective earthquake forecast experiment in the western Pacific

    NASA Astrophysics Data System (ADS)

    Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan

    2012-09-01

    Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw ≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
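
    The "likelihood-based metrics" used by CSEP include consistency checks such as the number test (N-test), which asks whether the observed number of target earthquakes is plausible under the forecast. A hedged sketch under the common Poisson assumption (the forecast and observation counts are hypothetical, and the two-sided threshold handling is simplified):

```python
import math

# Sketch of a CSEP-style number test (N-test) under a Poisson assumption:
# a forecast expecting n_fore earthquakes is consistent with n_obs observed
# events if n_obs falls in neither extreme tail of Poisson(n_fore).
def poisson_cdf(k: int, mu: float) -> float:
    """P(X <= k) for X ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def n_test(n_obs: int, n_fore: float, alpha: float = 0.05) -> bool:
    """Two-sided consistency check: True means the count is consistent."""
    delta1 = 1.0 - poisson_cdf(n_obs - 1, n_fore)  # P(X >= n_obs): too many?
    delta2 = poisson_cdf(n_obs, n_fore)            # P(X <= n_obs): too few?
    return min(delta1, delta2) > alpha / 2.0

# Hypothetical forecast of 10 Mw >= 5.8 events in a year:
print(n_test(n_obs=8, n_fore=10.0))   # consistent with the forecast
print(n_test(n_obs=25, n_fore=10.0))  # far outside the Poisson spread
```

    The spatial (S) and likelihood (L) tests follow the same pattern but score the gridded spatial distribution rather than the total count.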

  16. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    USGS Publications Warehouse

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  17. The October 1992 Parkfield, California, earthquake prediction

    USGS Publications Warehouse

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local or Pacific Daylight Time). This moderate shock, interpreted as a potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state, and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

  18. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  19. Geophysical Anomalies and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  20. Testing an earthquake prediction algorithm

    USGS Publications Warehouse

    Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

    1997-01-01

    A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate-term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991, the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had predicted five out of nine earthquakes in the test area, a success ratio that would have been achieved in 53% of random trials under the null hypothesis.
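The random-assignment null hypothesis described in this record can be sketched as a small Monte Carlo experiment. The alarm fraction below is a hypothetical input (the abstract does not state the actual space-time coverage of the M8 alarms):

```python
import random

def prob_random_success(n_events, min_hits, alarm_fraction,
                        n_trials=200_000, seed=1):
    """Estimate how often purely random alarms covering `alarm_fraction`
    of the space-time volume would 'predict' at least `min_hits` of
    `n_events` earthquakes by chance alone."""
    rng = random.Random(seed)
    wins = sum(
        sum(rng.random() < alarm_fraction for _ in range(n_events)) >= min_hits
        for _ in range(n_trials)
    )
    return wins / n_trials
```

A skill claim such as "eight of ten predicted" is only meaningful against this kind of baseline; the 2.87% figure in the abstract is the analogous chance probability for the retroactive test.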

  1. Earthquake predictions using seismic velocity ratios

    USGS Publications Warehouse

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses, and the value of long-range earthquake prediction for planning, is obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency. 

  2. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    USGS Publications Warehouse

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

    Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost of missing some earthquakes in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc correctly identified the locations of four of them. The space-time volumes of the alarms are 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% for both M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40%, and five were predicted by M8-MSc in 13%, of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reverse faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8. Phys. Earth Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction. J. Geophys. Res. 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library].
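The significance levels quoted above are consistent with a simple binomial null in which each target earthquake independently falls inside the alarmed volume with probability equal to the normalized alarm fraction. A sketch under that assumption (the binomial model is our reading of the setup, not necessarily the authors' exact procedure):

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance that at least k of
    n earthquakes land in alarms covering fraction p of the volume."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# figures from the abstract: 5 of 5 M8.0+ events hit, alarms covering 36%
significance_m8 = 1 - binomial_tail(5, 5, 0.36)
# 10 of 19 M7.5+ events hit by M8, alarms covering 40%
significance_m75 = 1 - binomial_tail(19, 10, 0.40)
```

With these inputs the computed significance is beyond 99% for the M8.0+ case and about 81% for the M7.5+ case, matching the numbers reported in the abstract.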

  3. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    NASA Astrophysics Data System (ADS)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not the common understanding in China when the M 7.9 Wenchuan earthquake struck Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue an imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save lives in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  4. A prospective earthquake forecast experiment for Japan

    NASA Astrophysics Data System (ADS)

    Yokoi, Sayoko; Nanjo, Kazuyoshi; Tsuruoka, Hiroshi; Hirata, Naoshi

    2013-04-01

    One major focus of the current Japanese earthquake prediction research program (2009-2013) is to move toward creating testable earthquake forecast models. For this purpose we started an experiment of forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan, and to conduct verifiable prospective tests of their model performance. On 1 November 2009, we started the first earthquake forecast testing experiment for the Japan area. We use the unified JMA catalogue compiled by the Japan Meteorological Agency as the authorized catalogue. The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called All Japan, Mainland, and Kanto. A total of 91 models were submitted to CSEP-Japan and are evaluated for forecast performance with the CSEP official suite of tests. In this presentation, we show the results of 5 rounds of the 3-month testing class. The HIST-ETAS7pa, MARFS, and RI10K models showed the best scores, based on total log-likelihood, for the All Japan, Mainland, and Kanto regions, respectively. We also found that time dependence of model parameters was not an effective factor for passing the CSEP consistency tests in the 3-month testing class in any region. In particular, the spatial distribution in the All Japan region was too difficult to pass the consistency test because of multiple events in a single bin. The number of target events per round in the Mainland region tended to be smaller than the models' expectations in all rounds, which resulted in consistency-test rejections due to overestimation. 
In the Kanto region, the pass ratios of the consistency tests for each model exceeded 80%, which was associated with well-balanced forecasting of event
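The total log-likelihood score used above to rank models can be sketched for a gridded rate forecast, assuming, as CSEP-style tests commonly do, that each space-magnitude bin is an independent Poisson variable (the bin rates and counts below are made-up numbers):

```python
from math import lgamma, log

def total_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed bin counts under a gridded
    forecast, treating each bin as an independent Poisson variable."""
    return sum(
        n * log(lam) - lam - lgamma(n + 1)  # log of the Poisson pmf
        for lam, n in zip(forecast_rates, observed_counts)
    )

# a forecast matching the observations scores higher than a mismatched one
good = total_log_likelihood([2.0, 0.5, 1.0], [2, 0, 1])
bad = total_log_likelihood([0.1, 5.0, 1.0], [2, 0, 1])
```

The model with the highest total log-likelihood over all bins and rounds is the best-scoring one, which is how HIST-ETAS7pa, MARFS, and RI10K were singled out per region.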

  5. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Jackson, D. D.; Rhoades, D. A.; Zechar, J. D.; Marzocchi, W.

    2016-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe, with 442 models under evaluation. The California testing center, started by SCEC on Sept 1, 2007, currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. Our tests are now based on the hypocentral locations and magnitudes of cataloged earthquakes, but we plan to test focal mechanisms, seismic hazard models, ground motion forecasts, and finite rupture forecasts as well. We have increased computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model, introduced Bayesian ensemble models, and implemented support for non-Poissonian simulation-based forecast models. We are currently developing formats and procedures to evaluate externally hosted forecasts and predictions. CSEP supports the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. We found that earthquakes as small as magnitude 2.5 provide important information on subsequent earthquakes larger than magnitude 5. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence showed that some physics-based and hybrid models outperform catalog-based (e.g., ETAS) models. This experiment also demonstrates the ability of the CSEP infrastructure to support retrospective forecast testing. Current CSEP development activities include adoption of the Comprehensive Earthquake Catalog (ComCat) as an authorized data source, retrospective testing of simulation-based forecasts, and support for additive ensemble methods. 
We describe the open-source CSEP software that is available to researchers as

  6. A note on evaluating VAN earthquake predictions

    NASA Astrophysics Data System (ADS)

    Tselentis, G.-Akis; Melis, Nicos S.

    The evaluation of the success level of an earthquake prediction method should not be based on approaches that apply generalized strict statistical laws and ignore the specific nature of the earthquake phenomenon. Fault rupture processes cannot be compared to gambling processes. The outcome of the present note is that even an ideal earthquake prediction method is still shown to be a matter of a “chancy” association between precursors and earthquakes if we apply the same procedure proposed by Mulargia and Gasperini [1992] in evaluating VAN earthquake predictions. Each individual VAN prediction has to be evaluated separately, always taking into account the specific circumstances and information available. The success level of epicenter prediction should depend on the earthquake magnitude, and magnitude and time predictions may depend on earthquake clustering and the tectonic regime, respectively.

  7. Scoring annual earthquake predictions in China

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang; Jiang, Changsheng

    2012-02-01

    The Annual Consultation Meeting on Earthquake Tendency in China is held by the China Earthquake Administration (CEA) in order to provide one-year earthquake predictions over most of China. In these predictions, regions of concern are denoted together with the corresponding magnitude range of the largest earthquake expected during the next year. Evaluating the performance of these earthquake predictions is rather difficult, especially for regions of no concern, because the predictions are made on arbitrary regions with flexible magnitude ranges. In the present study, the gambling score is used to evaluate the performance of these earthquake predictions. Based on a reference model, this scoring method rewards successful predictions and penalizes failures according to the risk (probability of failure) that the predictors have taken. Using the Poisson model, which is spatially inhomogeneous and temporally stationary, with the Gutenberg-Richter law for earthquake magnitudes as the reference model, we evaluate the CEA predictions based on (1) a partial score that evaluates whether issuing the alarmed regions is based on information that differs from the reference model (knowledge of the average seismicity level) and (2) a complete score that evaluates whether the overall performance of the prediction is better than the reference model. The predictions made by the Annual Consultation Meetings on Earthquake Tendency from 1990 to 2003 are found to include significant precursory information, but the overall performance is close to that of the reference model.
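The reward-and-penalty idea can be sketched as follows. This is a simplified form of the gambling score (the exact payoff bookkeeping in the paper may differ), where p is the reference model's probability that the predicted event occurs:

```python
def gambling_score(bets):
    """Simplified gambling score: each alarm stakes one unit against a
    reference model assigning probability p to the target event.
    A success pays (1 - p) / p; a failure forfeits the stake, so a
    predictor whose hit rate merely matches the reference breaks even
    on average."""
    return sum((1 - p) / p if success else -1.0 for p, success in bets)

# a predictor that hits exactly as often as the reference expects
# (1 success in 4 tries at p = 0.25) scores zero
breakeven = gambling_score([(0.25, True), (0.25, False),
                            (0.25, False), (0.25, False)])
```

Rare-event successes pay more, which is exactly how the score rewards information beyond the average seismicity level.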

  8. Stigma in science: the case of earthquake prediction.

    PubMed

    Joffe, Helene; Rossetto, Tiziana; Bradley, Caroline; O'Connor, Cliodhna

    2018-01-01

    This paper explores how earthquake scientists conceptualise earthquake prediction, particularly given the conviction of six earthquake scientists for manslaughter (subsequently overturned) on 22 October 2012 for having given inappropriate advice to the public prior to the L'Aquila earthquake of 6 April 2009. In the first study of its kind, semi-structured interviews were conducted with 17 earthquake scientists and the transcribed interviews were analysed thematically. The scientists primarily denigrated earthquake prediction, showing strong emotive responses and distancing themselves from earthquake 'prediction' in favour of 'forecasting'. Earthquake prediction was regarded as impossible and harmful. The stigmatisation of the subject is discussed in the light of research on boundary work and stigma in science. The evaluation reveals how mitigation becomes the more favoured endeavour, creating a normative environment that disadvantages those who continue to pursue earthquake prediction research. Recommendations are made for communication with the public on earthquake risk, with a focus on how scientists portray uncertainty. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  9. Large-Scale Earthquake Countermeasures Act and the Earthquake Prediction Council in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rikitake, T.

    1979-08-07

    The Large-Scale Earthquake Countermeasures Act was enacted in Japan in December 1978. This act aims at mitigating earthquake hazards by designating an area to be an area under intensified measures against earthquake disaster, such designation being based on long-term earthquake prediction information, and by issuing an earthquake warning statement based on imminent prediction information, when possible. In an emergency case as defined by the law, the prime minister will be empowered to take various actions which cannot be taken at ordinary times. For instance, he may ask the Self-Defense Force to come into the earthquake-threatened area before the earthquake occurrence. A Prediction Council has been formed in order to evaluate premonitory effects that might be observed over the Tokai area, which was designated an area under intensified measures against earthquake disaster some time in June 1979. An extremely dense observation network has been constructed over the area.

  10. Hypothesis testing and earthquake prediction.

    PubMed

    Jackson, D D

    1996-04-30

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earth-quakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
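The first of the three tests listed, comparing the number of actual earthquakes to the number predicted, can be sketched as a two-sided Poisson tail check (a common CSEP-style "N-test" formulation; the 5% threshold in the usage note is illustrative):

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

def n_test(predicted_total, observed_count):
    """Number test: the forecast is inconsistent with the data when the
    observed count lands in either extreme tail of the Poisson
    distribution implied by the predicted total."""
    p_too_few = poisson_cdf(observed_count, predicted_total)
    p_too_many = 1.0 - (poisson_cdf(observed_count - 1, predicted_total)
                        if observed_count > 0 else 0.0)
    return p_too_few, p_too_many
```

A forecast of 10 events is comfortably consistent with observing 10, but observing only 2 would reject it at the 5% level; the likelihood and likelihood-ratio tests then probe where the events occurred, not just how many.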

  11. Hypothesis testing and earthquake prediction.

    PubMed Central

    Jackson, D D

    1996-01-01

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earth-quakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663

  12. Can We Predict Earthquakes?

    ScienceCinema

    Johnson, Paul

    2018-01-16

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  13. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Zechar, J. D.; Jordan, T. H.

    2015-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 435 models under evaluation. The California testing center, operated by SCEC, has been operational since Sept 1, 2007, and currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. We have reduced testing latency, implemented prototype evaluation of M8 forecasts, and are currently developing formats and procedures to evaluate externally-hosted forecasts and predictions. These efforts are related to CSEP support of the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence has been completed, and the results indicate that some physics-based and hybrid models outperform purely statistical (e.g., ETAS) models. The experiment also demonstrates the power of the CSEP cyberinfrastructure for retrospective testing. Our current development includes evaluation strategies that increase computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model. We describe the open-source CSEP software that is available to researchers as they develop their forecast models (http://northridge.usc.edu/trac/csep/wiki/MiniCSEP). We also discuss applications of CSEP infrastructure to geodetic transient detection and how CSEP procedures are being

  14. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted with a very good accuracy window (±1 day). In this contribution we present a modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a trigger planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3, i.e. ±1 day of the target date) for >6.5R events. The successes were counted for each one of the 86 earthquake seeds, and we compared the MFDL method with the FDL method. In every case we find improvement when the starting seed date is the planetary trigger date prior to the earthquake. 
We observe no improvement only when a planetary trigger coincided with
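As a purely mechanical illustration of the date bookkeeping this record describes (the day-unit offsets and the ±1-day window are our assumptions, and nothing here bears on the method's validity), candidate dates can be generated from a seed event and scored against a hit window:

```python
from datetime import date, timedelta

def fdl_candidate_dates(seed, horizon_days=1000):
    """Candidate dates built by adding Fibonacci and Lucas numbers
    (interpreted as day offsets, a hypothetical unit choice) to the
    seed event date."""
    fib, luc = [1, 2], [1, 3]
    while fib[-1] < horizon_days:
        fib.append(fib[-1] + fib[-2])
    while luc[-1] < horizon_days:
        luc.append(luc[-1] + luc[-2])
    offsets = sorted({n for n in fib + luc if n <= horizon_days})
    return [seed + timedelta(days=n) for n in offsets]

def hits_within_window(candidates, events, window_days=1):
    """Count events falling within ±window_days of any candidate date."""
    return sum(
        any(abs((e - c).days) <= window_days for c in candidates)
        for e in events
    )
```

Note that any sufficiently dense set of candidate dates will collect hits by chance, which is why a comparison against randomly chosen dates, as in the testing papers elsewhere in this listing, is essential before claiming skill.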

  15. The Parkfield earthquake prediction of October 1992; the emergency services response

    USGS Publications Warehouse

    Andrews, R.

    1992-01-01

    The science of earthquake prediction is interesting and worthy of support. In many respects the ultimate payoff of earthquake prediction or earthquake forecasting is how the information can be used to enhance public safety and public preparedness. This is a particularly important issue here in California where we have such a high level of seismic risk historically, and currently, as a consequence of activity in 1989 in the San Francisco Bay Area, in Humboldt County in April of this year (1992), and in southern California in the Landers-Big Bear area in late June of this year (1992). We are currently very concerned about the possibility of a major earthquake, one or more, happening close to one of our metropolitan areas. Within that context, the Parkfield experiment becomes very important. 

  16. Earthquake prediction; new studies yield promising results

    USGS Publications Warehouse

    Robinson, R.

    1974-01-01

    On August 3, 1973, a small earthquake (magnitude 2.5) occurred near Blue Mountain Lake in the Adirondack region of northern New York State. This seemingly unimportant event was of great significance, however, because it was predicted. Seismologists at the Lamont-Doherty Geological Observatory of Columbia University accurately foretold the time, place, and magnitude of the event. Their prediction was based on certain pre-earthquake processes that are best explained by a hypothesis known as "dilatancy," a concept that has injected new life and direction into the science of earthquake prediction. Although much more research must be accomplished before we can expect to predict potentially damaging earthquakes with any degree of consistency, results such as this indicate that we are on a promising road. 

  17. Strong ground motion prediction using virtual earthquakes.

    PubMed

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach provides a new approach for predicting long-period strong ground motion.

  18. Quantitative Earthquake Prediction on Global and Regional Scales

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.

    2006-03-01

    The Earth is a hierarchy of volumes of different size. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it does produce earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolation of a trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. An understanding of the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, helps avoid basic errors in earthquake prediction claims. It suggests rules and recipes for adequate earthquake prediction classification, comparison, and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at a cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  19. Geochemical challenge to earthquake prediction.

    PubMed Central

    Wakita, H

    1996-01-01

    The current status of geochemical and groundwater observations for earthquake prediction in Japan is described. The development of the observations is discussed in relation to the progress of the earthquake prediction program in Japan. Three major findings obtained from our recent studies are outlined. (i) Long-term radon observation data over 18 years at the SKE (Suikoen) well indicate that the anomalous radon change before the 1978 Izu-Oshima-kinkai earthquake can with high probability be attributed to precursory changes. (ii) It is proposed that certain sensitive wells exist which have the potential to detect precursory changes. (iii) The appearance and nonappearance of coseismic radon drops at the KSM (Kashima) well reflect changes in the regional stress state of an observation area. In addition, some preliminary results of chemical changes of groundwater prior to the 1995 Kobe (Hyogo-ken nanbu) earthquake are presented. PMID:11607665
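A generic way to flag the kind of precursory radon change described here, a departure from the recent baseline, can be sketched as a trailing-window z-score screen (the window length and threshold are arbitrary choices, and this is not the authors' actual procedure):

```python
from statistics import mean, stdev

def flag_anomalies(series, window=30, k=3.0):
    """Return indices where a reading departs from the trailing-window
    mean by more than k standard deviations of that window."""
    flags = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        m, s = mean(baseline), stdev(baseline)
        if s > 0 and abs(series[i] - m) > k * s:
            flags.append(i)
    return flags
```

Distinguishing such statistical anomalies from true precursors is exactly the hard part: the record's point is that only a few sensitive wells, and long observation spans, make that attribution credible.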

  20. Intermediate-term earthquake prediction

    USGS Publications Warehouse

    Knopoff, L.

    1990-01-01

    The problems in predicting earthquakes have been attacked by phenomenological methods from pre-historic times to the present. The associations of presumed precursors with large earthquakes often have been remarked upon. The difficulty in identifying whether such correlations are due to chance coincidence or are real precursors is that one usually notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is to be found in the rather long intervals that follow large earthquakes, or in fact is absent in these post-earthquake intervals. If there are enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes.

  1. 76 FR 69761 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-09

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey National Earthquake Prediction Evaluation... 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1 1/2-day meeting.... Geological Survey on proposed earthquake predictions, on the completeness and scientific validity of the...

  2. 76 FR 19123 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-06

    ... Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S. Geological Survey, Interior. ACTION: Notice of meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

  3. Prediction of earthquake-triggered landslide event sizes

    NASA Astrophysics Data System (ADS)

    Braun, Anika; Havenith, Hans-Balder; Schlögel, Romy

    2016-04-01

    Seismically induced landslides are a major environmental effect of earthquakes, which may significantly contribute to related losses. Moreover, in paleoseismology, landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes, and thus allow us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. We present here a review of factors contributing to earthquake-triggered slope failures based on an "event-by-event" classification approach. The objective of this analysis is to enable the short-term prediction of earthquake-triggered landslide event sizes, in terms of the number of landslides and the size of the affected area, right after an earthquake occurs. Five main factors, 'Intensity', 'Fault', 'Topographic energy', 'Climatic conditions' and 'Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. The relative weight of these factors was extracted from published data for numerous past earthquakes; topographic inputs were checked in Google Earth and through geographic information systems. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970), the combination and relative weight of the factors were calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be cross-checked. 
One of our main findings is that the 'Fault' factor, which is based on characteristics of the fault, the surface rupture and its location with respect to mountain areas, has the most important

  4. On Earthquake Prediction in Japan

    PubMed Central

    UYEDA, Seiya

    2013-01-01

    Japan's National Project for Earthquake Prediction has been conducted since 1965 without success. An earthquake prediction should be a short-term prediction based on observable physical phenomena, or precursors. The main reason for the lack of success is the failure to capture precursors. Most of the financial resources and manpower of the National Project have been devoted to strengthening the seismograph networks, which are not generally effective for detecting precursors, since many precursors are non-seismic. Precursor research has never been supported appropriately because the project has always been run by a group of seismologists who, in the present author's view, are mainly interested in securing funds for seismology, on the pretense of prediction. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this decision was further fortified by the 2011 M9 Tohoku mega-quake. On top of the National Project, there are other government projects, not formally but vaguely related to earthquake prediction, that consume many orders of magnitude more funds. They too are uninterested in short-term prediction. Financially, they are giants and the National Project is a dwarf. Thus, in Japan now, there is practically no support for short-term prediction research. Recently, however, substantial progress has been made in real short-term prediction by scientists of diverse disciplines. Some promising signs are also arising from cooperation with the private sector. PMID:24213204

  5. CSEP-Japan: The Japanese node of the collaboratory for the study of earthquake predictability

    NASA Astrophysics Data System (ADS)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2011-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global earthquake predictability research project. Its final goal is to search for intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined the CSEP and started the Japanese testing center, called CSEP-Japan. The testing center provides open access for researchers contributing earthquake forecast models applied to Japan. A total of 91 earthquake forecast models were submitted to the prospective experiment that started on 1 November 2009. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area of Japan including the surrounding sea, the Japanese mainland, and the Kanto district. We evaluate the performance of the models with the official suite of tests defined by the CSEP. The experiments for the 1-day, 3-month, 1-year and 3-year forecasting classes have been implemented for 92 rounds, 4 rounds, 1 round and 0 rounds (now in progress), respectively. The results of the 3-month class gave us new knowledge concerning statistical forecasting models. All models showed good performance in magnitude forecasting. On the other hand, the observed spatial distribution was hardly consistent with most models in cases where many earthquakes occurred at the same spot. Throughout the experiment, it has become clear that some of CSEP's evaluation tests, such as the L-test, show strong correlation with the N-test. We are now developing our own (cyber-)infrastructure to support the forecast experiment as follows. (1) Japanese seismicity has changed since the 2011 Tohoku earthquake. The 3rd call for forecasting models was announced in order to promote model improvement for forecasting earthquakes after this event, so we provide the Japanese seismicity catalog maintained by JMA for modelers to study how seismicity

  6. Understanding earthquake from the granular physics point of view — Causes of earthquake, earthquake precursors and predictions

    NASA Astrophysics Data System (ADS)

    Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    We treat the Earth's crust and mantle as large-scale discrete matter, based on the principles of granular physics and existing experimental observations. The main outcomes are: a granular model of the structure and movement of the crust and mantle is established; the formation mechanism of the tectonic forces that cause earthquakes, and a model for the propagation of precursory information, are proposed; properties of seismic precursory information and its relevance to earthquake occurrence are illustrated, and the principle of ways to detect effective seismic precursors is elaborated. The mechanism of deep-focus earthquakes is also explained by the jamming-unjamming transition of granular flow. Some earthquake phenomena that were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors and predictions, and reach a new understanding, different from the traditional seismological viewpoint.

  7. Material contrast does not predict earthquake rupture propagation direction

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    2005-01-01

    Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

  8. Four Examples of Short-Term and Imminent Prediction of Earthquakes

    NASA Astrophysics Data System (ADS)

    zeng, zuoxun; Liu, Genshen; Wu, Dabin; Sibgatulin, Victor

    2014-05-01

    We show here 4 examples of short-term and imminent prediction of earthquakes in China in 2013: the Nima earthquake (Ms5.2), the Minxian earthquake (Ms6.6), the Nantou earthquake (Ms6.7) and the Dujiangyan earthquake (Ms4.1). Imminent prediction of the Nima earthquake (Ms5.2): based on a comprehensive analysis of the prediction of Victor Sibgatulin using natural electromagnetic pulse anomalies, the prediction of Song Song and Song Kefu using observation of a precursory halo, and an observation of the locations of degasification of the earth in Naqu, Tibet by Zeng Zuoxun himself, the first author predicted an earthquake of around Ms 6 within 10 days in the area of the degasification point (31.5N, 89.0E) at 0:54 on May 8th, 2013. He supplied another degasification point (31N, 86E) for the epicenter prediction at 8:34 of the same day. At 18:54:30 on May 15th, 2013, an earthquake of Ms5.2 occurred in Nima County, Naqu, China. Imminent prediction of the Minxian earthquake (Ms6.6): at 7:45 on July 22nd, 2013, an earthquake of magnitude Ms6.6 occurred at the border between Minxian and Zhangxian of Dingxi City (34.5N, 104.2E), Gansu Province. We review the imminent prediction process and its basis for this earthquake using the fingerprint method. Curves of 9 or 15 anomalous components versus time can be output from the SW monitor for earthquake precursors. These components include geomagnetism, geoelectricity, crustal stresses, resonance, and crustal inclination. When we compress the time axis, the output curves become different geometric images. The precursor images differ for earthquakes in different regions; alike or similar images correspond to earthquakes in a certain region. According to 7 years of observation of the precursor images and their corresponding earthquakes, we usually obtain the fingerprint 6 days before the corresponding earthquake. 
The magnitude prediction needs the comparison between the amplitudes of the fingerprints from the same

  9. Testing an Earthquake Prediction Algorithm: The 2016 New Zealand and Chile Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.

    2017-05-01

    The 13 November 2016, M7.8 earthquake 54 km NNE of Amberley, New Zealand, and the 25 December 2016, M7.6 earthquake 42 km SW of Puerto Quellon, Chile, happened outside the area of the ongoing real-time global testing of the intermediate-term middle-range earthquake prediction algorithm M8, accepted in 1992 for the M7.5+ range. Naturally, over the past two decades, the level of registration of earthquakes worldwide has grown significantly and is by now sufficient for diagnosis of times of increased probability (TIPs) by the M8 algorithm over the entire territory of New Zealand and of southern Chile below 40°S. The mid-2016 update of the M8 predictions determined TIPs in the additional circles of investigation (CIs) where the two earthquakes subsequently happened. Thus, after 50 semiannual updates in real-time prediction mode, we (1) confirm the statistically established high confidence of the M8-MSc predictions and (2) conclude that the territory of the Global Test of the M8 and MSc algorithms could be expanded in an apparently necessary revision of the 1992 settings.

  10. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game-theoretic approach, the so-called parimutuel gambling (PG) method, is examined. The PG method, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted to the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

  11. Earthquake prediction evaluation standards applied to the VAN Method

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

    Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2) so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.
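
    The alarm-based evaluation standards above lend themselves to a simple quantitative check. The sketch below is purely illustrative (the alarm series and event times are invented): it computes the two quantities commonly used to score binary alarm predictions, the fraction of time in alarm and the miss rate, against the random-guessing baseline.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical illustration: 1000 time steps with a binary alarm state,
    # and a handful of target earthquakes at known steps.
    alarm = rng.random(1000) < 0.2           # alarm active ~20% of the time
    quake_times = np.array([50, 300, 301, 720, 990])

    tau = alarm.mean()                        # fraction of time in alarm
    hits = alarm[quake_times].sum()           # quakes occurring during alarms
    nu = 1 - hits / len(quake_times)          # miss rate

    # A skill-free (random) alarm strategy gives nu ~ 1 - tau on average;
    # a useful predictor falls below that diagonal on a Molchan diagram.
    print(f"tau={tau:.2f}, nu={nu:.2f}, random baseline nu={1 - tau:.2f}")
    ```

    Requirements (1)-(5) in the abstract are exactly what make `alarm` and `quake_times` well defined in advance; without them, neither tau nor nu can be computed objectively.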

  12. A test to evaluate the earthquake prediction algorithm, M8

    USGS Publications Warehouse

    Healy, John H.; Kossobokov, Vladimir G.; Dewey, James W.

    1992-01-01

    A test of the algorithm M8 is described. The test is constructed to meet four rules, which we propose to be applicable to the test of any method for earthquake prediction:  1. An earthquake prediction technique should be presented as a well documented, logical algorithm that can be used by  investigators without restrictions. 2. The algorithm should be coded in a common programming language and implementable on widely available computer systems. 3. A test of the earthquake prediction technique should involve future predictions with a black box version of the algorithm in which potentially adjustable parameters are fixed in advance. The source of the input data must be defined and ambiguities in these data must be resolved automatically by the algorithm. 4. At least one reasonable null hypothesis should be stated in advance of testing the earthquake prediction method, and it should be stated how this null hypothesis will be used to estimate the statistical significance of the earthquake predictions. The M8 algorithm has successfully predicted several destructive earthquakes, in the sense that the earthquakes occurred inside regions with linear dimensions from 384 to 854 km that the algorithm had identified as being in times of increased probability for strong earthquakes. In addition, M8 has successfully "post predicted" high percentages of strong earthquakes in regions to which it has been applied in retroactive studies. The statistical significance of previous predictions has not been established, however, and post-prediction studies in general are notoriously subject to success-enhancement through hindsight. Nor has it been determined how much more precise an M8 prediction might be than forecasts and probability-of-occurrence estimates made by other techniques. We view our test of M8 both as a means to better determine the effectiveness of M8 and as an experimental structure within which to make observations that might lead to improvements in the algorithm

  13. Testing prediction methods: Earthquake clustering versus the Poisson model

    USGS Publications Warehouse

    Michael, A.J.

    1997-01-01

    Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
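
    The simulation-based significance test described above can be sketched in a few lines. This is a toy under invented assumptions (the alarm windows, catalog sizes, and crude burst-style clustering are all made up, not the models of the study): the hit count of a fixed alarm set is scored against synthetic Poisson and clustered catalogs, and the clustered null typically yields a wider hit-count distribution, weakening apparent significance.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    T = 1000                                     # catalog duration in days

    def count_hits(quake_times, alarm_windows):
        """Number of earthquakes falling inside any (start, end) alarm window."""
        return sum(any(s <= t < e for s, e in alarm_windows) for t in quake_times)

    # Hypothetical alarm windows issued by some prediction method (days).
    alarms = [(10, 20), (200, 230), (400, 410)]
    observed = np.sort(rng.uniform(0, T, 20))    # stand-in "real" catalog
    obs_hits = count_hits(observed, alarms)

    def simulate(clustered, n_events=20):
        """Poisson catalog, or a crudely clustered one with bursts of events."""
        if not clustered:
            return rng.uniform(0, T, n_events)
        parents = rng.uniform(0, T, 5)           # few parent times
        return np.concatenate([p + rng.exponential(5.0, 4) for p in parents])

    for clustered in (False, True):
        sims = [count_hits(simulate(clustered), alarms) for _ in range(2000)]
        p = np.mean([h >= obs_hits for h in sims])
        print(f"clustered={clustered}: p-value ~ {p:.3f}")
    ```

    As the abstract notes, the conservative practice is to accept a method only if it passes against a null with realistic clustering, while rejection is already meaningful under an under-clustered null.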

  14. Discussion of New Approaches to Medium-Short-Term Earthquake Forecast in Practice of The Earthquake Prediction in Yunnan

    NASA Astrophysics Data System (ADS)

    Hong, F.

    2017-12-01

    Looking back on years of practice of earthquake prediction in the Yunnan area, it is widely considered that fixed-point earthquake precursory anomalies mainly reflect field information. Increases in the amplitude and number of precursory anomalies can help to determine the occurrence time of earthquakes; however, it is difficult to obtain a spatial relationship between earthquakes and precursory anomalies, so we can hardly predict the locations of earthquakes using precursory anomalies. Past practice has shown that seismic activity is superior to precursory anomalies in predicting earthquake locations, since increased seismicity was observed before 80% of M=6.0 earthquakes in the Yunnan area. Mobile geomagnetic anomalies have also turned out to be helpful in predicting earthquake locations in recent years; for instance, the forecast occurrence time and area derived from the 1-year-scale geomagnetic anomalies before the M6.5 Ludian earthquake in 2014 were shorter and smaller than those derived from the seismicity enhancement region. Based on this past work, the author believes that the medium-short-term earthquake forecast level, as well as objective understanding of seismogenic mechanisms, could be substantially improved by densely deployed observation arrays that capture the dynamic process of physical property changes in the enhancement regions of medium to small earthquakes.

  15. 75 FR 63854 - National Earthquake Prediction Evaluation Council (NEPEC) Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-18

    ... DEPARTMENT OF THE INTERIOR Geological Survey National Earthquake Prediction Evaluation Council...: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 2... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

  16. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    USGS Publications Warehouse

    Kilb, Debi; Gomberg, J.

    1999-01-01

    We examine the initial subevent (ISE) of the M 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short-period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

  17. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  18. Earthquake prediction: the interaction of public policy and science.

    PubMed Central

    Jones, L M

    1996-01-01

    Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, those causally related to the physical processes governing failure on a fault, to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors into informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be obtained by assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than just that it is causally related to the earthquake. PMID:11607656

  19. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    PubMed

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquakes and the relationship of past earthquake experience and gender to risk perception. Participants (n = 1,405), including earthquake survivors and members of the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of earthquake risk perception. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated that prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support the view that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on, and intervention programs with regard to, risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  20. Triggering Factor of Strong Earthquakes and Its Prediction Verification

    NASA Astrophysics Data System (ADS)

    Ren, Z. Q.; Ren, S. H.

    After 30 years' research, we have found that great earthquakes are triggered by the tide-generation force of the moon. It is not the tide-generation force in the classical viewpoint, but a non-classical one, which we call TGFR (Tide-Generation Forces' Resonance). TGFR strongly depends on the tide-generation force at the times of strange astronomical points (SAP). The SAP mostly occur when the moon and another celestial body are arranged with the earth along a straight line (with the same apparent right ascension or a 180° difference); the other SAP are the turning points of the moon's motion relative to the earth. Moreover, TGFR has four different types of effective areas. Our study indicates that a majority of earthquakes are triggered by the rare superimposition of TGFR effective areas. In China, the great earthquakes in the plain area of Hebei Province, Taiwan, Yunnan Province and Sichuan Province are triggered by decompression TGFR; other earthquakes, in Gansu Province, Ningxia Province and to the northwest of Beijing, are triggered by compression TGFR. The great earthquakes in Japan, California and southeastern Europe are also triggered by compression TGFR, while in other parts of the world, such as the Philippines, Central American countries and West Asia, great earthquakes are triggered by decompression TGFR. We have carried out experimental imminent prediction combining the TGFR method with other earthquake-impending signals, such as those suggested by Professor Li Junzhi. The success ratio is about 40% (from our forecast reports to the China Seismological Administration). Thus we could say that great earthquakes can be predicted (including imminent earthquake prediction). Key words: imminent prediction; triggering factor; TGFR (Tide-Generation Forces' Resonance); TGFR compression; TGFR compression zone; TGFR decompression; TGFR decompression zone

  1. Using remote sensing to predict earthquake impacts

    NASA Astrophysics Data System (ADS)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. In 2016, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to properties, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is suited to the observation of any surface deformation. The database is a cluster of information on the consequences of the earthquakes, grouped into property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, among others, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of this type of earthquake. In the future, we can enrich the database with more regions and enhance the variety of its applications; for instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary model of emergency for immediate evacuation and quick recovery response. It is important to know how the surface moves in particular geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research we may assess the damage that could be caused in the future.

  2. Sociological aspects of earthquake prediction

    USGS Publications Warehouse

    Spall, H.

    1979-01-01

    Henry Spall talked recently with Denis Mileti, who is in the Department of Sociology, Colorado State University, Fort Collins, Colo. Dr. Mileti is a sociologist involved with research programs that study the socioeconomic impact of earthquake prediction.

  3. Earthquake forecasts for the CSEP Japan experiment based on the RI algorithm

    NASA Astrophysics Data System (ADS)

    Nanjo, K. Z.

    2011-03-01

    An earthquake forecast testing experiment for Japan, the first of its kind, is underway within the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) under a controlled environment. Here we give an overview of the earthquake forecast models, based on the RI algorithm, which we have submitted to the CSEP Japan experiment. Models have been submitted to a total of 9 categories, corresponding to 3 testing classes (3 years, 1 year, and 3 months) and 3 testing regions. The RI algorithm is originally a binary forecast system based on the working assumption that large earthquakes are more likely to occur in the future at locations of higher seismicity in the past. It is based on simple counts of the number of past earthquakes, which is called the Relative Intensity (RI) of seismicity. To improve its forecast performance, we first expand the RI algorithm by introducing spatial smoothing. We then convert the RI representation from a binary system to a CSEP-testable model that produces forecasts for the number of earthquakes of predefined magnitudes. We use information on past seismicity to tune the parameters. The final submittal consists of 36 executable computer codes: 4 variants corresponding to different smoothing parameters for each of the 9 categories. They will help to elucidate which categories and which smoothing parameters are the most meaningful for the RI hypothesis. The main purpose of our participation in the experiment is to better understand the significance of the relative intensity of seismicity for earthquake forecastability in Japan.
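
    The three steps of the RI construction described above (cell counts of past seismicity, spatial smoothing, conversion to an expected-number forecast) can be sketched compactly. This is a toy illustration, not the submitted codes: the grid resolution, the simple box smoothing kernel, and the synthetic epicentres are all assumptions standing in for the model's actual choices.

    ```python
    import numpy as np

    # Hypothetical past-earthquake epicentres over a Japan-sized lon/lat box.
    rng = np.random.default_rng(1)
    lon = rng.uniform(130.0, 145.0, 500)
    lat = rng.uniform(30.0, 45.0, 500)

    # 1) Relative Intensity: count past events per 0.5-degree cell.
    counts, _, _ = np.histogram2d(lon, lat, bins=30, range=[[130, 145], [30, 45]])

    # 2) Spatial smoothing: average each cell with its neighbours
    #    (a 3x3 box kernel standing in for whatever kernel a variant uses).
    kernel = np.ones((3, 3)) / 9.0
    padded = np.pad(counts, 1, mode="edge")
    smooth = np.zeros_like(counts)
    for i in range(counts.shape[0]):
        for j in range(counts.shape[1]):
            smooth[i, j] = (padded[i:i + 3, j:j + 3] * kernel).sum()

    # 3) Convert the binary/ranked RI picture to a CSEP-testable rate forecast:
    #    scale cell weights so the grid sums to the expected number of target
    #    earthquakes in the forecast window.
    expected_total = 10.0
    forecast = expected_total * smooth / smooth.sum()
    print(forecast.sum())
    ```

    Varying the smoothing width and the per-class `expected_total` is, in spirit, what distinguishes the 4 variants submitted to each of the 9 categories.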

  4. Earthquake prediction in Japan and natural time analysis of seismicity

    NASA Astrophysics Data System (ADS)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake, with its huge tsunami, devastated East Japan on 11 March 2011, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither short-term nor long-term. Seismologists were shocked because it was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake: in fact, throughout several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important task, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best way to pursue short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their history of no success, they defiantly changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which meant obtaining more funding for no-prediction research. The public were not, and are not, informed about this change. Obviously, earthquake prediction will be possible only when reliable precursory phenomena are caught, and we have insisted this would most likely be achieved through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake has an adverse effect on this view, although its epicenter was far offshore, out of the range of operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, from ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release at χ. In the case that Seismic Electric Signals

  5. Are Earthquakes Predictable? A Study on Magnitude Correlations in Earthquake Catalog and Experimental Data

    NASA Astrophysics Data System (ADS)

    Stavrianaki, K.; Ross, G.; Sammonds, P. R.

    2015-12-01

    The clustering of earthquakes in time and space is widely accepted; however, the existence of correlations in earthquake magnitudes is more questionable. In standard models of seismic activity, it is usually assumed that magnitudes are independent and therefore in principle unpredictable. Our work seeks to test this assumption by analysing magnitude correlations between earthquakes and their aftershocks. To separate mainshocks from aftershocks, we perform stochastic declustering based on the widely used Epidemic Type Aftershock Sequence (ETAS) model, which allows us to compare the average magnitude of each aftershock sequence to the magnitude of its mainshock. The earthquake magnitude correlation results were compared with acoustic emissions (AE) from laboratory analog experiments, since fracturing generates both AE at the laboratory scale and earthquakes on the crustal scale. Constant stress and constant strain rate experiments were performed on Darley Dale sandstone under confining pressure to simulate depth of burial. Microcracking activity inside the rock volume was analyzed with the AE technique as a proxy for earthquakes. Applying the ETAS model to the experimental data allowed us to validate our results and provide, for the first time, a holistic view of the correlation of earthquake magnitudes. Additionally, we examine the relationship between the conditional intensity estimates of the ETAS model and the earthquake magnitudes; a positive relation would suggest the existence of magnitude correlations. The aim of this study is to observe any trends of dependency between the magnitudes of aftershock earthquakes and the earthquakes that trigger them.
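    The mainshock-aftershock magnitude comparison can be sketched as follows. The catalog structure and numbers are hypothetical; a real analysis would assign aftershocks to mainshocks using the probabilities from a fitted ETAS stochastic declustering, not a fixed assignment:

    ```python
    import numpy as np

    # Hypothetical declustered catalog: each aftershock carries the id of
    # the mainshock it was (stochastically) assigned to.
    mainshocks = {1: 6.1, 2: 5.4, 3: 6.8}          # id -> magnitude
    aftershocks = [(1, 4.2), (1, 3.9), (2, 3.5),   # (mainshock id, magnitude)
                   (2, 3.8), (3, 4.9), (3, 4.4), (3, 4.1)]

    def magnitude_correlation(mainshocks, aftershocks):
        """Pearson correlation between mainshock magnitude and the mean
        magnitude of its aftershock sequence."""
        seq = {}
        for mid, mag in aftershocks:
            seq.setdefault(mid, []).append(mag)
        x = np.array([mainshocks[mid] for mid in seq])      # mainshock M
        y = np.array([np.mean(m) for m in seq.values()])    # mean aftershock M
        return float(np.corrcoef(x, y)[0, 1])

    r = magnitude_correlation(mainshocks, aftershocks)
    print(r)  # r near +1 would suggest magnitude correlations
    ```

    Under the independence assumption of standard models, r should scatter around zero for declustered data.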

  6. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

    NASA Astrophysics Data System (ADS)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

    2017-07-01

    According to China's need to improve its capability for earthquake disaster prevention, this paper puts forward an implementation plan for a GIS-based earthquake disaster prediction system for Langfang city. Building on a GIS spatial database, coordinate transformation technology, GIS spatial analysis technology and PHP development technology, a seismic damage factor algorithm is used to predict the damage to the city under earthquake disaster scenarios of different intensities. The system adopts a B/S (browser/server) architecture and provides two-dimensional visualization of the damage degree and its spatial distribution, comprehensive query and analysis, and efficient decision-support functions to identify the seismically weak parts of the city and to enable rapid warning. The system has realized the transformation of the city's earthquake disaster reduction work from static planning to dynamic management, and improved the city's earthquake and disaster prevention capability.
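    The core prediction step of such a system is, in essence, a lookup of damage ratios by scenario intensity and building class, aggregated over the exposed building stock. A minimal sketch; the damage ratios, building classes and areas below are hypothetical, not values from the paper:

    ```python
    # Hypothetical mean damage ratios by seismic intensity and building class.
    DAMAGE_RATIO = {  # intensity -> {building class: mean damage ratio}
        6: {"masonry": 0.05, "frame": 0.02},
        7: {"masonry": 0.15, "frame": 0.07},
        8: {"masonry": 0.35, "frame": 0.18},
    }

    def predicted_loss(building_stock, intensity):
        """Expected damaged floor area per building class for one scenario."""
        ratios = DAMAGE_RATIO[intensity]
        return {cls: area * ratios[cls] for cls, area in building_stock.items()}

    stock = {"masonry": 120_000.0, "frame": 300_000.0}  # floor area, m^2
    print(predicted_loss(stock, 8))
    ```

    In a GIS setting the same calculation would run per spatial cell, producing the damage-degree map the system visualizes.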

  7. Earthquake Prediction in a Big Data World

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution, which began only about 15 years ago, has already pushed global information storage capacity beyond 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provides unprecedented opportunities for enhancing studies of the Earth System. However, it also opens wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task and demands a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), claims of a method's high potential are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done before claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies is evident, in particular the error diagram introduced by G.M. Molchan in the early 1990s and the Seismic Roulette null hypothesis as a metric of the alerted space. The set of errors, i.e. the rate of failures-to-predict and the alerted space-time volume, can easily be compared to random guessing; this comparison permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. 
    These and other results obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if these are reliable though not necessarily perfect, with related recommendations on the level of risk for decision making in regard to engineering design, insurance
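    A single point on a Molchan error diagram is simple to compute from an alarm record: the fraction τ of alerted space-time versus the miss rate ν, compared against the random-guessing diagonal ν = 1 − τ (the "Seismic Roulette" null hypothesis). A minimal sketch with a toy alarm record:

    ```python
    import numpy as np

    def molchan_point(alarm_mask, quake_mask):
        """One point on a Molchan error diagram.

        alarm_mask: boolean array over space-time cells, True where an alarm
                    was declared.
        quake_mask: same shape, True in cells containing a target earthquake.
        Returns (tau, nu): alerted fraction and miss rate. Random guessing
        satisfies nu = 1 - tau; skill means falling below that diagonal.
        """
        alarm = np.asarray(alarm_mask, bool)
        quake = np.asarray(quake_mask, bool)
        tau = float(alarm.mean())
        nu = 1.0 - float((alarm & quake).sum()) / float(quake.sum())
        return tau, nu

    # Toy record: 10 cells, alarms in 3 of them, 2 target earthquakes
    alarms = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0], bool)
    quakes = np.array([1, 0, 0, 0, 1, 0, 0, 0, 0, 0], bool)
    tau, nu = molchan_point(alarms, quakes)
    print(tau, nu)  # skill if nu < 1 - tau
    ```

    Sweeping an alarm threshold traces the full error curve, whose distance from the diagonal feeds directly into the cost-benefit optimization mentioned above.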

  8. Risk and return: evaluating Reverse Tracing of Precursors earthquake predictions

    NASA Astrophysics Data System (ADS)

    Zechar, J. Douglas; Zhuang, Jiancang

    2010-09-01

    In 2003, the Reverse Tracing of Precursors (RTP) algorithm attracted the attention of seismologists and international news agencies when researchers claimed two successful predictions of large earthquakes. These researchers had begun applying RTP to seismicity in Japan, California, the eastern Mediterranean and Italy; they have since applied it to seismicity in the northern Pacific, Oregon and Nevada. RTP is a pattern recognition algorithm that uses earthquake catalogue data to declare alarms, and these alarms indicate that RTP expects a moderate to large earthquake in the following months. The spatial extent of alarms is highly variable and each alarm typically lasts 9 months, although the algorithm may extend alarms in time and space. We examined the record of alarms and outcomes since the prospective application of RTP began, and in this paper we report on the performance of RTP to date. To analyse these predictions, we used a recently developed approach based on a gambling score, and we used a simple reference model to estimate the prior probability of target earthquakes for each alarm. Formally, we believe that RTP investigators did not rigorously specify the first two 'successful' predictions in advance of the relevant earthquakes; because this issue is contentious, we consider analyses with and without these alarms. When we include the contentious alarms, RTP predictions demonstrate statistically significant skill. Under a stricter interpretation, the predictions are marginally unsuccessful.

  9. Prospects for earthquake prediction and control

    USGS Publications Warehouse

    Healy, J.H.; Lee, W.H.K.; Pakiser, L.C.; Raleigh, C.B.; Wood, M.D.

    1972-01-01

    The San Andreas fault is viewed, according to the concepts of seafloor spreading and plate tectonics, as a transform fault that separates the Pacific and North American plates and along which relative movements of 2 to 6 cm/year have been taking place. The resulting strain can be released by creep, by earthquakes of moderate size, or (as near San Francisco and Los Angeles) by great earthquakes. Microearthquakes, as mapped by a dense seismograph network in central California, generally coincide with zones of the San Andreas fault system that are creeping. Microearthquakes are few and scattered in zones where elastic energy is being stored. Changes in the rate of strain, as recorded by tiltmeter arrays, have been observed before several earthquakes of about magnitude 4. Changes in fluid pressure may control timing of seismic activity and make it possible to control natural earthquakes by controlling variations in fluid pressure in fault zones. An experiment in earthquake control is underway at the Rangely oil field in Colorado, where the rates of fluid injection and withdrawal in experimental wells are being controlled. © 1972.

  10. Empirical models for the prediction of ground motion duration for intraplate earthquakes

    NASA Astrophysics Data System (ADS)

    Anbazhagan, P.; Neaz Sheikh, M.; Bajaj, Ketan; Mariya Dayana, P. J.; Madhura, H.; Reddy, G. R.

    2017-07-01

    Many empirical relationships for earthquake ground motion duration have been developed for interplate regions, whereas only a very limited number exist for intraplate regions. Moreover, the existing relationships were developed based mostly on scaled recordings of interplate earthquakes used to represent intraplate earthquakes. To the authors' knowledge, none of the existing relationships for intraplate regions was developed using data from intraplate regions alone. Therefore, an attempt is made in this study to develop empirical predictive relationships of earthquake ground motion duration (i.e., significant and bracketed) in terms of earthquake magnitude, hypocentral distance, and site conditions (i.e., rock and soil sites), using data compiled from the intraplate regions of Canada, Australia, Peninsular India, and the central and southern parts of the USA. The compiled earthquake ground motion data consist of 600 records with moment magnitudes ranging from 3.0 to 6.5 and hypocentral distances ranging from 4 to 1000 km. Non-linear mixed-effects (NLME) and logistic regression techniques (the latter to account for zero durations) were used to fit predictive models to the duration data. Bracketed duration was found to decrease with increasing hypocentral distance and to increase with increasing earthquake magnitude. Significant duration was found to increase with both the magnitude and the hypocentral distance of the earthquake. Both significant and bracketed durations were predicted to be longer at rock sites than at soil sites. The predictive relationships developed herein are compared with existing relationships for interplate and intraplate regions. The developed relationship for bracketed duration predicts lower durations for rock and soil sites. 
    However, the developed relationship for significant duration predicts lower durations up to a certain distance and thereafter predicts higher durations compared to the
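    Duration prediction equations of this kind typically take a log-linear form in magnitude, distance and a site term. A minimal sketch of such a functional form; the coefficients below are invented for illustration and are not the paper's fitted NLME values:

    ```python
    import numpy as np

    # Hypothetical coefficients of ln(D) = C1 + C2*M + C3*ln(R) + C4*S,
    # where S = 1 for soil sites. C4 < 0 encodes "longer at rock sites".
    C1, C2, C3, C4 = -1.0, 0.45, 0.35, -0.15

    def significant_duration(mag, r_hyp_km, soil=False):
        """Predicted significant duration (s) for magnitude and distance."""
        ln_d = C1 + C2 * mag + C3 * np.log(r_hyp_km) + C4 * (1.0 if soil else 0.0)
        return float(np.exp(ln_d))

    # Under this form, duration grows with both magnitude and distance
    print(significant_duration(5.0, 50.0), significant_duration(6.5, 200.0))
    ```

    The logistic regression component mentioned in the abstract would sit in front of this, modeling the probability that the (bracketed) duration is zero at large distances.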

  11. Predictability of population displacement after the 2010 Haiti earthquake

    PubMed Central

    Lu, Xin; Bengtsson, Linus; Holme, Petter

    2012-01-01

    Most severe disasters cause large population movements. These movements make it difficult for relief organizations to efficiently reach people in need. Understanding and predicting the locations of affected people during disasters is key to effective humanitarian relief operations and to long-term societal reconstruction. We collaborated with the largest mobile phone operator in Haiti (Digicel) and analyzed the movements of 1.9 million mobile phone users during the period from 42 d before to 341 d after the devastating Haiti earthquake of January 12, 2010. Nineteen days after the earthquake, population movements had caused the population of the capital Port-au-Prince to decrease by an estimated 23%. Both the travel distances and the size of people’s movement trajectories grew after the earthquake. These findings, in combination with the disorder present after the disaster, suggest that people’s movements would have become less predictable. Instead, the predictability of people’s trajectories remained high and even increased slightly during the three-month period after the earthquake. Moreover, the destinations of people who left the capital during the first three weeks after the earthquake were highly correlated with their mobility patterns during normal times, and specifically with the locations in which people had significant social bonds. For the people who left Port-au-Prince, the duration of their stay outside the city and the time of their return both followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought. PMID:22711804

  12. Applications of the gambling score in evaluating earthquake predictions and forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2010-05-01

    This study presents a new method, the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat every earthquake equally regardless of magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, a forecaster who makes a prediction or forecast is assumed to have bet some of those reputation points. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains on success, and takes away the reputation points bet by the forecaster on failure. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions issued by the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
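    The discrete version of the scheme can be sketched in a few lines. The fair-odds rule below (win r·(1−p0)/p0, lose r, so the expected score under the reference model is zero) is one natural reading of "a fair rule"; the bet sizes and probabilities are illustrative:

    ```python
    def gambling_score(bets):
        """Net reputation change under a fair-odds gambling rule.

        bets: iterable of (r, p0, success) tuples, where r is the number of
        reputation points bet, p0 is the reference model's probability of the
        predicted event, and success is whether the event occurred.
        A win returns r*(1-p0)/p0 points, a loss forfeits the r points bet,
        so the expectation under the reference model is exactly zero.
        """
        score = 0.0
        for r, p0, success in bets:
            score += r * (1.0 - p0) / p0 if success else -r
        return score

    # Beating long odds (p0 = 0.1) earns a large reward; losing an even
    # bet (p0 = 0.5) costs only the stake.
    print(gambling_score([(1, 0.1, True), (1, 0.5, False)]))
    ```

    The reward scales inversely with p0, which is how the score compensates for the risk taken: predicting an event the reference model already deems likely earns little.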

  13. Sun-earth environment study to understand earthquake prediction

    NASA Astrophysics Data System (ADS)

    Mukherjee, S.

    2007-05-01

    Earthquake prediction may be possible by monitoring the location of active sunspots before they direct energy towards the Earth. Earth is a restless planet, and that restlessness occasionally turns deadly; of all natural hazards, earthquakes are the most feared. For centuries, scientists working in seismically active regions have noted premonitory signals: changes in the thermosphere, ionosphere, atmosphere and hydrosphere are noted before changes in the geosphere. Historical records speak of changes in the water level in wells, of strange weather, of ground-hugging fog, and of unusual behaviour of animals (attributed to changes in the Earth's magnetic field) that seem to sense the approach of a major earthquake. With the advent of modern science and technology, the understanding of these pre-earthquake signals has become strong enough to develop a methodology of earthquake prediction. It has been possible to develop a correlation between Earth-directed coronal mass ejections (CMEs) from active sunspots and earthquakes as a precursor. Occasional changes in the local magnetic field and in planetary indices (Kp values) occur in the lower atmosphere, accompanied by the formation of haze and a reduction of moisture in the air. Large patches, often tens to hundreds of thousands of square kilometres in size, are seen in night-time infrared satellite images where the land surface temperature seems to fluctuate rapidly. Perturbations in the ionosphere at 90-120 km altitude have been observed before the occurrence of earthquakes; these changes affect the transmission of radio waves, and radio blackouts have been observed due to CMEs. Another heliophysical parameter, electron flux (Eflux), has been monitored before the occurrence of earthquakes. More than a hundred case studies show that before an earthquake the atmospheric temperature increases and then suddenly drops. These changes are being monitored using the Solar and Heliospheric Observatory

  14. Next-Day Earthquake Forecasts for California

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Jackson, D. D.; Kagan, Y. Y.

    2008-12-01

    We implemented a daily forecast of m > 4 earthquakes for California in a format suitable for testing in community-based earthquake predictability experiments: the Regional Earthquake Likelihood Models (RELM) project and the Collaboratory for the Study of Earthquake Predictability (CSEP). The forecast is based on near-real-time earthquake reports from the ANSS catalog above magnitude 2 and will be available online. The model used to generate the forecasts is based on the Epidemic-Type Earthquake Sequence (ETES) model, a stochastic model of clustered and triggered seismicity. Our particular implementation builds on the earlier work of Helmstetter et al. (2006, 2007), but we extended the forecast to all of California, used more data to calibrate the model and its parameters, and made some modifications. Our forecasts will compete against the Short-Term Earthquake Probabilities (STEP) forecasts of Gerstenberger et al. (2005) and other models in the next-day testing class of the CSEP experiment in California. We illustrate our forecasts with examples and discuss preliminary results.
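    The clustered-seismicity ingredient of an ETES/ETAS-type next-day forecast can be sketched as a sum of Omori-Utsu aftershock rates over past events, with exponential magnitude scaling. Parameter values below are illustrative only, not the calibrated values of the paper:

    ```python
    # Hypothetical productivity, magnitude-scaling and Omori-Utsu parameters.
    K, ALPHA, C, P, M0 = 0.02, 1.0, 0.01, 1.1, 2.0

    def expected_rate(t_days, catalog):
        """Expected triggered-event rate at time t from past (t_i, m_i).

        Each past event contributes K * 10^(alpha*(m_i - M0)) / (t - t_i + c)^p,
        i.e. productivity growing with magnitude and decaying with elapsed time.
        """
        rate = 0.0
        for t_i, m_i in catalog:
            if t_i < t_days:
                rate += K * 10 ** (ALPHA * (m_i - M0)) / (t_days - t_i + C) ** P
        return rate

    past = [(0.0, 5.5), (3.0, 4.2)]      # (origin day, magnitude)
    print(expected_rate(4.0, past))      # forecast rate for day 4
    ```

    A full next-day forecast would distribute this rate over spatial cells and add a smoothed background term; the daily update simply re-evaluates the sum with the latest catalog.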

  15. Testing for the 'predictability' of dynamically triggered earthquakes in The Geysers geothermal field

    NASA Astrophysics Data System (ADS)

    Aiken, Chastity; Meng, Xiaofeng; Hardebeck, Jeanne

    2018-03-01

    The Geysers geothermal field is well known for being susceptible to dynamic triggering of earthquakes by large distant earthquakes, owing to the introduction of fluids for energy production. Yet, it is unknown whether dynamic triggering of earthquakes is 'predictable' or whether it could pose a hazard for energy production. In this paper, our goal is to investigate the characteristics of triggering and the physical conditions that promote it, to determine whether or not triggering is in any way foreseeable. We find that, at present, triggering at The Geysers is not easily 'predictable' in terms of when and where on the basis of observable physical conditions. However, triggered earthquake magnitude correlates positively with peak imparted dynamic stress, and larger dynamic stresses tend to trigger sequences similar to mainshock-aftershock sequences. Thus, we may be able to 'predict' what size earthquakes to expect at The Geysers following a large distant earthquake.
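    The "peak imparted dynamic stress" used in such studies is commonly estimated from the peak ground velocity of the passing waves as σ ≈ μ·PGV/Vs. A back-of-envelope sketch with typical crustal values (the μ and Vs below are standard assumptions, not measurements from the paper):

    ```python
    # sigma ~ mu * PGV / Vs: shear modulus times peak ground velocity
    # divided by the phase (shear-wave) velocity of the passing wave.
    MU = 3.0e10      # shear modulus, Pa (typical crustal value)
    VS = 3.5e3       # shear/phase velocity, m/s

    def peak_dynamic_stress_kpa(pgv_m_per_s):
        """Peak dynamic stress (kPa) imparted by a wave with the given PGV."""
        return MU * pgv_m_per_s / VS / 1e3

    # 1 mm/s of PGV corresponds to a dynamic stress of order 10 kPa
    print(peak_dynamic_stress_kpa(0.001))
    ```

    Stresses at this kPa level are tiny compared with tectonic stress drops, which is what makes triggering at such amplitudes notable.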

  16. Signals of ENPEMF Used in Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Hao, G.; Dong, H.; Zeng, Z.; Wu, G.; Zabrodin, S. M.

    2012-12-01

    The signals of the Earth's natural pulse electromagnetic field (ENPEMF) are a combination of abnormal crustal magnetic field pulses affected by earthquakes, the induced field of the Earth's endogenous magnetic field, the induced magnetic field of the exogenous variation magnetic field, geomagnetic pulsation disturbances, and other energy coupling processes between the Sun and the Earth. As an instantaneous disturbance of the variation field of natural geomagnetism, ENPEMF can be used to predict earthquakes. The theory was introduced by A.A. Vorobyov, who hypothesized that pulses can arise not only in the atmosphere but also within the Earth's crust due to processes of tectonic-to-electric energy conversion (Vorobyov, 1970; Vorobyov, 1979). The global field time scale of ENPEMF signals has specific stability: although the wave curves may not overlap completely in different regions, the smoothed diurnal ENPEMF patterns always exhibit the same trend per month. This feature is a good reference for observing abnormalities of the Earth's natural magnetic field in a specific region. The frequencies of ENPEMF signals generally lie in the kilohertz range, and frequencies within the 5-25 kHz range can be applied to monitor earthquakes; in Wuhan, the best observation frequency is 14.5 kHz. Two special devices are placed in alignment with the S-N and W-E directions. A dramatic variation of the pulse waveforms obtained from the instruments relative to the normal reference envelope diagram indicates a high possibility of an earthquake. The proposed ENPEMF-based earthquake detection method can improve the effect of geodynamic monitoring and can enrich earthquake prediction methods. We suggest that promising directions for further research are the exact source composition of ENPEMF signals, the distinction between noise and useful signals, and the effects of the Earth's gravity tide and solid tidal wave. This method may also provide a promising application in
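    The envelope comparison described above can be illustrated with a toy pulse signal: rectify the record, smooth it into an envelope, and flag excursions beyond a reference level. The synthetic signal, window length and threshold are illustrative only, not the instruments' actual processing chain:

    ```python
    import numpy as np

    FS = 100_000                      # sampling rate, Hz (illustrative)

    def envelope(x, win=200):
        """Rectified, moving-average envelope of a signal."""
        kernel = np.ones(win) / win
        return np.convolve(np.abs(x), kernel, mode="same")

    t = np.arange(0, 0.05, 1 / FS)
    signal = np.sin(2 * np.pi * 14_500 * t)   # quiet background at 14.5 kHz
    signal[2000:2500] *= 5.0                   # burst of anomalous pulses
    env = envelope(signal)
    reference = np.median(env)                 # normal envelope level
    anomalous = env.max() > 3 * reference      # simple excursion criterion
    print(anomalous)
    ```

    A real system would compare against the monthly smoothed diurnal pattern rather than a single median, but the flag logic is the same in spirit.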

  17. Earthquake prediction research at the Seismological Laboratory, California Institute of Technology

    USGS Publications Warehouse

    Spall, H.

    1979-01-01

    Nevertheless, basic earthquake-related information has always been of consuming interest to the public and the media in this part of California (fig. 2). So it is not surprising that earthquake prediction continues to be a significant research program at the laboratory. Several of the current spectrum of projects related to prediction are discussed below. 

  18. Dim prospects for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Geller, Robert J.

    I was misquoted in C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: "I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a 'proven nonscience' [Geller, 1997a] is a paradigm for others to copy." Readers are invited to verify for themselves that neither "proven nonscience" nor any similar phrase was used by Geller [1997a].

  19. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    NASA Astrophysics Data System (ADS)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew up on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, around May 11, 2011, a planetary alignment was actually expected, and this contributed to giving the earthquake prediction credibility among the public. During the preceding months, INGV was overwhelmed with requests for information about this supposed prediction from Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and the earthquake as a natural phenomenon. The Open Day was preceded by a press conference two days earlier, in which we discussed the prediction, presented the Open Day, and held a scientific discussion with journalists about earthquake prediction and, more generally, the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies and web news outlets attended the press conference, and hundreds of articles appeared in the following days, advertising the May 11 Open Day. INGV opened to the public all day long (9am - 9pm) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, open 24/7 all year; iii) guided tours through interactive exhibitions on earthquakes and the Earth's deep structure; iv) lectures on general topics, from the social impact of rumors to seismic risk reduction; v) 13 new videos on the YouTube.com/INGVterremoti channel to explain the earthquake process and give updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV

  20. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.
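    The consistency tests used in such evaluations include, for example, the CSEP N-test, which checks whether the observed number of target earthquakes is plausible given a forecast's expected number under a Poisson assumption. A minimal sketch; the forecast and observed counts are illustrative:

    ```python
    import math

    def n_test(n_forecast, n_observed):
        """Two one-sided Poisson tail probabilities (delta1, delta2).

        delta1 = P(N >= n_observed | n_forecast)  -- forecast too low?
        delta2 = P(N <= n_observed | n_forecast)  -- forecast too high?
        """
        def poisson_cdf(k, lam):
            return sum(math.exp(-lam) * lam**i / math.factorial(i)
                       for i in range(k + 1))
        delta1 = 1.0 - poisson_cdf(n_observed - 1, n_forecast)
        delta2 = poisson_cdf(n_observed, n_forecast)
        return delta1, delta2

    d1, d2 = n_test(n_forecast=10.0, n_observed=14)
    consistent = min(d1, d2) > 0.025   # pass at the 5% (two-sided) level
    print(d1, d2, consistent)
    ```

    A forecast fails when either tail probability is tiny, i.e. when the observed count is far into one tail of the forecast distribution.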

  1. Earthquake prediction with electromagnetic phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp; Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo; Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, and is considered one of the most important and urgent topics for human beings. If such short-term prediction is realized, casualties will be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs in prediction, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the "impossibility myth" of EQ prediction by seismometers, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  2. Seismo-induced effects in the near-earth space: Combined ground and space investigations as a contribution to earthquake prediction

    NASA Astrophysics Data System (ADS)

    Sgrigna, V.; Buzzi, A.; Conti, L.; Picozza, P.; Stagni, C.; Zilpimiani, D.

    2007-02-01

    The paper aims to give a few methodological suggestions for deterministic earthquake prediction studies based on combined ground-based and space observations of earthquake precursors. What is still lacking is a demonstration of a causal relationship, with explained physical processes, supported by correlations between data gathered simultaneously and continuously by space observations and ground-based measurements. Coordinated space and ground-based observations require available test sites on the Earth's surface so that ground data, collected by appropriate networks of instruments, can be correlated with space data detected on board LEO satellites. To this purpose, a new result reported in the paper is an original and specific space mission project (ESPERIA) together with two instruments of its payload. The ESPERIA space project was designed for the Italian Space Agency, and three ESPERIA instruments (the ARINA and LAZIO particle detectors and the EGLE search-coil magnetometer) have been built and tested in space. The EGLE experiment started on April 15, 2005 on board the ISS, within the ENEIDE mission; the launch of ARINA occurred on June 15, 2006, on board the RESURS DK-1 Russian LEO satellite. As an introduction to and justification of these experiments, the paper clarifies some basic concepts and critical methodological aspects concerning deterministic and statistical approaches and their use in earthquake prediction. We also take the liberty of giving the scientific community a few critical hints based on our personal experience in the field and propose a joint study devoted to earthquake prediction and warning.

  3. Long-term predictability of regions and dates of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    Results on the long-term predictability of strong earthquakes are discussed. It is shown that the dates of earthquakes with M>5.5 can be determined several months before the event; the magnitude and the region of an approaching earthquake can be specified within a time frame of a month before the event. The number of M6+ earthquakes expected to occur during the analyzed year is determined using a special sequence diagram of seismic activity over a century time frame. This date analysis can be performed 15-20 years in advance, and the data are verified by a monthly sequence diagram of seismic activity. The number of strong earthquakes expected to occur in the analyzed month is determined by several methods having different prediction horizons. Days of potential earthquakes with M5.5+ are determined using astronomical data: earthquakes occur on days of oppositions of Solar System planets (arranged in a single line), and the strongest earthquakes occur when the vector from the Sun to the Solar System barycenter lies in the ecliptic plane. Details of this astronomical multivariate indicator still require further research, but its practical significance is confirmed in practice. Another empirical indicator of an approaching M6+ earthquake is a synchronous variation of meteorological parameters: an abrupt decrease of the minimum daily temperature, an increase of relative humidity, and an abrupt change of atmospheric pressure (the RAMES method). The time difference between the predicted and actual dates is no more than one day. This indicator is registered 104 days before the earthquake, so it was called Harmonic 104, or H-104. This fact looks paradoxical, but the works of A. Sytinskiy and V. Bokov on the correlation of global atmospheric circulation and seismic events give a physical basis for this empirical fact. Also, 104 days is a quarter of the Chandler period, which gives insight into the correlation between the anomalies of Earth orientation

  4. Earthquake prediction in seismogenic areas of the Iberian Peninsula based on computational intelligence

    NASA Astrophysics Data System (ADS)

    Morales-Esteban, A.; Martínez-Álvarez, F.; Reyes, J.

    2013-05-01

    A method to predict earthquakes in two of the seismogenic areas of the Iberian Peninsula, based on Artificial Neural Networks (ANNs), is presented in this paper. ANNs have been widely used in many fields, but only very few and very recent studies have applied them to earthquake prediction. Two kinds of predictions are provided in this study: a) the probability that an earthquake of magnitude equal to or larger than a preset threshold will occur within the next 7 days; b) the probability that an earthquake within a limited magnitude interval will occur during the next 7 days. First, the physical fundamentals of earthquake occurrence are explained. Second, the mathematical model underlying ANNs is explained and the chosen configuration is justified. The ANNs were then trained in both areas, the Alborán Sea and the Western Azores-Gibraltar fault, and subsequently tested in both areas for a period of time immediately following the training period. Statistical tests are provided showing meaningful results. Finally, the ANNs were compared to other well-known classifiers, showing quantitatively and qualitatively better results. The authors expect that the results obtained will encourage researchers to conduct further research on this topic. Highlights: development of a system capable of predicting earthquakes for the next seven days; application of ANNs particularly well suited to earthquake prediction; use of geophysical information modeling the soil behavior as the ANNs' input data; successful analysis of one region with large seismic activity.
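    The 7-day probability output of such a classifier can be illustrated with the simplest possible stand-in, a logistic model over seismicity features. The weights, bias and feature choices below are invented for illustration; the paper uses trained ANNs over geophysical inputs, not these numbers:

    ```python
    import numpy as np

    # Hypothetical logistic weights for two features: recent b-value drop
    # and recent event rate. A b-value drop raises the predicted probability.
    W = np.array([2.5, 1.8])
    B = -1.0

    def week_ahead_probability(features):
        """P(M >= threshold within 7 days | features), logistic stand-in."""
        z = float(np.dot(W, features) + B)
        return 1.0 / (1.0 + np.exp(-z))

    quiet  = week_ahead_probability([0.0, 0.1])   # stable b-value, low rate
    active = week_ahead_probability([0.4, 1.5])   # b-value drop, high rate
    print(quiet, active)
    ```

    An ANN replaces the fixed linear score z with a learned nonlinear function of the same kind of inputs; the output layer still maps to a probability in (0, 1).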

  5. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    NASA Astrophysics Data System (ADS)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute, the University of Tokyo joined CSEP and started the Japanese testing center, CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area of Japan including the surrounding sea, the Japanese mainland, and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP; approximately 300 rounds of experiments have been implemented in total. These results provide new knowledge concerning statistical forecasting models. We have started a study on constructing a 3-dimensional earthquake forecasting model for the Kanto district in Japan, based on the CSEP experiments, under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity of the area ranges from the shallow part down to a depth of 80 km due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop forecasting models based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0, as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HIST-ETAS models (Ogata, 2011) to learn whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). We use CSEP
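The official CSEP suite includes, among other consistency checks, a number test (N-test) that asks whether the observed event count is consistent with a forecast's expected rate under a Poisson assumption. A minimal two-sided N-test sketch (the real suite also includes likelihood and spatial tests; the form below is an illustration):

```python
import math

def poisson_cdf(n, lam):
    """P(N <= n) for a Poisson variable with mean lam."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(n + 1))

def n_test(n_obs, lam, alpha=0.05):
    """CSEP-style number test: is the observed count n_obs consistent with
    the forecast rate lam at significance alpha (two one-sided checks)?"""
    delta1 = 1.0 - poisson_cdf(n_obs - 1, lam)   # P(N >= n_obs)
    delta2 = poisson_cdf(n_obs, lam)             # P(N <= n_obs)
    return delta1 > alpha / 2 and delta2 > alpha / 2

# A forecast expecting 10 events is consistent with observing 8 ...
print(n_test(8, 10.0))   # True
# ... but not with observing 25.
print(n_test(25, 10.0))  # False
```

Models that systematically over- or under-predict the event rate, as happened after the seismicity change following the 2011 Tohoku-oki event, fail exactly this kind of test.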

  6. Implications of fault constitutive properties for earthquake prediction

    USGS Publications Warehouse

    Dieterich, J.H.; Kilgore, B.

    1996-01-01

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance D(c), apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of D(c) apply to faults in nature. However, scaling of D(c) is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
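The link between the seismicity formulation and Omori decay can be illustrated with Dieterich's (1994) closed-form rate change after a stress step: the rate jumps immediately after the step and relaxes back to the background rate over the aftershock duration. Parameter values below are illustrative only.

```python
import math

def dieterich_rate(t, r, dtau, a_sigma, t_a):
    """Seismicity rate at time t after a stress step dtau in Dieterich's
    rate-state seismicity formulation: r is the background rate, a_sigma
    the constitutive parameter A*sigma, and t_a the aftershock duration."""
    return r / ((math.exp(-dtau / a_sigma) - 1.0) * math.exp(-t / t_a) + 1.0)

# A positive stress step raises the rate sharply just after t = 0; the rate
# then decays Omori-like back toward the background rate r over t_a.
r, dtau, a_sigma, t_a = 1.0, 1.0, 0.2, 100.0
early = dieterich_rate(0.1, r, dtau, a_sigma, t_a)     # strongly elevated
late = dieterich_rate(1000.0, r, dtau, a_sigma, t_a)   # back near r
```

For t much smaller than t_a the expression behaves like 1/t, which is the Omori decay assigned physical meaning in the abstract above.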

  7. Implications of fault constitutive properties for earthquake prediction.

    PubMed Central

    Dieterich, J H; Kilgore, B

    1996-01-01

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks. PMID:11607666

  8. Implications of fault constitutive properties for earthquake prediction.

    PubMed

    Dieterich, J H; Kilgore, B

    1996-04-30

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.

  9. Measurement of neutron and charged particle fluxes toward earthquake prediction

    NASA Astrophysics Data System (ADS)

    Maksudov, Asatulla U.; Zufarov, Mars A.

    2017-12-01

In this paper, we describe a possible method for predicting earthquakes, based on simultaneous recording of the intensity of fluxes of neutrons and charged particles by detectors commonly used in nuclear physics. These low-energy particles originate from radioactive nuclear processes in the Earth's crust, and variations in the particle flux intensity can be a precursor of an earthquake. A description is given of an electronic installation that records the fluxes of charged particles in the radial direction, which are a possible response to the accumulated tectonic stresses in the Earth's crust. The results obtained showed an increase in the intensity of the fluxes for 10 or more hours before the occurrence of an earthquake. The previous version of the installation was able to indicate the possibility of an earthquake (Maksudov et al. in Instrum Exp Tech 58:130-131, 2015), but did not give information about the direction of the epicenter. The installation was therefore modified by adding eight directional detectors. With the upgraded setup, we have received both predictive signals and signals determining the direction of the forthcoming earthquake, starting 2-3 days before its origin.

  10. Earthquake prediction rumors can help in building earthquake awareness: the case of May the 11th 2011 in Rome (Italy)

    NASA Astrophysics Data System (ADS)

    Amato, A.; Arcoraci, L.; Casarotti, E.; Cultrera, G.; Di Stefano, R.; Margheriti, L.; Nostro, C.; Selvaggi, G.; May-11 Team

    2012-04-01

Banner headlines in an Italian newspaper read on May 11, 2011: "Absence boom in offices: the urban legend in Rome becomes psychosis". This was the effect of a large-magnitude earthquake prediction in Rome for May 11, 2011. This prediction was never officially released, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this increased the credibility of the earthquake prediction. Given the echo of this prediction, INGV decided to organize an Open Day at its headquarters in Rome on May 11 (the same day the earthquake was predicted to happen) to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded by a press conference two days before, attended by about 40 journalists from newspapers, local and national TV, press agencies, and web news magazines. Hundreds of articles appeared in the following two days, advertising the May 11 Open Day. On May 11 the INGV headquarters was peacefully invaded by over 3,000 visitors from 9 am to 9 pm: families, students, civil protection groups, and many journalists. The program included conferences on a wide variety of subjects (from the social impact of rumors to seismic risk reduction) and distribution of books and brochures, in addition to several activities: meetings with INGV researchers to discuss scientific issues, visits to the seismic monitoring room (staffed 24/7 all year), and guided tours through interactive exhibitions on earthquakes and the Earth's deep structure. During the same day, thirteen new videos were also posted on our youtube/INGVterremoti channel to explain the earthquake process and hazard, and to provide periodic real-time updates on seismicity in Italy. On May 11 no large earthquake happened in Italy. The initiative, built up in a few weeks, had a very large feedback

  11. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    NASA Astrophysics Data System (ADS)

    Stefansson, R.; Bonafede, M.

    2012-04-01

For 20 years the South Iceland Seismic Zone (SISZ) was a test site for multinational earthquake prediction research, partly bridging the gap between laboratory test samples and the huge transform zones of the Earth. The approach was to explore the physics of the processes leading up to large earthquakes. The book Advances in Earthquake Prediction, Research and Risk Mitigation by R. Stefansson (2011), published by Springer/PRAXIS, and an article in the August issue of the BSSA by Stefansson, M. Bonafede and G. Gudmundsson (2011) contain a good overview of the findings and further references, as well as examples of partially successful long- and short-term warnings based on such an approach. Significant findings are: earthquakes that occurred hundreds of years ago left scars in the crust, expressed in volumes of heterogeneity that demonstrate the size of their faults; rheology and stress heterogeneity within these volumes are significantly variable in time and space; crustal processes in and near such faults may be observed through microearthquake information decades before the sudden onset of a new large earthquake; and high-pressure fluids of mantle origin may, in response to strain, especially near plate boundaries, migrate upward into the brittle/elastic crust and play a significant role in modifying crustal conditions over the long and short term. The preparatory processes of different earthquakes cannot be expected to be the same. We learn about an impending earthquake by observing long-term preparatory processes at the fault, finding a constitutive relationship that governs the processes, and then extrapolating that relationship into nearby space and the near future. This is a deterministic approach to earthquake prediction research. Such extrapolations contain many uncertainties; however, the long-term pattern of observations of the pre-earthquake fault process will help us put probability constraints on our extrapolations and our warnings.
The approach described is different from the usual

  12. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    NASA Astrophysics Data System (ADS)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

Casualty prediction in a building during earthquakes supports economic loss estimation in the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties during earthquakes, few current studies consider occupant movements in the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation of occupant evacuation is verified against a recorded evacuation process from a school classroom in the real-life 2013 Ya'an earthquake in China. The occupant casualties in a building under earthquakes are evaluated by coupling the simulation of the building collapse process by the finite element method, the occupant evacuation simulation, and casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.

  13. Landscape scale prediction of earthquake-induced landsliding based on seismological and geomorphological parameters.

    NASA Astrophysics Data System (ADS)

    Marc, O.; Hovius, N.; Meunier, P.; Rault, C.

    2017-12-01

In tectonically active areas, earthquakes are an important trigger of landslides, with significant impact on hillslope and river evolution. However, detailed prediction of landslide locations and properties for a given earthquake remains difficult. In contrast, we propose landscape-scale analytical predictions of bulk coseismic landsliding, that is, the total landslide area and volume (Marc et al., 2016a) as well as the regional area within which most landslides must be distributed (Marc et al., 2017). The prediction is based on a limited number of seismological (seismic moment, source depth) and geomorphological (landscape steepness, threshold acceleration) parameters, and could therefore be implemented in landscape evolution models aiming to engage with erosion dynamics at the scale of the seismic cycle. To assess the model we have compiled and normalized estimates of total landslide volume, total landslide area and regional area affected by landslides for 40, 17 and 83 earthquakes, respectively. We have found that low landscape steepness systematically leads to overprediction of the total area and volume of landslides. When this effect is accounted for, the model is able to predict the landslide areas and associated volumes within a factor of 2 for about 70% of the cases in our databases. The prediction of the regional area affected does not require a calibration for landscape steepness and gives a prediction within a factor of 2 for 60% of the database. For 7 out of 10 comprehensive inventories we show that our prediction compares well with the smallest region around the fault containing 95% of the total landslide area. This is a significant improvement on a previously published empirical expression based only on earthquake moment. Some of the outliers seem related to exceptional rock mass strength in the epicentral area, or to shaking duration and other seismic source complexities ignored by the model.
Applications include prediction on the mass balance of earthquakes and

  14. The Ordered Network Structure and Prediction Summary for M≥7 Earthquakes in Xinjiang Region of China

    NASA Astrophysics Data System (ADS)

    Men, Ke-Pei; Zhao, Kai

    2014-12-01

M ≥ 7 earthquakes have shown an obvious commensurability and orderliness in Xinjiang of China and its adjacent region since 1800. The main orderly values are 30 a × k (k = 1, 2, 3), 11-12 a, 41-43 a, 18-19 a, and 5-6 a. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered network structure analysis with complex network technology to summarize the prediction of M ≥ 7 earthquakes using the ordered network structure, add new information to further optimize the network, and hence construct the 2D and 3D ordered network structures of M ≥ 7 earthquakes. The network structure fully reveals the regularity of seismic activity of M ≥ 7 earthquakes in the study region during the past 210 years. On this basis, the Karakorum M7.1 earthquake in 1996, the M7.9 earthquake on the frontier of Russia, Mongolia, and China in 2003, and the two Yutian M7.3 earthquakes in 2008 and 2014 were predicted successfully. At the same time, a new prediction opinion is presented: the next two M ≥ 7 earthquakes will probably occur around 2019-2020 and 2025-2026 in this region. The results show that large earthquakes occurring in a defined region can be predicted. The method of ordered network structure analysis produces satisfactory results for the mid- and long-term prediction of M ≥ 7 earthquakes.
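The commensurability idea (inter-event intervals clustering near multiples of a base period such as 30 a) can be illustrated with a toy check. The intervals, tolerance, and procedure below are illustrative assumptions, not the paper's ordered-network method.

```python
def commensurable(intervals, base, tol=1.5):
    """Return the intervals (in years) that lie within tol years of an
    integer multiple of the base period (k * base, k >= 1)."""
    hits = []
    for dt in intervals:
        k = max(1, round(dt / base))
        if abs(dt - k * base) <= tol:
            hits.append(dt)
    return hits

# Illustrative intervals between successive M >= 7 events; 30 a is the
# leading "orderly value" reported for the region.
intervals = [29, 31, 60, 11, 90, 44]
print(commensurable(intervals, 30))  # [29, 31, 60, 90]
```

Intervals of roughly 30, 60, and 90 years register as commensurable with the 30 a base, while 11 and 44 do not; the paper's 2D/3D network structures generalize this kind of ordering across many event pairs.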

  15. Source Model of Huge Subduction Earthquakes for Strong Ground Motion Prediction

    NASA Astrophysics Data System (ADS)

    Iwata, T.; Asano, K.

    2012-12-01

Constructing source models of huge subduction earthquakes is a very important issue for strong ground motion prediction. Irikura and Miyake (2001, 2011) proposed the characterized source model for strong ground motion prediction, which consists of multiple strong motion generation area (SMGA; Miyake et al., 2003) patches on the source fault. We obtained SMGA source models for many events using the empirical Green's function method and found that SMGA size has an empirical scaling relationship with seismic moment. Therefore, the SMGA size can be assumed from that empirical relation given the seismic moment of an anticipated earthquake. Concerning the positions of the SMGAs, information on fault segmentation is useful for inland crustal earthquakes. For the 1995 Kobe earthquake, three SMGA patches were obtained, and the Nojima, Suma, and Suwayama segments each have one SMGA in the SMGA modeling (e.g. Kamae and Irikura, 1998). For the 2011 Tohoku earthquake, Asano and Iwata (2012) estimated the SMGA source model and obtained four SMGA patches on the source fault. The total SMGA area follows the extension of the empirical scaling relationship between seismic moment and SMGA area for subduction plate-boundary earthquakes, which shows the applicability of the empirical scaling relationship for SMGAs. Two of the SMGAs are in the Miyagi-Oki segment, and the other two are in the Fukushima-Oki and Ibaraki-Oki segments, respectively. Asano and Iwata (2012) also pointed out that all SMGAs correspond to the historical source areas of the 1930s. Those SMGAs do not overlap the huge slip area in the shallower part of the source fault estimated from teleseismic data, long-period strong motion data, and/or geodetic data for the 2011 mainshock. This fact shows that the huge slip area does not contribute to strong ground motion generation (0.1-10 s). The information of the fault segment in the subduction zone, or

  16. Construction of Source Model of Huge Subduction Earthquakes for Strong Ground Motion Prediction

    NASA Astrophysics Data System (ADS)

    Iwata, T.; Asano, K.; Kubo, H.

    2013-12-01

Constructing source models of huge subduction earthquakes is a very important issue for strong ground motion prediction. Iwata and Asano (2012, AGU) summarized the scaling relationships of the large slip areas of heterogeneous slip models and total SMGA sizes with respect to seismic moment for subduction earthquakes, and found a systematic change in the ratio of SMGA to large slip area with seismic moment. They concluded this tendency would be caused by the difference in the period range of the source modeling analyses. In this paper, we try to establish a methodology for constructing source models of huge subduction earthquakes for strong ground motion prediction. Following the concept of the characterized source model for inland crustal earthquakes (Irikura and Miyake, 2001, 2011) and intra-slab earthquakes (Iwata and Asano, 2011), we introduce a prototype of the source model for huge subduction earthquakes and validate it by strong ground motion modeling.

  17. "Earthquake!"--A Cooperative Learning Experience.

    ERIC Educational Resources Information Center

    Hodder, A. Peter W.

    2001-01-01

Presents an exercise designed as a team-building experience for managers that can be used to demonstrate to science students the potential benefit of group decision-making. Involves the ranking of options for surviving a large earthquake. Yields quantitative measures of individual student knowledge and of how well the groups function. (Author/YDS)

  18. 78 FR 64973 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... updates on past topics of discussion, including work with social and behavioral scientists on improving... probabilities; USGS collaborative work with the Collaboratory for Study of Earthquake Predictability (CSEP...

  19. Possibility of Earthquake-prediction by analyzing VLF signals

    NASA Astrophysics Data System (ADS)

    Ray, Suman; Chakrabarti, Sandip Kumar; Sasmal, Sudipta

    2016-07-01

Prediction of seismic events is one of the most challenging jobs for the scientific community. The conventional way to predict earthquakes is to monitor crustal structure movements, though this method has not yet yielded satisfactory results and fails to give any short-term prediction. Recently, it has been noticed that prior to any seismic event a huge amount of energy is released which may create disturbances in the lower part of the D-layer/E-layer of the ionosphere. This ionospheric disturbance may be used as a precursor of earthquakes. Since VLF radio waves propagate inside the waveguide formed by the lower ionosphere and the Earth's surface, this signal may be used to identify ionospheric disturbances due to seismic activity. We have analyzed VLF signals to find correlations, if any, between VLF signal anomalies and seismic activities, using both case-by-case studies and a statistical analysis of a whole year of data. In both methods we found that the night-time amplitude of VLF signals fluctuated anomalously three days before the seismic events. We also found that the terminator time of the VLF signals shifted anomalously towards night time a few days before any major seismic event. We calculated the D-layer preparation time and D-layer disappearance time from the VLF signals and observed that both become anomalously high 1-2 days before seismic events. We also found strong evidence indicating that it may be possible in the future to predict the location of the epicenters of earthquakes by analyzing VLF signals along multiple propagation paths.
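One simple way to flag anomalous night-time amplitude fluctuations of the kind described above is a k-sigma threshold on the amplitude series. This is a sketch of the idea only, not the authors' actual detection procedure, and the data are synthetic.

```python
def anomaly_days(amplitudes, k=2.0):
    """Indices of days whose night-time VLF amplitude deviates from the
    series mean by more than k population standard deviations."""
    n = len(amplitudes)
    mean = sum(amplitudes) / n
    std = (sum((a - mean) ** 2 for a in amplitudes) / n) ** 0.5
    return [i for i, a in enumerate(amplitudes) if abs(a - mean) > k * std]

# Nine quiet nights followed by one anomalous night (synthetic amplitudes).
nights = [10.0] * 9 + [13.0]
print(anomaly_days(nights))  # [9]
```

In practice such a detector would run on de-trended amplitudes per propagation path, so that anomalies a few days before an event can be compared across paths.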

  20. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries.

    PubMed

    Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon

    2016-01-01

This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.
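Two of the feature families mentioned, the Gutenberg-Richter relation and moving averages of event counts, can be sketched as follows. The b-value estimator below is Aki's maximum-likelihood form; the magnitudes and counts are toy data, not the Israeli catalog.

```python
import math

def b_value(mags, m_min):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value
    for magnitudes at or above the completeness magnitude m_min."""
    m = [x for x in mags if x >= m_min]
    mean = sum(m) / len(m)
    return math.log10(math.e) / (mean - m_min)

def moving_average(counts, window):
    """Trailing moving average of yearly event counts."""
    return [sum(counts[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(counts))]

mags = [2.0, 2.3, 2.1, 3.0, 2.5, 2.2, 4.1, 2.4]
print(round(b_value(mags, 2.0), 2))        # b-value of the toy sample
print(moving_average([3, 5, 4, 6, 7], 3))  # smoothed yearly counts
```

Features like these, computed per region and per year, feed the classifiers that decide whether next year's maximum magnitude will exceed the regional median.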

  1. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries

    PubMed Central

    2016-01-01

This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006–2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year. PMID:26812351

  2. In-situ fluid-pressure measurements for earthquake prediction: An example from a deep well at Hi Vista, California

    USGS Publications Warehouse

    Healy, J.H.; Urban, T.C.

    1985-01-01

Short-term earthquake prediction requires sensitive instruments for measuring the small anomalous changes in stress and strain that precede earthquakes. Instruments installed at or near the surface have proven too noisy for measuring anomalies of the size expected to occur, and it is now recognized that even to have the possibility of a reliable earthquake-prediction system will require instruments installed in drill holes at depths sufficient to reduce the background noise to a level below that of the expected premonitory signals. We are conducting experiments to determine the maximum signal-to-noise improvement that can be obtained in drill holes. In a 592 m well in the Mojave Desert near Hi Vista, California, we measured water-level changes with amplitudes greater than 10 cm, induced by earth tides. By removing the effects of barometric pressure and the stress related to earth tides, we have achieved a sensitivity to volumetric strain rates of 10^-9 to 10^-10 per day. Further improvement may be possible, and it appears that a successful earthquake-prediction capability may be achieved with an array of instruments installed in drill holes at depths of about 1 km, assuming that the premonitory strain signals are, in fact, present. © 1985 Birkhäuser Verlag.
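The barometric correction step can be illustrated with a least-squares removal of the pressure-correlated component of the water-level record. The data below are synthetic, and the actual processing also removes Earth-tide stresses, which this sketch omits.

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def remove_barometric(level, pressure):
    """Subtract the pressure-correlated part of a water-level record,
    leaving a residual in which smaller strain signals stand out."""
    slope, intercept = linfit(pressure, level)
    return [l - (slope * p + intercept) for l, p in zip(level, pressure)]

# Synthetic record: the level responds to pressure with a barometric
# efficiency of -0.5 cm/hPa, plus a small strain signal on the last sample.
pressure = [1000.0, 1002.0, 998.0, 1001.0, 999.0]
level = [-0.5 * p + 510.0 for p in pressure]
level[-1] += 0.2
residual = remove_barometric(level, pressure)  # strain signal survives
```

After the correction, the 0.2 cm strain signal dominates the residual even though it is far smaller than the tens-of-centimeters pressure-driven swings.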

  3. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation- versus quiescent-type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation-type triggering, while others are better characterized by quiescent-type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.
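An alert-based metric in the Molchan style reduces to a (tau, nu) point: the fraction of time covered by alarms versus the fraction of events missed; a forecast has skill when it sits below the tau + nu = 1 diagonal of a random strategy. The alarms and event times below are toy values, not VQ output.

```python
def molchan_point(alarm_times, event_times, total_time):
    """One point on a Molchan diagram: tau is the fraction of the study
    period covered by alarms, nu the fraction of events missed.
    alarm_times is a list of (start, end) intervals."""
    covered = sum(end - start for start, end in alarm_times)
    tau = covered / total_time
    hits = sum(any(s <= t <= e for s, e in alarm_times) for t in event_times)
    nu = 1.0 - hits / len(event_times)
    return tau, nu

# Alarms covering 20% of the period that catch 3 of 4 events beat a random
# strategy, whose expected miss fraction at tau = 0.2 would be 0.8.
tau, nu = molchan_point([(10, 20), (50, 60)], [12, 15, 55, 80], 100.0)
```

Sweeping the alarm threshold traces out a curve of such points, and the information gain relative to the diagonal is the quantity the abstract reports as significant.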

  4. Safety and survival in an earthquake

    USGS Publications Warehouse

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  5. State public policy issues involved with the Parkfield prediction experiment.

    USGS Publications Warehouse

    Andrews, R.; Goltz, J.

    1988-01-01

    The earthquake-prediction experiment at Parkfield may well be the most important such experiment currently underway worldwide. Its importance, however, extends beyond the scientific data that will be gathered and the question of whether those data can provide reliable prediction methods. Important public policy lessons are being learned (and are yet to be learned), and these lessons may be transferable to other parts of California and the nation. Indeed, the Parkfield experiment has captured the interest of numerous Californians, including State officials, emergency managers, the news media, and at least some of the public.

  6. Statistical short-term earthquake prediction.

    PubMed

    Kagan, Y Y; Knopoff, L

    1987-06-19

    A statistical procedure, derived from a theoretical model of fracture growth, is used to identify a foreshock sequence while it is in progress. As a predictor, the procedure reduces the average uncertainty in the rate of occurrence for a future strong earthquake by a factor of more than 1000 when compared with the Poisson rate of occurrence. About one-third of all main shocks with local magnitude greater than or equal to 4.0 in central California can be predicted in this way, starting from a 7-year database that has a lower magnitude cut off of 1.5. The time scale of such predictions is of the order of a few hours to a few days for foreshocks in the magnitude range from 2.0 to 5.0.

  7. Induced earthquake during the 2016 Kumamoto earthquake (Mw7.0): Importance of real-time shake monitoring for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Ogiso, M.

    2016-12-01

    Conventional Earthquake Early Warning (EEW) is based on rapid estimates of hypocenter location and magnitude. Because the goal of EEW is to predict ground shaking, however, we should focus more on monitoring the ground shaking itself. Experience with the induced earthquake during the 2016 Kumamoto sequence also indicates the importance of real-time monitoring of ground shaking for making EEW more rapid and precise.

  8. Gambling scores for earthquake predictions and forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, a forecaster who makes a prediction or forecast is assumed to have bet some of those reputation points. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains on success, and takes away the points bet by the forecaster on failure. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress-release model, or the ETAS model and the reference model is the Poisson model.
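The fair-payoff idea above can be sketched in a few lines. This is a minimal discrete-bin illustration of the gambling-score concept (the payoff rules in Zhuang (2010) are richer): for a 1-point bet on a bin where the reference model assigns occurrence probability p0, a fair payoff is (1 - p0)/p0 on success and -1 on failure, so the expected gain under the reference model is exactly zero. The probabilities and bets below are invented for the example.

```python
import numpy as np

def gambling_score(bets, p0, outcomes):
    """Total reputation gained.

    bets: points wagered per space-time bin; p0: reference-model probability
    that an event occurs in each bin; outcomes: 1 if an event occurred, else 0.
    Fair rule: a bet of b pays b*(1-p0)/p0 on success and -b on failure.
    """
    bets, p0, outcomes = map(np.asarray, (bets, p0, outcomes))
    gain = np.where(outcomes == 1, bets * (1.0 - p0) / p0, -bets)
    return gain.sum()

# A forecaster who bets only on genuinely higher-risk bins beats the house:
p0 = np.array([0.25, 0.25, 0.25, 0.25])     # reference model: 25% per bin
bets = np.array([1.0, 1.0, 0.0, 0.0])       # forecaster risks 2 of its points
outcomes = np.array([1, 0, 0, 0])           # one of the two predicted events occurs
score = gambling_score(bets, p0, outcomes)
print(score)   # gains 3 on the hit, loses 1 on the miss: net 2.0
```

Note the zero-expectation property: under the reference model a 1-point bet returns p0 * (1-p0)/p0 - (1-p0) = 0, which is what makes the score risk-compensating.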

  9. A Hybrid Ground-Motion Prediction Equation for Earthquakes in Western Alberta

    NASA Astrophysics Data System (ADS)

    Spriggs, N.; Yenier, E.; Law, A.; Moores, A. O.

    2015-12-01

    Estimation of ground-motion amplitudes that may be produced by future earthquakes constitutes the foundation of seismic hazard assessment and earthquake-resistant structural design. This is typically done by using a prediction equation that quantifies amplitudes as a function of key seismological variables such as magnitude, distance and site condition. In this study, we develop a hybrid empirical prediction equation for earthquakes in western Alberta, where evaluation of seismic hazard associated with induced seismicity is of particular interest. We use peak ground motions and response spectra from recorded seismic events to model the regional source and attenuation attributes. The available empirical data are limited in the magnitude range of engineering interest (M>4). Therefore, we combine empirical data with a simulation-based model in order to obtain seismologically informed predictions for moderate-to-large magnitude events. The methodology is two-fold. First, we investigate the shape of geometrical spreading in Alberta. We supplement the seismic data with ground motions obtained from mining/quarry blasts, in order to gain insights into the regional attenuation over a wide distance range. A comparison of ground-motion amplitudes for earthquakes and mining/quarry blasts shows that both event types decay at similar rates with distance and demonstrate a significant Moho-bounce effect. In the second stage, we calibrate the source and attenuation parameters of a simulation-based prediction equation to match the available amplitude data from seismic events. We model the geometrical spreading using a trilinear function with attenuation rates obtained from the first stage, and calculate coefficients of anelastic attenuation and site amplification via regression analysis. This provides a hybrid ground-motion prediction equation that is calibrated for observed motions in western Alberta and is applicable to moderate-to-large magnitude events.
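A trilinear geometrical-spreading term of the kind described above can be sketched as a piecewise function of log distance. The hinge distances and decay rates here are placeholders, not the calibrated Alberta values: decay as R^-1.3 out to 70 km, a flat "Moho-bounce" segment to 140 km, then R^-0.5 beyond, continuous at the hinges.

```python
import numpy as np

def trilinear_spreading(r, r1=70.0, r2=140.0, b1=-1.3, b2=0.0, b3=-0.5):
    """log10 geometric-spreading term at distance r (km), continuous at r1, r2.

    b1, b2, b3 are the log-log slopes of the three segments; the hinge
    distances r1, r2 and all rates are illustrative placeholders.
    """
    r = np.asarray(r, dtype=float)
    z1 = b1 * np.log10(np.minimum(r, r1))          # first segment (capped at r1)
    z2 = b2 * np.log10(np.clip(r, r1, r2) / r1)    # middle segment contribution
    z3 = b3 * np.log10(np.maximum(r, r2) / r2)     # far segment contribution
    return z1 + z2 + z3

# Amplitude decays steeply near the source, flattens in the bounce window,
# then resumes a slower decay at large distance:
g = trilinear_spreading(np.array([10.0, 70.0, 140.0, 300.0]))
print(np.round(g, 3))
```

In a calibration like the one described, the rates and hinges would be fit to the earthquake and blast amplitude data, with anelastic attenuation and site terms added via regression.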

  10. Prediction of Earthquakes by Lunar Cycles

    NASA Astrophysics Data System (ADS)

    Rodriguez, G.

    2007-05-01

    Author: Guillermo Rodriguez Rodriguez. Affiliation: geophysicist and astrophysicist, retired. I have presented this idea at many meetings of the EGS, UGS, and IUGG (1995), beginning in 1980, 1982, and 1983, and at AGU 2002 (Washington) and 2003 (Nice). The method has three levels of approximation in time. First, earthquakes happen on the same day of the year every 18 or 19 years (the Saros cycle), sometimes in the same place and sometimes somewhere very far away; at other points in the year the cycle can be 14, 26, or 32 years, or multiples of 18.61 years, especially 55, 93, 150, 224, and 300 years. This gives the day of the year. Second, within the cycle of one lunation (days counted from the date of the new moon), the great earthquakes happen at different intervals of days in successive lunations (approximately one month), as can be seen in the enclosed graphic. This gives the day of the month. Third, I have found that approximately every 28 days the same hour and minute, and the same longitude and latitude, repeat for all earthquakes, including the little ones. This is very important, because it would allow the simple precaution of waiting outside in streets or squares. The cycles can sometimes be longer or shorter; this is my particular way of applying the scientific method. As a consequence of the first and second principles, we can look for correlations between years separated by cycles of the first type, for example 1984 and 2002 or 2003 and consecutive years, including 2007. I have examined the dates for 30 years; I sense the pattern intuitively but have not yet been able to express it in a scientific formalism.

  11. New predictive equations for Arias intensity from crustal earthquakes in New Zealand

    NASA Astrophysics Data System (ADS)

    Stafford, Peter J.; Berrill, John B.; Pettinga, Jarg R.

    2009-01-01

    Arias Intensity (Arias, MIT Press, Cambridge MA, pp 438-483, 1970) is an important measure of the strength of a ground motion, as it is able to simultaneously reflect multiple characteristics of the motion in question. Recently, the effectiveness of Arias Intensity as a predictor of the likelihood of damage to short-period structures has been demonstrated, reinforcing the utility of Arias Intensity for use in both structural and geotechnical applications. In light of this utility, Arias Intensity has begun to be considered as a ground-motion measure suitable for use in probabilistic seismic hazard analysis (PSHA) and earthquake loss estimation. It is therefore timely to develop predictive equations for this ground-motion measure. In this study, a suite of four predictive equations, each using a different functional form, is derived for the prediction of Arias Intensity from crustal earthquakes in New Zealand. A suite of models is provided to allow epistemic uncertainty to be considered within a PSHA framework. Coefficients are presented for four different horizontal-component definitions for each of the four models. The ground-motion dataset from which the equations are derived includes records from New Zealand crustal earthquakes as well as near-field records from worldwide crustal earthquakes. The predictive equations may be used to estimate Arias Intensity for moment magnitudes between 5.1 and 7.5 and for distances (both rjb and rrup) up to 300 km.
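The quantity being predicted above has a simple definition (Arias, 1970): Ia = (π/2g) times the time integral of the squared ground acceleration. A sketch of computing it from a digitized accelerogram follows; the record here is synthetic, and the integral is approximated by a simple rectangle rule.

```python
import numpy as np

def arias_intensity(acc, dt, g=9.81):
    """Arias intensity (m/s) from an acceleration series acc (m/s^2), step dt (s).

    Ia = pi/(2g) * integral(a(t)^2 dt), approximated here by a rectangle rule.
    """
    return np.pi / (2.0 * g) * np.sum(acc**2) * dt

dt = 0.01                                   # 100 samples per second
t = np.arange(0.0, 20.0, dt)
# Synthetic accelerogram: decaying modulated oscillation standing in for a record
acc = 2.0 * np.exp(-0.3 * t) * np.sin(2 * np.pi * 2.0 * t)

ia = arias_intensity(acc, dt)
print(round(ia, 4))
```

A predictive equation of the kind derived in the study estimates this quantity directly from magnitude, distance, and site terms, without needing the accelerogram.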

  12. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments

    USGS Publications Warehouse

    Chang, Jefferson C.; Lockner, David A.; Reches, Z.

    2012-01-01

    After nucleation, a large earthquake propagates as an expanding rupture front along a fault. This front activates countless fault patches that slip by consuming energy stored in Earth’s crust. We simulated the slip of a fault patch by rapidly loading an experimental fault with energy stored in a spinning flywheel. The spontaneous evolution of strength, acceleration, and velocity indicates that our experiments are proxies of fault-patch behavior during earthquakes of moment magnitude (Mw) = 4 to 8. We show that seismically determined earthquake parameters (e.g., displacement, velocity, magnitude, or fracture energy) can be used to estimate the intensity of the energy release during an earthquake. Our experiments further indicate that high acceleration imposed by the earthquake’s rupture front quickens dynamic weakening by intense wear of the fault zone.

  13. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    NASA Astrophysics Data System (ADS)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work made in the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced to the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be consistent with all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained in the major international stress measurement work of the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan. 
Based on this broad agreement, a solid geomechanical

  14. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  15. Predicting Posttraumatic Stress Symptom Prevalence and Local Distribution after an Earthquake with Scarce Data.

    PubMed

    Dussaillant, Francisca; Apablaza, Mauricio

    2017-08-01

    After a major earthquake, the assignment of scarce mental health emergency personnel to different geographic areas is crucial to the effective management of the crisis. The scarce information that is available in the aftermath of a disaster may be valuable in helping predict where the populations most in need are located. The objectives of this study were to derive algorithms to predict posttraumatic stress (PTS) symptom prevalence and local distribution after an earthquake and to test whether there are algorithms that require few input data and are still reasonably predictive. A rich database of PTS symptoms, collected after Chile's 2010 earthquake and tsunami, was used. Several model specifications for the mean and centiles of the distribution of PTS symptoms, together with posttraumatic stress disorder (PTSD) prevalence, were estimated via linear and quantile regressions. The models varied in the set of covariates included. Adjusted R2 values for the most liberal specifications (in terms of the number of covariates included) ranged from 0.62 to 0.74, depending on the outcome. When including only peak ground acceleration (PGA), poverty rate, and household damage in linear and quadratic form, predictive capacity was still good (adjusted R2 values from 0.59 to 0.67 were obtained). Information about local poverty, household damage, and PGA can therefore be used as an aid to predicting PTS symptom prevalence and local distribution after an earthquake, which can help improve the assignment of mental health personnel to the affected localities. Dussaillant F, Apablaza M. Predicting posttraumatic stress symptom prevalence and local distribution after an earthquake with scarce data. Prehosp Disaster Med. 2017;32(4):357-367.
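The reduced specification described above, only PGA, poverty rate, and household damage in linear and quadratic form, with fit measured by adjusted R², can be sketched as an ordinary least-squares regression. The data below are simulated with a hypothetical relationship, not the Chilean survey data used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
pga = rng.uniform(0.1, 0.8, n)          # peak ground acceleration (g)
poverty = rng.uniform(0.0, 0.4, n)      # local poverty rate
damage = rng.uniform(0.0, 1.0, n)       # household damage fraction

# Hypothetical true relationship for the PTS-symptom outcome, plus noise
y = 0.2 + 0.5 * pga + 0.8 * poverty + 0.6 * damage - 0.3 * damage**2 \
    + rng.normal(0.0, 0.05, n)

# Design matrix: intercept, linear terms, and the quadratic damage term
X = np.column_stack([np.ones(n), pga, poverty, damage, damage**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Adjusted R^2 penalizes the fit for the number of estimated coefficients
ss_res = np.sum(resid**2)
ss_tot = np.sum((y - y.mean())**2)
r2 = 1.0 - ss_res / ss_tot
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - X.shape[1])
print(round(adj_r2, 3))
```

The study's centile models would replace the least-squares fit with quantile regression on the same covariates.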

  16. Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at evaluating earthquake prediction and forecast models quantitatively through purely prospective and reproducible experiments). The OEF system uses the two most popular short-term models: the Epidemic-Type Aftershock Sequence (ETAS) model and the Short-Term Earthquake Probabilities (STEP) model. Here, we report results from the OEF 24-hour earthquake forecasts during the main phases of the 2016-2017 sequence that occurred in the Central Apennines (Italy).

  17. Foreshock sequences and short-term earthquake predictability on East Pacific Rise transform faults.

    PubMed

    McGuire, Jeffrey J; Boettcher, Margaret S; Jordan, Thomas H

    2005-03-24

    East Pacific Rise transform faults are characterized by high slip rates (more than ten centimetres a year), predominantly aseismic slip and maximum earthquake magnitudes of about 6.5. Using recordings from a hydroacoustic array deployed by the National Oceanic and Atmospheric Administration, we show here that East Pacific Rise transform faults also have a low number of aftershocks and high foreshock rates compared to continental strike-slip faults. The high ratio of foreshocks to aftershocks implies that such transform-fault seismicity cannot be explained by seismic triggering models in which there is no fundamental distinction between foreshocks, mainshocks and aftershocks. The foreshock sequences on East Pacific Rise transform faults can be used to predict (retrospectively) earthquakes of magnitude 5.4 or greater, in narrow spatial and temporal windows and with a high probability gain. The predictability of such transform earthquakes is consistent with a model in which slow slip transients trigger earthquakes, enrich their low-frequency radiation and accommodate much of the aseismic plate motion.

  18. Study on China’s Earthquake Prediction by Mathematical Analysis and its Application in Catastrophe Insurance

    NASA Astrophysics Data System (ADS)

    Jianjun, X.; Bingjie, Y.; Rongji, W.

    2018-03-01

    The purpose of this paper is to improve the level of catastrophe insurance. First, earthquake predictions were carried out using a mathematical analysis method. Second, the policies and models of foreign catastrophe insurance were compared. Third, suggestions on catastrophe insurance for China were discussed. Further study should pay more attention to earthquake prediction by introducing big data.

  19. Application of a time-magnitude prediction model for earthquakes

    NASA Astrophysics Data System (ADS)

    An, Weiping; Jin, Xueshen; Yang, Jialiang; Dong, Peng; Zhao, Jun; Zhang, He

    2007-06-01

    In this paper we discuss the physical meaning of the magnitude-time model parameters for earthquake prediction. The gestation process of strong earthquakes in all eleven seismic zones in China can be described by the magnitude-time prediction model using computed values of the model parameters. The average parameter values for China are: b = 0.383, c = 0.154, d = 0.035, B = 0.844, C = -0.209, and D = 0.188. The robustness of the model parameters is estimated from the variation in the minimum magnitude of the transformed data, the spatial extent, and the temporal period. Analysis of the spatial and temporal suitability of the model indicates that the computation unit should be at least 4° × 4° for seismic zones in North China, at least 3° × 3° in Southwest and Northwest China, and that the time period should be as long as possible.

  20. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments

    NASA Astrophysics Data System (ADS)

    Chang, J. C.; Lockner, D. A.; Reches, Z.

    2012-12-01

    We simulated the slip of a fault-patch during a large earthquake by rapidly loading an experimental, ring-shaped fault with energy stored in a spinning flywheel. The flywheel abruptly delivers a finite amount of energy to the fault-patch, which spontaneously dissipates it without operator intervention. We conducted 42 experiments on Sierra White granite (SWG) samples and 24 experiments on Kasota dolomite (KD) samples. Each experiment starts by spinning a 225 kg disk-shaped flywheel to a prescribed angular velocity. We refer to this experiment as an "earthquake-like slip-event" (ELSE). The strength evolution in ELSE experiments is similar to the strength evolution proposed for earthquake models and observed in stick-slip experiments. Further, we found that ELSE experiments resemble earthquakes in at least three ways: (1) slip driven by the release of a finite amount of stored energy; (2) the pattern of fault-strength evolution; and (3) seismically observed values, such as average slip, peak velocity, and rise time. By assuming that the measured slip, D, in ELSE experiments is equivalent to the average slip during an earthquake, we found that ELSE experiments (D = 0.003-4.6 m) correspond to earthquakes in the moment-magnitude range Mw = 4-8. In ELSE experiments, the critical slip distance, dc, has mean values of 2.7 cm and 1.2 cm for SWG and KD, much shorter than the 1-10 m found in classical steady-state experiments in rotary shear systems. We attribute these dc values to the ELSE loading, in which the fault-patch is abruptly loaded by impact with a spinning flywheel. Under this loading, the friction-velocity relations are strikingly different from those under steady-state loading on the same rock samples in the same shear system (Reches and Lockner, Nature, 2010). We further note that the slip acceleration in ELSE experiments evolves systematically with fault strength and wear rate, and that the dynamic weakening is restricted to the period of intense

  1. Acoustic Emission Detected by Matched Filter Technique in Laboratory Earthquake Experiment

    NASA Astrophysics Data System (ADS)

    Wang, B.; Hou, J.; Xie, F.; Ren, Y.

    2017-12-01

    Acoustic emission (AE) monitoring in laboratory earthquake experiments is a fundamental means of studying earthquake mechanics, for instance to characterize the aseismic, nucleation, and post-seismic phases in stick-slip experiments. In contrast to field earthquakes, AEs are generally recorded only when they exceed a trigger threshold, so some weak signals may be missed. Here we conducted an experiment on a 1.1 m × 1.1 m granite sample with a 1.5 m fault; 13 receivers, all sampling at 3 MHz, were placed on the surface. We recorded continuously and adopted a matched-filter technique to detect low-SNR signals. Because there are many signals around each stick-slip event, manual picking of P arrivals would be time-consuming. We therefore combined the short-term-average to long-term-average ratio (STA/LTA) technique with the autoregressive Akaike information criterion (AR-AIC) technique to pick arrivals automatically, and found that most of the picked P arrivals are accurate enough for locating the signals. We will next locate the signals and apply the matched-filter technique to detect further low-SNR events, and examine what emerges in the laboratory earthquake experiment. Detailed and updated results will be presented at the meeting.
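The STA/LTA triggering step described above can be sketched in a few lines of numpy; the AR-AIC refinement and matched-filter stage are omitted here. This is a minimal illustration on a synthetic waveform (noise followed by a higher-amplitude "arrival"), not the study's processing chain, and the window lengths and threshold are arbitrary.

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Classic STA/LTA ratio on the squared trace, via cumulative sums.

    At sample i, the LTA averages energy over [i - nlta, i) and the STA
    over [i, i + nsta); a ratio spike marks a candidate onset.
    """
    energy = trace.astype(float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta          # short-term averages
    lta = (csum[nlta:] - csum[:-nlta]) / nlta          # long-term averages
    n = len(trace) - nlta - nsta
    ratio = sta[nlta : nlta + n] / lta[:n]
    return ratio  # ratio[j] corresponds to trace sample j + nlta

rng = np.random.default_rng(2)
noise = rng.normal(0.0, 1.0, 2000)
signal = rng.normal(0.0, 8.0, 500)       # emulated acoustic-emission burst
trace = np.concatenate([noise, signal, noise[:500]])

ratio = sta_lta(trace, nsta=20, nlta=200)
onset = np.argmax(ratio > 5.0) + 200     # first trigger, in trace samples
print(onset)                             # near the true onset at sample 2000
```

In practice the trigger only brackets the arrival; an AR-AIC picker is then run in a short window around it to refine the pick, and located events serve as templates for matched-filter detection.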

  2. Feasibility study of short-term earthquake prediction using ionospheric anomalies immediately before large earthquakes

    NASA Astrophysics Data System (ADS)

    Heki, K.; He, L.

    2017-12-01

    We showed that positive and negative electron density anomalies emerge above the fault immediately before it ruptures, 40/20/10 minutes before Mw 9/8/7 earthquakes (Heki, 2011 GRL; Heki and Enomoto, 2013 JGR; He and Heki, 2017 JGR). These signals are stronger for earthquakes with larger Mw and under higher background vertical TEC (total electron content) (Heki and Enomoto, 2015 JGR). The epicenter and the positive and negative anomalies align along the local geomagnetic field (He and Heki, 2016 GRL), suggesting that electric fields within the ionosphere are responsible for creating the anomalies (Kuo et al., 2014 JGR; Kelley et al., 2017 JGR). Here we consider the next Nankai Trough earthquake, which may occur within a few tens of years in Southwest Japan, and discuss whether we could recognize its preseismic signatures in TEC by real-time observations with GNSS. During high geomagnetic activity, large-scale traveling ionospheric disturbances (LSTIDs) often propagate from the auroral ovals toward mid-latitude regions and leave signatures similar to preseismic anomalies. This is a main obstacle to using preseismic TEC changes for practical short-term earthquake prediction. In this presentation, we show that the same anomalies appeared 40 minutes before the mainshock above northern Australia, the geomagnetically conjugate point of the 2011 Tohoku-oki earthquake epicenter. This not only demonstrates that electric fields play a role in creating the preseismic TEC anomalies, but also offers a possibility to discriminate preseismic anomalies from those caused by LSTIDs. By monitoring TEC in the conjugate areas of the two hemispheres, we can recognize anomalies with simultaneous onsets as those caused by within-ionosphere electric fields (e.g., preseismic anomalies, night-time MSTIDs) and anomalies without simultaneous onsets as gravity-wave disturbances (e.g., LSTIDs, daytime MSTIDs).

  3. Real time numerical shake prediction incorporating attenuation structure: a case for the 2016 Kumamoto Earthquake

    NASA Astrophysics Data System (ADS)

    Ogiso, M.; Hoshiba, M.; Shito, A.; Matsumoto, S.

    2016-12-01

    Needless to say, heterogeneous attenuation structure is important for ground motion prediction, including earthquake early warning, that is, real-time ground motion prediction. Hoshiba and Ogiso (2015, AGU Fall Meeting) showed that accounting for heterogeneous attenuation and scattering structure leads to earlier and more accurate ground motion prediction in the numerical shake prediction scheme proposed by Hoshiba and Aoki (2015, BSSA). Hoshiba and Ogiso (2015) used an assumed heterogeneous structure; here we discuss its effect in the case of the 2016 Kumamoto Earthquake, using a heterogeneous structure estimated from actual observation data. We conducted Multiple Lapse Time Window Analysis (Hoshiba, 1993, JGR) on seismic stations located in the western part of Japan to estimate the heterogeneous attenuation and scattering structure. The characteristics are similar to those in the previous work of Carcole and Sato (2010, GJI), e.g., strong intrinsic and scattering attenuation around the volcanoes in the central part of Kyushu and relatively weak heterogeneity elsewhere. A real-time ground motion prediction simulation for the 2016 Kumamoto Earthquake was conducted using the numerical shake prediction scheme with 474 strong ground motion stations. Comparing snapshots of the predicted and observed wavefields shows a tendency toward underprediction around the volcanic area in spite of the heterogeneous structure. These facts indicate the necessity of improving the heterogeneous structure used in the numerical shake prediction scheme. In this study, we used the waveforms of Hi-net, K-NET, and KiK-net stations operated by NIED for estimating structure and conducting the ground motion prediction simulation. Part of this study was supported by the Earthquake Research Institute, the University of Tokyo, cooperative research program and JSPS KAKENHI Grant Number 25282114.

  4. Predicting the spatial extent of liquefaction from geospatial and earthquake specific parameters

    USGS Publications Warehouse

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.

    2014-01-01

    The spatially extensive damage from the 2010-2011 Christchurch, New Zealand earthquake events is a reminder of the need for liquefaction hazard maps for anticipating damage from future earthquakes. Liquefaction hazard mapping has traditionally relied on detailed geologic mapping and expensive site studies. These traditional techniques are difficult to apply globally for rapid response or loss estimation. We have developed a logistic regression model to predict the probability of liquefaction occurrence in coastal sedimentary areas as a function of simple and globally available geospatial features (e.g., derived from digital elevation models) and standard earthquake-specific intensity data (e.g., peak ground acceleration). Some of the geospatial explanatory variables that we consider are taken from the hydrology community, which has a long tradition of using remotely sensed data as proxies for subsurface parameters. As a result of using high-resolution, remotely sensed, and spatially continuous data as proxies for important subsurface parameters such as soil density and soil saturation, and by using a probabilistic modeling framework, our liquefaction model inherently includes the natural spatial variability of liquefaction occurrence and provides an estimate of the spatial extent of liquefaction for a given earthquake. To provide a quantitative check on how the predicted probabilities relate to the spatial extent of liquefaction, we report the frequency of observed liquefaction features within a range of predicted probabilities, where the percentage of liquefaction is the areal extent of observed liquefaction within a given probability contour. The regional model results show a strong relationship between the predicted probability and the observed percentage of liquefaction. Visual inspection of the probability contours for each event also indicates that the pattern of liquefaction is well represented by the model.
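The modeling approach described above, a logistic regression mapping geospatial proxies plus shaking intensity to a liquefaction probability, can be sketched as follows. The features, coefficients, and data here are simulated placeholders (standardized stand-ins for a wetness proxy and PGA), not the paper's fitted model; the fit uses plain gradient descent.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression; returns weights (bias first)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)       # gradient of mean log-loss
    return w

rng = np.random.default_rng(3)
n = 1000
cti = rng.normal(0.0, 1.0, n)     # wetness/topographic proxy (standardized)
pga = rng.normal(0.0, 1.0, n)     # peak ground acceleration (standardized)
# Simulated "observed" liquefaction occurrence from a hypothetical true model
logit = -1.0 + 1.5 * cti + 2.0 * pga
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

w = fit_logistic(np.column_stack([cti, pga]), y)
# Predicted probability of liquefaction for a wet, strongly shaken site:
p_wet = 1.0 / (1.0 + np.exp(-(w[0] + w[1] * 1.0 + w[2] * 1.0)))
print(np.round(w, 2), round(p_wet, 2))
```

Mapping these probabilities over a grid of geospatial features is what yields the spatial-extent estimates the abstract describes.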

  5. Prospectively Evaluating the Collaboratory for the Study of Earthquake Predictability: An Evaluation of the UCERF2 and Updated Five-Year RELM Forecasts

    NASA Astrophysics Data System (ADS)

    Strader, Anne; Schneider, Max; Schorlemmer, Danijel; Liukis, Maria

    2016-04-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) was developed to rigorously test earthquake forecasts retrospectively and prospectively through reproducible, completely transparent experiments within a controlled environment (Zechar et al., 2010). During 2006-2011, thirteen five-year time-invariant prospective earthquake mainshock forecasts developed by the Regional Earthquake Likelihood Models (RELM) working group were evaluated through the CSEP testing center (Schorlemmer and Gerstenberger, 2007). The number, spatial, and magnitude components of the forecasts were compared to the respective observed seismicity components using a set of consistency tests (Schorlemmer et al., 2007, Zechar et al., 2010). In the initial experiment, all but three forecast models passed every test at the 95% significance level, with all forecasts displaying consistent log-likelihoods (L-test) and magnitude distributions (M-test) with the observed seismicity. In the ten-year RELM experiment update, we reevaluate these earthquake forecasts over an eight-year period from 2008-2016, to determine the consistency of previous likelihood testing results over longer time intervals. Additionally, we test the Uniform California Earthquake Rupture Forecast (UCERF2), developed by the U.S. Geological Survey (USGS), and the earthquake rate model developed by the California Geological Survey (CGS) and the USGS for the National Seismic Hazard Mapping Program (NSHMP) against the RELM forecasts. Both the UCERF2 and NSHMP forecasts pass all consistency tests, though the Helmstetter et al. (2007) and Shen et al. (2007) models exhibit greater information gain per earthquake according to the T- and W- tests (Rhoades et al., 2011). Though all but three RELM forecasts pass the spatial likelihood test (S-test), multiple forecasts fail the M-test due to overprediction of the number of earthquakes during the target period. 
Though there is no significant difference between the UCERF2 and NSHMP
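
The consistency tests named in this abstract compare forecast components against observations. As a minimal illustration (not CSEP testing-center code), the number (N-) test checks the observed event count against the forecast rate under a Poisson assumption, in the spirit of Zechar et al. (2010); the quantile thresholds and example numbers below are illustrative.

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), summed directly (fine for small k)."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def n_test(n_forecast, n_observed):
    """Two quantile scores of a Poisson number test:
    delta1 = P(X >= n_obs) and delta2 = P(X <= n_obs).
    A forecast is typically rejected at 95% confidence if either < 0.025."""
    delta1 = 1.0 - poisson_cdf(n_observed - 1, n_forecast)
    delta2 = poisson_cdf(n_observed, n_forecast)
    return delta1, delta2

# A forecast of 10 events with only 4 observed sits near the rejection edge
d1, d2 = n_test(n_forecast=10.0, n_observed=4)
print(f"delta1={d1:.3f} delta2={d2:.3f}")
```

Overprediction of the number of earthquakes, as noted above for several RELM forecasts, shows up as a small delta2.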

  6. Earthquake prediction using extinct monogenetic volcanoes: A possible new research strategy

    NASA Astrophysics Data System (ADS)

    Szakács, Alexandru

    2011-04-01

    Volcanoes are extremely effective transmitters of matter, energy and information from the deep Earth towards its surface. Their capacity as information carriers is far from being fully exploited. Volcanic conduits can generally be viewed as rod-like or sheet-like vertical features of relatively homogeneous composition and structure, crosscutting geological structures of far greater complexity and compositional heterogeneity. Information-carrying signals, such as earthquake precursor signals originating deep below the Earth's surface, are transmitted with much less loss of information through homogeneous, vertically extended structures than through the horizontally segmented, heterogeneous lithosphere or crust. Volcanic conduits can thus be viewed as upside-down "antennas" or waveguides offering privileged pathways for any possible earthquake precursor signal. In particular, conduits of monogenetic volcanoes are promising transmitters of deep-Earth information to be received and decoded at surface monitoring stations, because of the expectedly more homogeneous nature of their rock fill compared with polygenetic volcanoes. Among monogenetic volcanoes, those with dominantly effusive activity appear to be the best candidates for privileged earthquake-monitoring sites. In more detail, effusive monogenetic conduits filled with rocks of primitive parental-magma composition, indicating direct ascent from sub-lithospheric magma-generating areas, are the most suitable. Further selection criteria may include the age of the volcanism and the presence of mantle xenoliths in surface volcanic products, indicating a direct and straightforward link through the conduit between the deep lithospheric mantle and the surface. 
Innovative earthquake prediction research strategies can be developed on these grounds by considering conduits of selected extinct monogenetic volcanoes and deep trans-crustal fractures as privileged emplacement sites for seismic monitoring stations.

  7. On a report that the 2012 M 6.0 earthquake in Italy was predicted after seeing an unusual cloud formation

    USGS Publications Warehouse

    Thomas, J.N.; Masci, F; Love, Jeffrey J.

    2015-01-01

    Several recently published reports have suggested that semi-stationary linear-cloud formations might be causally precursory to earthquakes. We examine the report of Guangmeng and Jie (2013), who claim to have predicted the 2012 M 6.0 earthquake in the Po Valley of northern Italy after seeing a satellite photograph (a digital image) showing a linear-cloud formation over the eastern Apennine Mountains of central Italy. From inspection of 4 years of satellite images we find numerous examples of linear-cloud formations over Italy. A simple test shows no obvious statistical relationship between the occurrence of these cloud formations and earthquakes that occurred in and around Italy. All of the linear-cloud formations we have identified in satellite images, including that which Guangmeng and Jie (2013) claim to have used to predict the 2012 earthquake, appear to be orographic – formed by the interaction of moisture-laden wind flowing over mountains. Guangmeng and Jie (2013) have not clearly stated how linear-cloud formations can be used to predict the size, location, and time of an earthquake, and they have not published an account of all of their predictions (including any unsuccessful predictions). We are skeptical of the validity of the claim by Guangmeng and Jie (2013) that they have managed to predict any earthquakes.

  8. Relating stick-slip friction experiments to earthquake source parameters

    USGS Publications Warehouse

    McGarr, Arthur F.

    2012-01-01

    Analytical results for parameters such as static stress drop in stick-slip friction experiments with arbitrary input parameters can be determined by solving an energy-balance equation. These results can then be related to a given earthquake based on its seismic moment and the maximum slip within its rupture zone, assuming that the rupture process entails the same physics as stick-slip friction. This analysis yields overshoots and ratios of apparent stress to static stress drop of about 0.25. The inferred earthquake source parameters static stress drop, apparent stress, slip rate, and radiated energy are robust inasmuch as they are largely independent of the experimental parameters used in their estimation. Instead, these earthquake parameters depend on C, the ratio of maximum slip to the cube root of the seismic moment. C is controlled by the normal stress applied to the rupture plane and the difference between the static and dynamic coefficients of friction. Estimating yield stress and seismic efficiency using the same procedure is only possible when the actual static and dynamic coefficients of friction are known within the earthquake rupture zone.
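
The controlling parameter C described above can be written down directly; the event values below are hypothetical, chosen only to illustrate the units and magnitude of the ratio.

```python
def slip_moment_ratio(max_slip_m, seismic_moment_nm):
    """C = maximum slip / (seismic moment)^(1/3), the parameter the abstract
    identifies as controlling the inferred source parameters.
    Units: m / (N*m)^(1/3)."""
    return max_slip_m / seismic_moment_nm ** (1.0 / 3.0)

# Hypothetical event: M0 = 1e18 N*m (roughly Mw 6), maximum slip 1.2 m
C = slip_moment_ratio(1.2, 1e18)
print(f"C = {C:.2e}")
```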

  9. Empirical prediction for travel distance of channelized rock avalanches in the Wenchuan earthquake area

    NASA Astrophysics Data System (ADS)

    Zhan, Weiwei; Fan, Xuanmei; Huang, Runqiu; Pei, Xiangjun; Xu, Qiang; Li, Weile

    2017-06-01

    Rock avalanches are extremely rapid, massive, flow-like movements of fragmented rock. The travel path of a rock avalanche may in some cases be confined by channels; these are referred to as channelized rock avalanches. Channelized rock avalanches are potentially dangerous because their travel distance is difficult to predict. In this study, we constructed a dataset with detailed characteristic parameters of 38 channelized rock avalanches triggered by the 2008 Wenchuan earthquake, using visual interpretation of remote sensing imagery, field investigation and literature review. Based on this dataset, we assessed the influence of different factors on the runout distance and developed prediction models for channelized rock avalanches using the multivariate regression method. The results suggest that the movement of channelized rock avalanches is dominated by the landslide volume, total relief and channel gradient. The performance of both models was then tested against an independent validation dataset of eight rock avalanches induced by the 2008 Wenchuan earthquake, the Ms 7.0 Lushan earthquake and heavy rainfall in 2013, showing acceptably good prediction results. Therefore, the travel-distance prediction models for channelized rock avalanches constructed in this study are applicable and reliable for predicting the runout of similar rock avalanches in other regions.
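
A multivariate regression of the kind described above is commonly set up as a power law linearised in log space. The sketch below fits runout distance against volume, total relief and channel gradient; the inventory values and fitted coefficients are entirely hypothetical, not the paper's dataset.

```python
import numpy as np

# Hypothetical inventory: volume V (1e6 m^3), total relief H (m),
# channel gradient G (degrees), observed runout distance L (m)
V = np.array([1.2, 5.0, 0.8, 10.0, 3.3, 20.0])
H = np.array([600.0, 900.0, 450.0, 1200.0, 800.0, 1500.0])
G = np.array([18.0, 25.0, 15.0, 30.0, 22.0, 28.0])
L = np.array([1500.0, 3200.0, 1100.0, 5400.0, 2600.0, 7800.0])

# Power-law model L = k * V^a * H^b * G^c, linearised by taking logarithms
X = np.column_stack([np.ones_like(V), np.log(V), np.log(H), np.log(G)])
coef, *_ = np.linalg.lstsq(X, np.log(L), rcond=None)
L_pred = np.exp(X @ coef)
print(np.round(coef, 3))
print(np.round(L_pred, 0))
```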

  10. Shaking table test and dynamic response prediction on an earthquake-damaged RC building

    NASA Astrophysics Data System (ADS)

    Xianguo, Ye; Jiaru, Qian; Kangning, Li

    2004-12-01

    This paper presents the results of shaking table tests of a one-tenth-scale reinforced concrete (RC) building model. The test model represents a prototype building that was seriously damaged during the 1985 Mexico earthquake. The input ground excitation used during the tests was taken from records obtained near the site of the prototype building during the 1985 and 1995 Mexico earthquakes. The tests showed that the damage pattern of the test model agreed well with that of the prototype building. Analytical prediction of the earthquake response was conducted for the prototype building using a sophisticated 3-D frame model. The input motion used for the dynamic analysis was the shaking table test measurements with similarity transformation. Comparison of the analytical and shaking table test results indicates that the response of the RC building to minor and moderate earthquakes can be predicted well; however, there are differences between the prediction and the actual response to the major earthquake.

  11. Understanding the disaster experience of older adults by gender: the experience of survivors of the 2007 earthquake in Peru.

    PubMed

    Shenk, Dena; Mahon, Joan; Kalaw, Karel J; Ramos, Blanca; Tufan, Ismail

    2010-11-01

    We examine the experiences of older adult survivors of the August 2007 "Southern earthquake" in Peru within the cultural context of gender roles and family relationships. The data include 24 semistructured videotaped interviews conducted in Pisco in December 2007 with survivors of the earthquake aged 60-90. The responses, experiences, and adjustments of the older adult disaster survivors will be discussed in terms of their family and social support systems and gender roles. These older adults sustain their personal identities and deal with their health concerns in the aftermath of the earthquake in the context of these cultural systems of support.

  12. An original approach to fill the gap in the earthquake disaster experience - a proposal for 'the archive of the quake experience' -

    NASA Astrophysics Data System (ADS)

    Tanaka, Y.; Hirayama, Y.; Kuroda, S.; Yoshida, M.

    2015-12-01

    People without severe disaster experience inevitably forget even an extraordinary event like 3.11 as time passes. Therefore, to build a more resilient society, an ingenious attempt to keep people's memory of disaster from fading is necessary. Since 2011, we have been carrying out earthquake disaster drills for residents of high-rise apartments, for schoolchildren, for citizens of coastal areas, etc. Using a portable earthquake simulator (1), the drill consists of three parts: first, a short lecture explaining the characteristic earthquakes Japan is expected to experience in the future; second, a reliving experience of major earthquakes that have hit Japan since 1995; and third, a short lecture on preparations that can be made at home and/or in an office. For the quake experience, although the movement is two dimensional, real earthquake observation records are used to control the simulator so that participants can relive different kinds of earthquakes, including the long-period motion of skyscrapers. Feedback on the drill is always positive, because participants understand that reliving the quake experience, together with proper lectures, is one of the best methods to communicate past disasters to their families and to hand them down to the next generation. There are several kinds of disaster archive serving as inheritance, such as pictures, movies, documents, and interviews. In addition to these, we propose constructing 'the archive of the quake experience', which compiles observed data ready to be relived with the simulator. We would like to show some movies of our quake drill in the presentation. Reference: (1) Kuroda, S. et al. (2012), "Development of portable earthquake simulator for enlightenment of disaster preparedness", 15th World Conference on Earthquake Engineering 2012, Vol. 12, 9412-9420.

  13. Earthquake Knowledge and Experiences of Introductory Geology Students.

    ERIC Educational Resources Information Center

    Barrow, Lloyd; Haskins, Sandra

    1996-01-01

    Explores introductory geology students' (n=186) understanding of earthquakes. Results indicate that the mass media seem to provide students greater details about the cause and impact than the actual experience itself, students lack a broad understanding about the theory of plate tectonics, and introductory geology students have extensive…

  14. Real-time 3-D space numerical shake prediction for earthquake early warning

    NASA Astrophysics Data System (ADS)

    Wang, Tianyun; Jin, Xing; Huang, Yandan; Wei, Yongxiang

    2017-12-01

    In earthquake early warning systems, real-time shake prediction through wave propagation simulation is a promising approach. Compared with traditional methods, it does not suffer from inaccurate estimation of source parameters. For computational efficiency, these methods assume that waves propagate on the 2-D surface of the earth. In fact, since seismic waves propagate in the 3-D sphere of the earth, 2-D modeling of wave propagation results in inaccurate wave estimation. In this paper, we propose a 3-D space numerical shake prediction method, which simulates wave propagation in 3-D space using radiative transfer theory and incorporates a data assimilation technique to estimate the distribution of wave energy. The 2011 Tohoku earthquake is studied as an example to show the validity of the proposed model. The 2-D and 3-D space models are compared in this article, and the prediction results show that numerical shake prediction based on the 3-D space model can estimate real-time ground motion precisely, and that overprediction is alleviated when the 3-D space model is used.

  15. Current affairs in earthquake prediction in Japan

    NASA Astrophysics Data System (ADS)

    Uyeda, Seiya

    2015-12-01

    As of mid-2014, the main organizations of the earthquake (EQ hereafter) prediction program, including the Seismological Society of Japan (SSJ) and the MEXT Headquarters for EQ Research Promotion, hold the official position that they neither can nor want to make any short-term prediction. This is an extraordinary stance for responsible authorities when the nation, after the devastating 2011 M9 Tohoku EQ, most urgently needs whatever information may exist on forthcoming EQs. Japan's national project for EQ prediction started in 1965 but has had no success, mainly because of its failure to capture precursors. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this stance was further fortified by the 2011 M9 Tohoku mega-quake. This paper tries to explain how this situation came about and suggests that it may in fact be a legitimate one which should have come a long time ago. Actually, substantial positive changes are taking place now. Some promising signs are arising even from cooperation between researchers and the private sector, and there is a move to establish an "EQ Prediction Society of Japan". From now on, maintaining high scientific standards in EQ prediction will be of crucial importance.

  16. Earthquake Energy Dissipation in Light of High-Velocity, Slip-Pulse Shear Experiments

    NASA Astrophysics Data System (ADS)

    Reches, Z.; Liao, Z.; Chang, J. C.

    2014-12-01

    We investigated the energy dissipation during earthquakes by analyzing high-velocity shear experiments conducted on room-dry, solid samples of granite, tonalite, and dolomite sheared at slip velocities of 0.0006-1 m/s and normal stresses of 1-11.5 MPa. The experimental faults were loaded in one of three modes: (1) slip-pulse, with abrupt, intense acceleration followed by moderate deceleration; (2) impact by a spinning, heavy flywheel (225 kg); and (3) constant-velocity loading. We refer to energy dissipation in terms of power density (PD = shear stress * slip velocity; units of MW/m^2) and Coulomb energy density (CED = mechanical energy / normal stress; units of m). We present two aspects: the relative energy dissipation of the above loading modes, and the relative energy dissipation between impact experiments and moderate earthquakes. For the first aspect, we used: (i) the lowest friction coefficient of the dynamic weakening; (ii) the work dissipated before reaching the lowest friction; and (iii) the cumulative mechanical work during the complete run. The results show that the slip-pulse/impact modes are energy efficient relative to the constant-velocity mode, as manifested by faster, more intense weakening and 50-90% lower energy dissipation. Thus, for a finite amount of pre-seismic crustal energy, the efficiency of slip-pulse loading would amplify earthquake instability. For the second aspect, we compare the experimental CED of the impact experiments to the reported breakdown energy (EG) of moderate earthquakes, Mw = 5.6 to 7.2 (Chang et al., 2012). It is commonly assumed that the seismic EG is a small fraction of the total earthquake energy, and as expected, in 9 out of 11 examined earthquakes EG was 0.005 to 0.07 of the experimental CED. We thus speculate that the experimental relation of Coulomb energy density to total slip distance, D, CED = 0.605 × D^0.933, is a reasonable estimate of total earthquake energy, a quantity that cannot be determined from seismic data.
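
The empirical scaling quoted at the end of the abstract is straightforward to evaluate; a minimal sketch, with slip values chosen arbitrarily for illustration:

```python
def coulomb_energy_density(total_slip_m):
    """Experimental scaling CED = 0.605 * D^0.933 relating Coulomb energy
    density (units of m) to total slip distance D (m), as quoted above."""
    return 0.605 * total_slip_m ** 0.933

# CED grows almost linearly with slip because the exponent is close to 1
for d in (0.1, 1.0, 5.0):
    print(f"D = {d} m -> CED = {coulomb_energy_density(d):.3f} m")
```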

  17. Prediction of Strong Earthquake Ground Motion for the M=7.4 and M=7.2 1999, Turkey Earthquakes based upon Geological Structure Modeling and Local Earthquake Recordings

    NASA Astrophysics Data System (ADS)

    Gok, R.; Hutchings, L.

    2004-05-01

    We test a means to predict strong ground motion using the Mw=7.4 and Mw=7.2 1999 Izmit and Duzce, Turkey earthquakes. We generate 100 rupture scenarios for each earthquake, constrained by prior knowledge, and use these to synthesize strong ground motion and make the prediction. Ground motion is synthesized with the representation relation using impulsive point source Green's functions and synthetic source models. We synthesize the earthquakes from DC to 25 Hz. We demonstrate how to incorporate this approach into standard probabilistic seismic hazard analyses (PSHA). The synthesis of earthquakes is based upon analysis of over 3,000 aftershocks recorded by several seismic networks. The analysis provides source parameters of the aftershocks; records available for use as empirical Green's functions; and a three-dimensional velocity structure from tomographic inversion. The velocity model is linked to a finite difference wave propagation code (E3D, Larsen 1998) to generate synthetic Green's functions (DC < f < 0.5 Hz). We performed a simultaneous inversion for hypocenter locations and three-dimensional P-wave velocity structure of the Marmara region using SIMULPS14 along with 2,500 events. We also obtained source moment, corner frequency and individual station attenuation parameter estimates for over 500 events by performing a simultaneous inversion to fit these parameters with a Brune source model. We used the results of the source inversion to deconvolve a Brune model from small to moderate size earthquake (M<4.0) recordings to obtain empirical Green's functions for the higher frequency range of ground motion (0.5 < f < 25.0 Hz). Work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract W-7405-ENG-48.

  18. A Cooperative Test of the Load/Unload Response Ratio Proposed Method of Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Trotta, J. E.; Tullis, T. E.

    2004-12-01

    The Load/Unload Response Ratio (LURR) method is a proposed earthquake prediction technique first put forward by Yin in 1984 (Yin, 1987). LURR is based on the idea that when a region is near failure, there is an increase in the rate of seismic activity during the loading phase of the tidal cycle relative to the rate during the unloading phase. Typically the numerator of the LURR ratio is the number, or the sum of some measure of the size (e.g. Benioff strain), of small earthquakes that occur during loading of the tidal cycle, whereas the denominator is the same quantity calculated during unloading. The LURR method suggests this ratio should increase in the months to a year preceding a large earthquake. Regions near failure have tectonic stresses nearly high enough for a large earthquake to occur, so it seems plausible that smaller earthquakes in the region would be triggered when the tidal stresses add to the tectonic ones. However, until recently even the most careful studies suggested that the effect of tidal stresses on earthquake occurrence is very small and difficult to detect. New studies have shown that there is a tidal triggering effect on shallow thrust faults in areas with strong tides from ocean loading (Tanaka et al., 2002; Cochran et al., 2004). We have been conducting an independent test of the LURR method, since there would be important scientific and social implications if the LURR method were proven to be a robust method of earthquake prediction. Smith and Sammis (2003) also undertook a similar study. Following both the parameters of Yin et al. (2000) and the somewhat different ones of Smith and Sammis (2003), we have repeated calculations of LURR for the Northridge and Loma Prieta earthquakes in California. Though we have followed both sets of parameters closely, we have been unable to reproduce either set of results. 
A general agreement was made at the recent ACES Workshop in China between research
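
The ratio itself is simple to compute once each small event is labeled as occurring during tidal loading or unloading. The sketch below uses Benioff strain (the square root of seismic energy) as the size measure, with synthetic event data; it illustrates only the definition of the ratio, not Yin's full parameterization.

```python
import math

def lurr(event_energies, loading_flags):
    """Load/Unload Response Ratio using Benioff strain (sqrt of energy):
    sum over events during tidal loading divided by the sum during unloading."""
    load = sum(math.sqrt(e) for e, f in zip(event_energies, loading_flags) if f)
    unload = sum(math.sqrt(e) for e, f in zip(event_energies, loading_flags) if not f)
    return load / unload if unload > 0 else float("inf")

# Synthetic example: balanced activity gives LURR ~ 1 (background state),
# while excess activity during loading drives LURR above 1
background = lurr([1e6, 4e6, 4e6, 1e6], [True, False, True, False])
critical = lurr([4e6, 9e6, 9e6, 1e6], [True, True, True, False])
print(background, critical)
```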

  19. The susceptibility analysis of landslides induced by earthquake in Aso volcanic area, Japan, scoping the prediction

    NASA Astrophysics Data System (ADS)

    Kubota, Tetsuya; Takeda, Tsuyoshi

    2017-04-01

    Kumamoto earthquake on April 16th 2016 in Kumamoto prefecture, Kyushu Island, Japan with intense seismic scale of M7.3 (maximum acceleration = 1316 gal in Aso volcanic region) yielded countless instances of landslide and debris flow that induced serious damages and causalities in the area, especially in the Aso volcanic mountain range. Hence, field investigation and numerical slope stability analysis were conducted to delve into the characteristics or the prediction factors of the landslides induced by this earthquake. For the numerical analysis, Finite Element Method (FEM) and CSSDP (Critical Slip Surface analysis by Dynamic Programming theory based on limit equilibrium method) were applied to the landslide slopes with seismic acceleration observed. These numerical analysis methods can automatically detect the landslide slip surface which has minimum Fs (factor of safety). The various results and the information obtained through this investigation and analysis were integrated to predict the landslide susceptible slopes in volcanic area induced by earthquakes and rainfalls of their aftermath, considering geologic-geomorphologic features, geo-technical characteristics of the landslides and vegetation effects on the slope stability. Based on the FEM or CSSDP results, the landslides occurred in this earthquake at the mild gradient slope on the ridge have the safety factor of slope Fs=2.20 approximately (without rainfall nor earthquake, and Fs>=1.0 corresponds to stable slope without landslide) and 1.78 2.10 (with the most severe rainfall in the past) while they have approximately Fs=0.40 with the seismic forces in this earthquake (horizontal direction 818 gal, vertical direction -320 gal respectively, observed in the earthquake). It insists that only in case of earthquakes the landslide in volcanic sediment apt to occur at the mild gradient slopes as well as on the ridges with convex cross section. Consequently, the following results are obtained. 1) At volcanic

  20. Space-Time Earthquake Prediction: The Error Diagrams

    NASA Astrophysics Data System (ADS)

    Molchan, G.

    2010-08-01

    The quality of earthquake prediction is usually characterized by a two-dimensional diagram n versus τ, where n is the rate of failures-to-predict and τ is a characteristic of space-time alarm. Unlike the time prediction case, the quantity τ is not defined uniquely. We start from the case in which τ is a vector with components related to the local alarm times and find a simple structure of the space-time diagram in terms of local time diagrams. This key result is used to analyze the usual 2-D error sets {n, τ_w}, in which τ_w is a weighted mean of the τ components and w is the weight vector. We suggest a simple algorithm to find the (n, τ_w) representation of all random guess strategies, the set D, and prove that there exists a unique choice of w for which D degenerates to the diagonal n + τ_w = 1. We also find a confidence zone for D on the (n, τ_w) plane for the case when the local target rates are known only roughly. These facts are important for correct interpretation of (n, τ_w) diagrams when we discuss the prediction capability of the data or of prediction methods.
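
The random-guess baseline on such a diagram can be simulated directly: a strategy that declares alarms independently with probability τ misses, on average, a fraction n = 1 - τ of target events, tracing the diagonal n + τ = 1. A hypothetical Monte Carlo sketch:

```python
import random

random.seed(42)

def random_guess_point(tau, n_events=100_000):
    """Declare an alarm independently with probability tau at each target
    event; the expected failure rate is n = 1 - tau, so the point (n, tau)
    lies on the diagonal n + tau = 1 of the error diagram."""
    misses = sum(1 for _ in range(n_events) if random.random() >= tau)
    return misses / n_events, tau

for tau in (0.1, 0.3, 0.5):
    n, t = random_guess_point(tau)
    print(f"tau={t:.1f} n={n:.3f} n+tau={n + t:.3f}")
```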

  1. Fluid Induced Earthquakes: From KTB Experiments to Natural Seismicity Swarms.

    NASA Astrophysics Data System (ADS)

    Shapiro, S. A.

    2006-12-01

    Experiments with borehole fluid injections are typical for the exploration and development of hydrocarbon or geothermal reservoirs (e.g., the fluid-injection experiments at Soultz, France, and at Fenton Hill, USA). Microseismicity occurring during such operations has large potential for understanding the physics of the seismogenic process, as well as for obtaining detailed information about reservoirs at locations as far as several kilometers from boreholes. The phenomenon of microseismicity triggering by borehole fluid injections is related to the process of Frenkel-Biot slow wave propagation. In the low-frequency range (injection durations of hours or days) this process reduces to pore pressure diffusion. Fluid-induced seismicity typically shows several diffusion-indicating features, which are directly related to the rate of spatial growth, the geometry of the clouds of microearthquake hypocentres, and their spatial density. Several fluid injection experiments were conducted at the German Continental Deep Drilling Site (KTB) in 1994, 2000 and 2003-2005. Microseismicity occurred in different depth intervals. We analyze this microseismicity in terms of its diffusion-related features. Its relation to the 3-D distribution of the seismic reflectivity has important rock-physical and tectonic implications. Starting from such diffusion-typical signatures of man-made earthquakes, we seek analogous patterns for the earthquakes in Vogtland/Bohemia at the German/Czech border region in central Europe. There is strong geophysical evidence that these seismic events are correlated with fluid-related processes in the crust. We test the hypothesis that ascending magmatic fluids trigger earthquakes by the mechanism of pore pressure diffusion. 
This triggering process is mainly controlled by two physical fields: the hydraulic diffusivity and the seismic criticality (i.e., the critical pore pressure value leading to failure; stable locations are characterized by higher critical pressures).

  2. A global earthquake discrimination scheme to optimize ground-motion prediction equation selection

    USGS Publications Warehouse

    Garcia, Daniel; Wald, David J.; Hearne, Michael

    2012-01-01

    We present a new automatic earthquake discrimination procedure to determine in near-real time the tectonic regime and seismotectonic domain of an earthquake, its most likely source type, and the corresponding ground-motion prediction equation (GMPE) class to be used in the U.S. Geological Survey (USGS) Global ShakeMap system. This method makes use of the Flinn–Engdahl regionalization scheme, seismotectonic information (plate boundaries, global geology, seismicity catalogs, and regional and local studies), and the source parameters available from the USGS National Earthquake Information Center in the minutes following an earthquake to give the best estimation of the setting and mechanism of the event. Depending on the tectonic setting, additional criteria based on hypocentral depth, style of faulting, and regional seismicity may be applied. For subduction zones, these criteria include the use of focal mechanism information and detailed interface models to discriminate among outer-rise, upper-plate, interface, and intraslab seismicity. The scheme is validated against a large database of recent historical earthquakes. Though developed to assess GMPE selection in Global ShakeMap operations, we anticipate a variety of uses for this strategy, from real-time processing systems to any analysis involving tectonic classification of sources from seismic catalogs.

  3. Predictability of Landslide Timing From Quasi-Periodic Precursory Earthquakes

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.

    2018-02-01

    Accelerating rates of geophysical signals are observed before a range of material failure phenomena. They provide insights into the physical processes controlling failure and the basis for failure forecasts. However, examples of accelerating seismicity before landslides are rare, and their behavior and forecasting potential are largely unknown. Here I use a Bayesian methodology to apply a novel gamma point process model to investigate a sequence of quasiperiodic repeating earthquakes preceding a large landslide at Nuugaatsiaq in Greenland in June 2017. The evolution in earthquake rate is best explained by an inverse power law increase with time toward failure, as predicted by material failure theory. However, the commonly accepted power law exponent value of 1.0 is inconsistent with the data. Instead, the mean posterior value of 0.71 indicates a particularly rapid acceleration toward failure and suggests that only relatively short warning times may be possible for similar landslides in future.
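
The inverse power-law acceleration described above can be written as an event rate λ(t) = k (t_f - t)^(-p) approaching the failure time t_f. The sketch below evaluates this form with the mean posterior exponent reported in the abstract; k, t_f and the sample times are arbitrary illustrative choices.

```python
def event_rate(t, t_f, k=1.0, p=0.71):
    """Inverse power-law precursory rate lambda(t) = k * (t_f - t)^(-p).
    p = 0.71 is the mean posterior exponent quoted for the Nuugaatsiaq
    sequence; k and t_f here are arbitrary."""
    if t >= t_f:
        raise ValueError("rate is defined only before the failure time t_f")
    return k * (t_f - t) ** (-p)

# The rate accelerates sharply as t approaches t_f = 10 (arbitrary units)
for t in (0.0, 5.0, 9.0, 9.9):
    print(f"t={t:4.1f} rate={event_rate(t, 10.0):.2f}")
```

A smaller exponent than the commonly assumed p = 1.0 means the curve stays flatter longer and turns up later, which is why the abstract notes that only short warning times may be achievable.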

  4. The Virtual Quake earthquake simulator: a simulation-based forecast of the El Mayor-Cucapah region and evidence of predictability in simulated earthquake sequences

    NASA Astrophysics Data System (ADS)

    Yoder, Mark R.; Schultz, Kasey W.; Heien, Eric M.; Rundle, John B.; Turcotte, Donald L.; Parker, Jay W.; Donnellan, Andrea

    2015-12-01

    In this manuscript, we introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric, and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviours separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA and northern Baja California, Mexico.

  5. Earthquake prediction analysis based on empirical seismic rate: the M8 algorithm

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.

    2010-12-01

    The quality of space-time earthquake prediction is usually characterized by a 2-D error diagram (n, τ), where n is the fraction of failures-to-predict and τ is the local rate of alarm averaged in space. The most reasonable averaging measure for analysis of a prediction strategy is the normalized rate of target events λ(dg) in a subarea dg. In that case the quantity H = 1 - (n + τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n, τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M >= 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw >= 5.5, 1977-2004, and the magnitude range of target events 8.0 <= M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm.

  6. Gambling score in earthquake prediction analysis

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.

    2011-03-01

    The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has been recently suggested to use a new characteristic to evaluate the forecaster's skill, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand parametrization of the GS and use the M8 prediction algorithm to illustrate difficulties of the new approach in the analysis of the prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤M < 8.5 events because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.

  7. Report of the International Commission on Earthquake Forecasting for Civil Protection (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2009-12-01

    The destructive L’Aquila earthquake of 6 April 2009 (Mw 6.3) illustrates the challenges of operational earthquake forecasting. The earthquake ruptured a mapped normal fault in a region identified by long-term forecasting models as one of the most seismically dangerous in Italy; it was the strongest of a rich sequence that started several months earlier and included a M3.9 foreshock less than five hours prior to the mainshock. According to widely circulated news reports, the earthquake had been predicted by a local resident using unpublished radon-based techniques, provoking a public controversy prior to the event that intensified in its wake. Several weeks after the earthquake, the Italian Department of Civil Protection appointed an international commission with the mandate to report on the current state of knowledge of prediction and forecasting and guidelines for operational utilization. The commission included geoscientists from China, France, Germany, Greece, Italy, Japan, Russia, United Kingdom, and United States with experience in earthquake forecasting and prediction. This presentation by the chair of the commission will report on its findings and recommendations.

  8. Spatio-Temporal Fluctuations of the Earthquake Magnitude Distribution: Robust Estimation and Predictive Power

    NASA Astrophysics Data System (ADS)

    Olsen, S.; Zaliapin, I.

    2008-12-01

    We establish positive correlation between the local spatio-temporal fluctuations of the earthquake magnitude distribution and the occurrence of regional earthquakes. In order to accomplish this goal, we develop a sequential Bayesian statistical estimation framework for the b-value (slope of the Gutenberg-Richter's exponential approximation to the observed magnitude distribution) and for the ratio a(t) between the earthquake intensities in two non-overlapping magnitude intervals. The time-dependent dynamics of these parameters is analyzed using Markov Chain Models (MCM). The main advantage of this approach over the traditional window-based estimation is its "soft" parameterization, which allows one to obtain stable results with realistically small samples. We furthermore discuss a statistical methodology for establishing lagged correlations between continuous and point processes. The developed methods are applied to the observed seismicity of California, Nevada, and Japan on different temporal and spatial scales. We report an oscillatory dynamics of the estimated parameters, and find that the detected oscillations are positively correlated with the occurrence of large regional earthquakes, as well as with small events with magnitudes as low as 2.5. The reported results have important implications for further development of earthquake prediction and seismic hazard assessment methods.
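
    The sequential Bayesian estimation of the b-value mentioned above admits a closed-form illustration: if magnitudes above the completeness threshold Mc are exponentially distributed with rate β = b·ln 10, a Gamma prior on β is conjugate and can be updated batch by batch. This is a generic sketch of that idea, not the authors' estimator; the prior parameters and data are invented:

```python
# Illustrative sketch (assumptions, not the authors' code): sequential
# conjugate Bayesian updating of the Gutenberg-Richter b-value. Above a
# completeness magnitude Mc, magnitudes are ~ Exponential(beta) with
# beta = b*ln(10); a Gamma(a0, c0) prior on beta updates in closed form.
import math

def sequential_b_value(magnitude_batches, mc, a0=1.0, c0=1.0):
    a, c = a0, c0                 # Gamma shape and rate for beta
    estimates = []
    for batch in magnitude_batches:
        a += len(batch)           # shape grows with event count
        c += sum(m - mc for m in batch)  # rate grows with excess magnitude
        beta_mean = a / c                # posterior mean of beta
        estimates.append(beta_mean / math.log(10))  # posterior mean b-value
    return estimates

batches = [[2.6, 3.1, 2.8], [2.7, 4.0, 2.9, 3.3]]
b_track = sequential_b_value(batches, mc=2.5)
```

    The "soft" parameterization comes from the prior (a0, c0), which stabilizes the estimate when samples are realistically small.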

  9. Predicted Attenuation Relation and Observed Ground Motion of Gorkha Nepal Earthquake of 25 April 2015

    NASA Astrophysics Data System (ADS)

    Singh, R. P.; Ahmad, R.

    2015-12-01

    A comparison of observed ground motion parameters of the Gorkha, Nepal earthquake of 25 April 2015 (Mw 7.8) with parameters predicted using existing attenuation relations for the Himalayan region will be presented. The earthquake took about 8000 lives and destroyed thousands of poorly constructed buildings, and it was felt by millions of people living in Nepal, China, India, Bangladesh, and Bhutan. Knowledge of ground motion parameters is very important in developing seismic codes for earthquake-prone regions like the Himalaya for better design of buildings. The ground motion parameters recorded in the mainshock and aftershocks are compared with attenuation relations for the Himalayan region; the predicted parameters show good correlation with the observed ones. The results will be of great use to civil engineers in updating existing building codes in the Himalayan and surrounding regions and also for the evaluation of seismic hazards. The results clearly show that only attenuation relations developed for the Himalayan region should be used; relations based on other regions fail to provide good estimates of the observed ground motion parameters.

  10. Nurse willingness to report for work in the event of an earthquake in Israel.

    PubMed

    Ben Natan, Merav; Nigel, Simon; Yevdayev, Innush; Qadan, Mohamad; Dudkiewicz, Mickey

    2014-10-01

    To examine variables affecting nurse willingness to report for work in the event of an earthquake in Israel and whether this can be predicted through the Theory of Self-Efficacy. The nursing profession has a major role in preparing for earthquakes. Nurse willingness to report to work in the event of an earthquake has never before been examined. Self-administered questionnaires were distributed among a convenience sample of 400 nurses and nursing students in Israel during January-April 2012. High willingness to report to work in the event of an earthquake was declared by 57% of respondents. High perceived self-efficacy, level of knowledge and experience predicted willingness to report to work in the event of an earthquake. Multidisciplinary collaboration and support were also cited as meaningful factors. Perceived self-efficacy, level of knowledge, experience and the support of a multidisciplinary staff affect nurse willingness to report to work in the event of an earthquake. Nurse managers can identify factors that increase nurse willingness to report to work in the event of an earthquake and consequently develop strategies for more efficient management of their nursing workforce. © 2013 John Wiley & Sons Ltd.

  11. Predicted liquefaction of East Bay fills during a repeat of the 1906 San Francisco earthquake

    USGS Publications Warehouse

    Holzer, T.L.; Blair, J.L.; Noce, T.E.; Bennett, M.J.

    2006-01-01

    Predicted conditional probabilities of surface manifestations of liquefaction during a repeat of the 1906 San Francisco (M7.8) earthquake range from 0.54 to 0.79 in the area underlain by the sandy artificial fills along the eastern shore of San Francisco Bay near Oakland, California. Despite widespread liquefaction in 1906 of sandy fills in San Francisco, most of the East Bay fills were emplaced after 1906 without soil improvement to increase their liquefaction resistance. They have yet to be shaken strongly. Probabilities are based on the liquefaction potential index computed from 82 CPT soundings using median (50th percentile) estimates of PGA based on a ground-motion prediction equation. Shaking estimates consider both distance from the San Andreas Fault and local site conditions. The high probabilities indicate extensive and damaging liquefaction will occur in East Bay fills during the next M 7.8 earthquake on the northern San Andreas Fault. © 2006, Earthquake Engineering Research Institute.

  12. Earthquake prediction in California using regression algorithms and cloud-based big data infrastructure

    NASA Astrophysics Data System (ADS)

    Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F.

    2018-06-01

    Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical and machine learning approaches can be found in the literature, with no particularly satisfactory results. In recent years, powerful computational techniques for analyzing big data have emerged, making the analysis of massive datasets possible. These new methods make use of physical resources like cloud-based architectures. California is known for being one of the regions with the highest seismic activity in the world, and abundant data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in the context of big data (a 1 GB catalog is used) in order to predict earthquake magnitude within the next seven days. The Apache Spark framework, the H2O library in the R language and Amazon cloud infrastructure were used, and very promising results are reported.
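
    As a toy illustration of regression combined with ensemble learning for magnitude prediction (not the authors' Spark/H2O pipeline; the features, data, and regularization values are invented), one can average the outputs of several ridge regressors fitted with different regularization strengths:

```python
# Minimal averaging-ensemble sketch: closed-form ridge regressors with
# different regularization strengths, averaged to predict a target (here
# standing in for next-window magnitude) from simple seismicity features.
import numpy as np

def ridge_fit(X, y, alpha):
    # Closed-form ridge: w = (X^T X + alpha I)^-1 X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

def ensemble_predict(X_train, y_train, X_new, alphas=(0.1, 1.0, 10.0)):
    preds = [X_new @ ridge_fit(X_train, y_train, a) for a in alphas]
    return np.mean(preds, axis=0)   # simple averaging ensemble

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))       # e.g. b-value, event rate, mean depth
y = X @ np.array([0.5, -0.2, 0.1]) + 0.01 * rng.normal(size=200)
pred = ensemble_predict(X, y, X[:5])
```

    Averaging over differently regularized models is one of the simplest ensemble schemes; the paper's setting distributes this kind of fitting over a cloud cluster.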

  13. Satellite relay telemetry of seismic data in earthquake prediction and control

    USGS Publications Warehouse

    Jackson, Wayne H.; Eaton, Jerry P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started in FY 1968 to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research (NCER) in 3 regions along the San Andreas fault in central California; and the experience of installing and operating the clusters and in reducing and analyzing the seismic data from them was to provide the raw materials for evaluation in the satellite relay telemetry project.

  14. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  15. Fault healing promotes high-frequency earthquakes in laboratory experiments and on natural faults

    USGS Publications Warehouse

    McLaskey, Gregory C.; Thomas, Amanda M.; Glaser, Steven D.; Nadeau, Robert M.

    2012-01-01

    Faults strengthen or heal with time in stationary contact and this healing may be an essential ingredient for the generation of earthquakes. In the laboratory, healing is thought to be the result of thermally activated mechanisms that weld together micrometre-sized asperity contacts on the fault surface, but the relationship between laboratory measures of fault healing and the seismically observable properties of earthquakes is at present not well defined. Here we report on laboratory experiments and seismological observations that show how the spectral properties of earthquakes vary as a function of fault healing time. In the laboratory, we find that increased healing causes a disproportionately large amount of high-frequency seismic radiation to be produced during fault rupture. We observe a similar connection between earthquake spectra and recurrence time for repeating earthquake sequences on natural faults. Healing rates depend on pressure, temperature and mineralogy, so the connection between seismicity and healing may help to explain recent observations of large megathrust earthquakes which indicate that energetic, high-frequency seismic radiation originates from locations that are distinct from the geodetically inferred locations of large-amplitude fault slip

  16. Predictive factors of depression symptoms among adolescents in the 18-month follow-up after Wenchuan earthquake in China.

    PubMed

    Chui, Cheryl H K; Ran, Mao-Sheng; Li, Rong-Hui; Fan, Mei; Zhang, Zhen; Li, Yuan-Hao; Ou, Guo Jing; Jiang, Zhe; Tong, Yu-Zhen; Fang, Ding-Zhi

    2017-02-01

    Little is known about the course and risk factors of depression among adolescent survivors after an earthquake. This study aimed to explore the change of depression, and to identify the predictive factors of depression, among adolescent survivors after the 2008 Wenchuan earthquake in China. Depression among high school students was investigated at 6, 12 and 18 months after the Wenchuan earthquake. The Beck Depression Inventory (BDI) was used to assess the severity of depression. Subjects included 548 student survivors in an affected high school. The rates of depression among the adolescent survivors at 6, 12 and 18 months after the earthquake were 27.3%, 42.9% and 33.3%, respectively, for males, and 42.9%, 61.9% and 53.4%, respectively, for females. Depression symptoms, trauma-related self-injury, suicidal ideation and PTSD symptoms at the 6-month follow-up were significant predictive factors for depression at 18 months after the earthquake. This study highlights the need to consider disaster-related psychological sequelae and risk factors for depression symptoms in the planning and implementation of mental health services. Long-term mental and psychological support for victims of natural disasters is imperative.

  17. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  18. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  19. Combining multiple earthquake models in real time for earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Wu, Stephen; Beck, James L; Heaton, Thomas H.

    2017-01-01

    The ultimate goal of earthquake early warning (EEW) is to provide local shaking information to users before the strong shaking from an earthquake reaches their location. This is accomplished by operating one or more real‐time analyses that attempt to predict shaking intensity, often by estimating the earthquake’s location and magnitude and then predicting the ground motion from that point source. Other EEW algorithms use finite rupture models or may directly estimate ground motion without first solving for an earthquake source. EEW performance could be improved if the information from these diverse and independent prediction models could be combined into one unified, ground‐motion prediction. In this article, we set the forecast shaking at each location as the common ground to combine all these predictions and introduce a Bayesian approach to creating better ground‐motion predictions. We also describe how this methodology could be used to build a new generation of EEW systems that provide optimal decisions customized for each user based on the user’s individual false‐alarm tolerance and the time necessary for that user to react.
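
    The Bayesian combination of independent predictions can be illustrated with the simplest case: if each EEW algorithm supplies a Gaussian estimate of log ground motion at a site, a flat prior yields a precision-weighted posterior. This is a sketch of the general principle only, not the article's actual algorithm, and the numbers are invented:

```python
# Toy precision-weighted fusion of several independent Gaussian
# ground-motion predictions for one site: the conjugate Bayesian result
# for Gaussian likelihoods with a flat prior.
import math

def combine_predictions(means, sigmas):
    weights = [1.0 / s**2 for s in sigmas]        # precision of each model
    total = sum(weights)
    mean = sum(w * m for w, m in zip(weights, means)) / total
    return mean, math.sqrt(1.0 / total)           # posterior mean and sigma

# Three algorithms' predicted log10 PGA at one hypothetical site
mu, sigma = combine_predictions([-1.2, -0.9, -1.0], [0.3, 0.4, 0.25])
```

    The combined estimate always has smaller uncertainty than the best single model, which is the motivation for fusing diverse EEW predictions.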

  20. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  1. Satellite Relay Telemetry of Seismic Data in Earthquake Prediction and Control

    NASA Technical Reports Server (NTRS)

    Jackson, W. H.; Eaton, J. P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research in 3 regions along the San Andreas fault in central California; and the experience of installing and operating the clusters and in reducing and analyzing the seismic data from them was to provide the raw materials for evaluation in the satellite relay telemetry project. The principal advantages of the satellite relay system over commercial telephone or microwave systems were: (1) it could be made less prone to massive failure during a major earthquake; (2) it could be extended readily into undeveloped regions; and (3) it could provide flexible, uniform communications over large sections of major global tectonic zones. Fundamental characteristics of a communications system to cope with the large volume of raw data collected by a short-period seismograph network are discussed.

  2. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. More than 20 new and old methods were categorized and reviewed to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The methods have been categorized into time-independent, time-dependent and hybrid methods, where the last group comprises methods that use additional data beyond historical earthquake statistics. Such a categorization is necessary to distinguish purely statistical approaches, for which historical earthquake data are the only direct data source, from algorithms that incorporate further information, e.g. spatial data on fault distributions, or physical models such as static triggering, to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods that can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used. Target temporal scales are identified as well as the publication history. All these aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and to give an overview of the state-of-the-art.

  3. Predicting earthquake effects—Learning from Northridge and Loma Prieta

    USGS Publications Warehouse

    Holzer, Thomas L.

    1994-01-01

    The continental United States has been rocked by two particularly damaging earthquakes in the last 4.5 years, Loma Prieta in northern California in 1989 and Northridge in southern California in 1994. Combined losses from these two earthquakes approached $30 billion. Approximately half these losses were reimbursed by the federal government. Because large earthquakes typically overwhelm state resources and place unplanned burdens on the federal government, it is important to learn from these earthquakes how to reduce future losses. My purpose here is to explore a potential implication of the Northridge and Loma Prieta earthquakes for hazard-mitigation strategies: earth scientists should increase their efforts to map hazardous areas within urban regions. 

  4. Prediction of the area affected by earthquake-induced landsliding based on seismological parameters

    NASA Astrophysics Data System (ADS)

    Marc, Odin; Meunier, Patrick; Hovius, Niels

    2017-07-01

    We present an analytical, seismologically consistent expression for the surface area of the region within which most landslides triggered by an earthquake are located (landslide distribution area). This expression is based on scaling laws relating seismic moment, source depth, and focal mechanism with ground shaking and fault rupture length and assumes a globally constant threshold of acceleration for onset of systematic mass wasting. The seismological assumptions are identical to those recently used to propose a seismologically consistent expression for the total volume and area of landslides triggered by an earthquake. To test the accuracy of the model we gathered geophysical information and estimates of the landslide distribution area for 83 earthquakes. To reduce uncertainties and inconsistencies in the estimation of the landslide distribution area, we propose an objective definition based on the shortest distance from the seismic wave emission line containing 95 % of the total landslide area. Without any empirical calibration the model explains 56 % of the variance in our dataset, and predicts 35 to 49 out of 83 cases within a factor of 2, depending on how we account for uncertainties on the seismic source depth. For most cases with comprehensive landslide inventories we show that our prediction compares well with the smallest region around the fault containing 95 % of the total landslide area. Aspects ignored by the model that could explain the residuals include local variations of the threshold of acceleration and processes modulating the surface ground shaking, such as the distribution of seismic energy release on the fault plane, the dynamic stress drop, and rupture directivity. Nevertheless, its simplicity and first-order accuracy suggest that the model can yield plausible and useful estimates of the landslide distribution area in near-real time, with earthquake parameters issued by standard detection routines.

  5. Predicting earthquakes by analyzing accelerating precursory seismic activity

    USGS Publications Warehouse

    Varnes, D.J.

    1989-01-01

    During 11 sequences of earthquakes that in retrospect can be classed as foreshocks, the accelerating rate at which seismic moment is released follows, at least in part, a simple equation. This equation (1) is dΩ/dt = C/(tf - t)^n, where Ω is the cumulative sum until time, t, of the square roots of seismic moments of individual foreshocks computed from reported magnitudes; C and n are constants; and tf is a limiting time at which the rate of seismic moment accumulation becomes infinite. The possible time of a major foreshock or main shock, tf, is found by the best fit of equation (1), or its integral, to step-like plots of Ω versus time using successive estimates of tf in linearized regressions until the maximum coefficient of determination, r2, is obtained. Analyzed examples include sequences preceding earthquakes at Cremasta, Greece, 2/5/66; Haicheng, China, 2/4/75; Oaxaca, Mexico, 11/29/78; Petatlan, Mexico, 3/14/79; and Central Chile, 3/3/85. In 29 estimates of main-shock time, made as the sequences developed, the errors in 20 were less than one-half and in 9 less than one-tenth the time remaining between the time of the last data used and the main shock. Some precursory sequences, or parts of them, yield no solution. Two sequences appear to include in their first parts the aftershocks of a previous event; plots using the integral of equation (1) show that the sequences are easily separable into aftershock and foreshock segments. Synthetic seismic sequences of shocks at equal time intervals were constructed to follow equation (1), using four values of n. In each series the resulting distributions of magnitudes closely follow the linear Gutenberg-Richter relation log N = a - bM, and the product n times b for each series is the same constant. In various forms and for decades, equation (1) has been used successfully to predict failure times of stressed metals and ceramics, landslides in soil and rock slopes, and volcanic
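
    The fitting procedure described in the abstract (successive estimates of tf in linearized regressions until r2 is maximized) can be sketched as follows, under the simplifying assumption that the exponent n is known: in the integrated form of equation (1), Ω is linear in (tf - t)^(1-n), so each trial tf yields an ordinary linear regression. The data below are synthetic and illustrative only:

```python
# Sketch of a grid search for the failure time tf: for each trial tf,
# regress cumulative root-moment Omega(t) on (tf - t)**(1 - n), which is
# linear in the integrated form of eq. (1), and keep the tf maximizing r^2.
import numpy as np

def fit_tf(t, omega, n, tf_grid):
    best = (None, -np.inf)
    for tf in tf_grid:
        if tf <= t.max():           # tf must lie beyond the last datum
            continue
        x = (tf - t) ** (1.0 - n)
        r = np.corrcoef(x, omega)[0, 1]
        if r**2 > best[1]:
            best = (tf, r**2)
    return best

# Synthetic foreshock sequence exactly following eq. (1) with tf = 10, n = 0.7
t = np.linspace(0.0, 9.0, 50)
omega_true = 5.0 - 2.0 * (10.0 - t) ** 0.3
tf_hat, r2 = fit_tf(t, omega_true, n=0.7, tf_grid=np.arange(9.1, 12.0, 0.1))
```

    On this noise-free synthetic sequence the search recovers tf near the true value of 10; real sequences require jointly adjusting n, as the paper does.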

  6. [Clinical characteristics of pediatric victims in the Lushan and Wenchuan earthquakes and experience of medical rescue].

    PubMed

    Jiang, Xin; Xiang, Bo; Liu, Li-Jun; Liu, Min; Tang, Xue-Yang; Huang, Lu-Gang; Li, Yuan; Peng, Ming-Xing; Xin, Wen-Qiong

    2013-06-01

    To gain a more comprehensive understanding of the clinical characteristics of pediatric earthquake victims and to summarize the experience of medical rescue. Clinical information was collected from the pediatric victims who were admitted to West China Hospital, Sichuan University following the Lushan earthquake in 2013 and the Wenchuan earthquake in 2008, and the clinical data from the two earthquakes were compared. Thirty-four children under 14 years of age who were injured in the Lushan earthquake were admitted to the West China Hospital before April 30, 2013. Compared with the data from the Wenchuan earthquake, the mean age of the pediatric victims in the Lushan earthquake was significantly lower (P<0.01), and the mean time from earthquake to hospitalization was significantly shorter (P<0.01). In the Lushan earthquake, 67.6% of the injured children had limb fractures of various types; traumatic brain injury was found in 29.4% of hospitalized children, versus 9.5% in the Wenchuan earthquake (P<0.05). Among the 34 children, no amputations or deaths occurred, and all 13 severe cases started to recover. There were higher proportions of severely injured children and children with traumatic brain injury in the Lushan earthquake than in the Wenchuan earthquake, but these cases recovered well, possibly owing to timely on-site rescue and transfer and to multi-sector, multi-institution and multidisciplinary cooperation.

  7. Purposes and methods of scoring earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2010-12-01

    Studies on earthquake prediction or forecasting serve two kinds of purposes: one is to give a systematic estimation of earthquake risk in a particular region and period in order to advise governments and enterprises on disaster reduction; the other is to search for reliable precursors that can be used to improve earthquake prediction or forecasts. For the first purpose, a complete score is necessary, while for the latter a partial score, which can be used to evaluate whether the forecasts or predictions have some advantage over a well-known model, is sufficient. This study reviews different scoring methods for evaluating the performance of earthquake predictions and forecasts. In particular, the recently developed gambling scoring method shows its capacity to find good points in an earthquake prediction algorithm or model that are not in a reference model, even if its overall performance is no better than that of the reference model.
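
    The gambling-score idea can be made concrete with a toy scheme (one common variant, reconstructed from the abstract, not necessarily the exact rule reviewed): the forecaster stakes one reputation point per alarm, and a success in a bin where the reference model assigns probability p0 pays (1 - p0)/p0 points, so the expected score under the reference model is zero:

```python
# Toy gambling score: hits in low-probability bins pay large rewards,
# misses cost the staked point; a trivial forecaster breaks even on average.
def gambling_score(alarms):
    """alarms: iterable of (p0, hit) pairs, one per alarmed bin, where p0
    is the reference-model probability of a target event in that bin."""
    score = 0.0
    for p0, hit in alarms:
        score += (1.0 - p0) / p0 if hit else -1.0
    return score

# Two hits in low-probability bins outweigh three misses
score = gambling_score([(0.01, True), (0.05, True), (0.2, False),
                        (0.1, False), (0.02, False)])
```

    This weighting is what makes the score reward "difficult" successes, and also why, as the abstract notes, the significance estimate depends strongly on how the alarm volume is partitioned and how p0 is assigned.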

  8. Conditional spectrum computation incorporating multiple causal earthquakes and ground-motion prediction models

    USGS Publications Warehouse

    Lin, Ting; Harmsen, Stephen C.; Baker, Jack W.; Luco, Nicolas

    2013-01-01

    The conditional spectrum (CS) is a target spectrum (with conditional mean and conditional standard deviation) that links seismic hazard information with ground-motion selection for nonlinear dynamic analysis. Probabilistic seismic hazard analysis (PSHA) estimates the ground-motion hazard by incorporating the aleatory uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties in ground-motion prediction models (GMPMs) and seismic source models. Typical CS calculations to date are produced for a single earthquake scenario using a single GMPM, but more precise use requires consideration of at least multiple causal earthquakes and multiple GMPMs that are often considered in a PSHA computation. This paper presents the mathematics underlying these more precise CS calculations. Despite requiring more effort to compute than approximate calculations using a single causal earthquake and GMPM, the proposed approach produces an exact output that has a theoretical basis. To demonstrate the results of this approach and compare the exact and approximate calculations, several example calculations are performed for real sites in the western United States. The results also provide some insights regarding the circumstances under which approximate results are likely to closely match more exact results. To facilitate these more precise calculations for real applications, the exact CS calculations can now be performed for real sites in the United States using new deaggregation features in the U.S. Geological Survey hazard mapping tools. Details regarding this implementation are discussed in this paper.
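
    The exact combination over multiple causal earthquakes and GMPMs can be illustrated in simplified form: if each scenario/GMPM pair contributes a conditional mean and standard deviation of log spectral acceleration with a deaggregation weight, the target is the mean and standard deviation of the resulting mixture distribution. This sketch shows only that mixture arithmetic, with invented numbers, not the full CS computation:

```python
# Exact first two moments of a mixture of per-scenario conditional
# distributions: mu = sum(w_i mu_i), var = sum(w_i (sigma_i^2 + mu_i^2)) - mu^2.
import math

def combine_cs(weights, means, sigmas):
    mu = sum(w * m for w, m in zip(weights, means))
    second = sum(w * (s**2 + m**2) for w, m, s in zip(weights, means, sigmas))
    return mu, math.sqrt(second - mu**2)

# Two causal scenarios with deaggregation weights 0.6 and 0.4
mu, sigma = combine_cs([0.6, 0.4], [-0.5, -0.8], [0.25, 0.30])
```

    Note that the mixture standard deviation exceeds each component's when the component means disagree, which is one reason the exact calculation can differ from a single-scenario approximation.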

  9. Slow Slip and Earthquake Nucleation in Meter-Scale Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Mclaskey, G.

    2017-12-01

    The initiation of dynamic rupture is thought to be preceded by a quasistatic nucleation phase. Observations of recent earthquakes sometimes support this by illuminating slow slip and foreshocks in the vicinity of the eventual hypocenter. I describe laboratory earthquake experiments conducted on two large-scale loading machines at Cornell University that provide insight into the way earthquake nucleation varies with normal stress, healing time, and loading rate. The larger of the two machines accommodates a 3 m long granite sample, and when it is loaded to 7 MPa stress levels, we observe dynamic rupture events that are preceded by a measurable nucleation zone with dimensions on the order of 1 m. The smaller machine accommodates a 0.76 m sample that is roughly the same size as the nucleation zone. On this machine, small variations in nucleation properties result in measurable differences in slip events, and we generate both dynamic rupture events (> 0.1 m/s slip rates) and slow slip events (0.001 to 30 mm/s slip rates). Slow events occur when instability cannot fully nucleate before reaching the sample ends. Dynamic events occur after long healing times or abrupt increases in loading rate, which suggests that these factors shrink the spatial and temporal extents of the nucleation zone. Arrays of slip, strain, and ground motion sensors installed on the sample allow us to quantify seismic coupling and study details of premonitory slip and afterslip. The slow slip events we observe are primarily aseismic (less than 1% of the seismic coupling of faster events) and produce swarms of very small M -6 to M -8 events. These mechanical and seismic interactions suggest that faults with transitional behavior—where creep, small earthquakes, and tremor are often observed—could become seismically coupled if loaded rapidly, either by a slow slip front or by dynamic rupture of an earthquake that nucleated elsewhere.

  10. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude and rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events and the influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for mitigation of earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).

  11. Laboratory constraints on models of earthquake recurrence

    NASA Astrophysics Data System (ADS)

    Beeler, N. M.; Tullis, Terry; Junger, Jenni; Kilgore, Brian; Goldsby, David

    2014-12-01

    In this study, rock friction "stick-slip" experiments are used to develop constraints on models of earthquake recurrence. Constant rate loading of bare rock surfaces in high-quality experiments produces stick-slip recurrence that is periodic at least to second order. When the loading rate is varied, recurrence is approximately inversely proportional to loading rate. These laboratory events initiate due to a slip-rate-dependent process that also determines the size of the stress drop and, as a consequence, stress drop varies weakly but systematically with loading rate. This is especially evident in experiments where the loading rate is changed by orders of magnitude, as is thought to be the loading condition of naturally occurring, small repeating earthquakes driven by afterslip, or low-frequency earthquakes loaded by episodic slip. The experimentally observed stress drops are well described by a logarithmic dependence on recurrence interval that can be cast as a nonlinear slip predictable model. The fault's rate dependence of strength is the key physical parameter. Additionally, even at constant loading rate the most reproducible laboratory recurrence is not exactly periodic, unlike existing friction recurrence models. We present example laboratory catalogs that document the variance and show that in large catalogs, even at constant loading rate, stress drop and recurrence covary systematically. The origin of this covariance is largely consistent with variability of the dependence of fault strength on slip rate. Laboratory catalogs show aspects of both slip and time predictability, and successive stress drops are strongly correlated indicating a "memory" of prior slip history that extends over at least one recurrence cycle.

  12. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

    Lee, W.H.K.; Steward, S. W.

    1979-01-01

    A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network may consist of several stations or as many as a few hundred. Networks are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict station siting because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be deployed very quickly. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.

  13. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    USGS Publications Warehouse

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have for over two decades stoked seismologists' hopes of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, calling into question once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample: it did release a significant portion of the stress along its fault segment and produced a substantial change in b-values.
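A b-value-based rate forecast of the kind mentioned above rests on the Gutenberg-Richter relation, log10 N(>=M) = a - b*M. The sketch below uses hypothetical a and b values chosen only for illustration (they are not Parkfield's actual parameters):

```python
def annual_rate(mag: float, a: float, b: float) -> float:
    """Gutenberg-Richter: log10 N(>=M) = a - b*M, N in events/year."""
    return 10.0 ** (a - b * mag)

# Hypothetical productivity and b-value for illustration only.
a_value, b_value = 4.6, 1.0

rate_m6 = annual_rate(6.0, a_value, b_value)   # expected M>=6 events/year
recurrence_years = 1.0 / rate_m6               # mean recurrence interval
print(round(recurrence_years, 1))  # 25.1
```

The point of the abstract is that such rates, extrapolated from microseismicity, matched the observed M6 rate at Parkfield even though time-recurrent models did not.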

  14. Comparison of Ground Motion Prediction Equations (GMPE) for Chile and Canada With Recent Chilean Megathust Earthquakes

    NASA Astrophysics Data System (ADS)

    Herrera, C.; Cassidy, J. F.; Dosso, S. E.

    2017-12-01

    Ground shaking assessment allows quantification of the hazards associated with the occurrence of earthquakes. Chile and western Canada are two areas that have experienced, and are susceptible to, imminent large crustal, in-slab and megathrust earthquakes that can affect the population significantly. In this context, we compare the current GMPEs used in the 2015 National Building Code of Canada, and the most recent GMPEs calculated for Chile, with observed accelerations generated by four recent Chilean megathrust earthquakes (MW ≥ 7.7) that have occurred during the past decade; this is essential to quantify how well current models predict observations of major events. We collected the 3-component waveform data of more than 90 stations from the Centro Sismologico Nacional and the Universidad de Chile, and processed them by removing the trend and applying a band-pass filter. Then, for each station, we obtained the Peak Ground Acceleration (PGA) and, using a damped response spectrum, calculated the Pseudo Spectral Acceleration (PSA). Finally, we compared those observations with the most recent Chilean and Canadian GMPEs. Given the lack of geotechnical information for most of the Chilean stations, we also used a new method to obtain VS30 by inverting the H/V ratios with a trans-dimensional Bayesian inversion, which allows us to improve the correction of observations for soil conditions. As expected, our results show a good fit between observations and the Chilean GMPEs, but although the shape of the Canadian GMPEs is coherent with the distribution of observations, in general they underpredict the observations of PGA and of PSA at shorter periods for most of the considered earthquakes; an example can be seen in the attached figure for the 2014 Iquique earthquake. These results have important implications for the hazards associated with large earthquakes, especially for western Canada, where the probability of a

  15. Earthquake Nucleation and Fault Slip: Possible Experiments on a Natural Fault

    NASA Astrophysics Data System (ADS)

    Germanovich, L. N.; Murdoch, L. C.; Garagash, D.; Reches, Z.; Martel, S. J.; Johnston, M. J.; Ebenhack, J.; Gwaba, D.

    2011-12-01

    High-resolution deformation and seismic observations are usually made only near the Earth's surface, kilometers away from where earthquakes nucleate on active faults, and are limited by inverse-cube-distance attenuation and ground noise. We have developed an experimental approach that aims at reactivating faults in situ using thermal techniques and fluid injection, which modify the in-situ stresses and the fault strength until the fault slips. Mines where in-situ stresses are sufficient to drive faulting present an opportunity to conduct such experiments. The former Homestake gold mine in South Dakota is a good example. During our recent field work in the Homestake mine, we found a large fault that intersects multiple mine levels. The size and distinct structure of this fault make it a promising target for in-situ reactivation, which would likely be localized on a crack-like patch. Slow patch propagation, moderated by the injection rate and the rate of change of the background stresses, may become unstable, leading to the nucleation of a dynamic earthquake rupture. Our analyses for the Homestake fault conditions indicate that this transition occurs for a patch size of ~1 m. This represents a fundamental limitation for laboratory experiments and necessitates larger-scale field tests of ~10-100 m. The opportunity to observe earthquake nucleation on the Homestake fault is feasible because slip could be initiated at a pre-defined location and time, with instrumentation placed as close as a few meters from the nucleation site. Designing the experiment requires a detailed assessment of the state of stress in the vicinity of the fault. This is being conducted by simulating the changes in pore pressure and effective stress accompanying dewatering of the mine, and by evaluating in-situ stress measurements in light of a regional stress field modified by local perturbations caused by the mine workings.

  16. An integrated earthquake early warning system and its performance at schools in Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Bing-Ru; Hsiao, Nai-Chi; Lin, Pei-Yang; Hsu, Ting-Yu; Chen, Chiou-Yun; Huang, Shieh-Kung; Chiang, Hung-Wei

    2017-01-01

    An earthquake early warning (EEW) system integrating regional and onsite approaches was installed at nine demonstration stations in several districts of Taiwan to take advantage of both approaches. System performance was evaluated in a 3-year experiment at schools, which experienced five major earthquakes during this period. The blind zone of warning was effectively reduced by the integrated EEW system. The intensities predicted at the EEW demonstration stations showed acceptable accuracy compared to field observations. Operational experience from one earthquake event showed that students could calmly carry out the correct actions before the seismic wave arrived, using the warning time provided by the EEW system. Through successful operation in practice, the integrated EEW system was verified as an effective tool for disaster prevention at schools.

  17. Experimental validation of finite element model analysis of a steel frame in simulated post-earthquake fire environments

    NASA Astrophysics Data System (ADS)

    Huang, Ying; Bevans, W. J.; Xiao, Hai; Zhou, Zhi; Chen, Genda

    2012-04-01

    During or after an earthquake, building systems often experience large strains due to shaking effects, as observed during recent earthquakes, causing permanent inelastic deformation. In addition to the inelastic deformation induced by the earthquake itself, post-earthquake fires associated with short circuits in electrical systems and leakage from gas devices can further strain the already damaged structures, potentially leading to progressive collapse of buildings. Under these harsh environments, measurements of the affected building by various sensors can provide only limited structural health information. Finite element model analysis, on the other hand, if validated by predesigned experiments, can provide detailed structural behavior information for the entire structure. In this paper, a temperature-dependent nonlinear 3-D finite element model (FEM) of a one-story steel frame is set up in ABAQUS, based on the steel material properties cited from EN 1993-1.2 and the AISC manuals. The FEM is validated by testing the modeled steel frame in simulated post-earthquake environments. Comparisons between the FEM analysis and the experimental results show that the FEM predicts the structural behavior of the steel frame in post-earthquake fire conditions reasonably well. With experimental validation, FEM analysis can be used to continuously predict the behavior of critical structures in these harsh environments, better assisting firefighters in their rescue efforts and saving fire victims.

  18. Understanding Earthquake Hazard & Disaster in Himalaya - A Perspective on Earthquake Forecast in Himalayan Region of South Central Tibet

    NASA Astrophysics Data System (ADS)

    Shanker, D.; Paudyal; Singh, H.

    2010-12-01

    Not only are a basic understanding of the earthquake phenomenon and of the resistance offered by designed structures important, but so are the socio-economic factors, the engineering properties of indigenous materials, local skills, and technology transfer models. It is important that the engineering aspects of mitigation be made part of public policy documents. Earthquakes, therefore, are and have been thought of as one of the worst enemies of mankind. Due to the very nature of the energy release, damage is evident; it will not, however, culminate in a disaster unless the earthquake strikes a populated area. Mitigation may be defined as the reduction in severity of something; earthquake disaster mitigation therefore implies taking measures that help reduce the severity of damage caused by earthquakes to life, property and the environment. While “earthquake disaster mitigation” usually refers primarily to interventions that strengthen the built environment, “earthquake protection” is now considered to include the human, social and administrative aspects of reducing earthquake effects. It should be noted that reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies. However, earthquake prediction does not guarantee safety, and even if a prediction is correct, damage to life and property on such a large scale warrants the use of other aspects of mitigation. While earthquake prediction may be of some help, mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of the occurrence of future seismic activity. In all the cases, the identified episodes of anomalous seismic activity were

  19. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response held in San Francisco, convened by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, as well as representatives of the news media.

  20. Assessment of precast beam-column using capacity demand response spectrum subject to design basis earthquake and maximum considered earthquake

    NASA Astrophysics Data System (ADS)

    Ghani, Kay Dora Abd.; Tukiar, Mohd Azuan; Hamid, Nor Hayati Abdul

    2017-08-01

    Malaysia is surrounded by the tectonic features of the Sumatera area, which consists of two seismically active inter-plate boundaries, namely the Indo-Australian and Eurasian Plates on the west and the Philippine Plates on the east. Hence, Malaysia experiences tremors from distant earthquakes occurring in Banda Aceh, Nias Island, Padang and other parts of Sumatera, Indonesia. In order to predict the safety of precast buildings in Malaysia under near-field ground motion, response spectrum analysis can be used to deal with future earthquakes whose specific nature is unknown. This paper aimed to develop capacity-demand response spectra subject to the Design Basis Earthquake (DBE) and the Maximum Considered Earthquake (MCE) in order to assess the performance of precast beam-column joints. From the capacity-demand response spectrum analysis, it can be concluded that the precast beam-column joints would not survive earthquake excitation with a surface-wave magnitude, Mw, of more than 5.5 on the Richter scale (Type 1 spectra). This means that a beam-column joint designed using the current code of practice (BS8110) would be severely damaged when subjected to strong earthquake excitation. The capacity-demand response spectrum analysis also shows that the precast beam-column joints in the prototype studied would be severely damaged when subjected to the Maximum Considered Earthquake (MCE) with PGA = 0.22g and a surface-wave magnitude of more than 5.5 on the Richter scale, or Type 1 spectra.

  1. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions

    PubMed Central

    Burro, Roberto; Hall, Rob

    2017-01-01

    A major earthquake has a potentially highly traumatic impact on children’s psychological functioning. However, while many studies on children describe negative consequences in terms of mental health and psychiatric disorders, little is known regarding how the developmental processes of emotions can be affected following exposure to disasters.

    Objectives: We explored whether and how the exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children’s emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender.

    Method: The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. Data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children’s understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions.

    Results: We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages.

    Conclusions: Our data extend the generalizability of theoretical models on children’s psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and

  2. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions.

    PubMed

    Raccanello, Daniela; Burro, Roberto; Hall, Rob

    2017-01-01

    We explored whether and how the exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. Data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and provide further knowledge on children's emotional resources related to natural disasters, as a basis for planning educational prevention programs.

  3. Earthquake ethics through scientific knowledge, historical memory and societal awareness: the experience of direct internet information.

    NASA Astrophysics Data System (ADS)

    de Rubeis, Valerio; Sbarra, Paola; Sebaste, Beppe; Tosi, Patrizia

    2013-04-01

    The experience of collecting data on earthquake effects and diffusing information to the public through the site "haisentitoilterremoto.it" ("did you feel it"), managed by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), has evidenced a constantly growing interest from Italian citizens. Started in 2007, the site has collected more than 520,000 compiled intensity questionnaires, producing intensity maps of almost 6,000 earthquakes. One of the most distinctive features of this experience is its bi-directional information exchange: every person can record the observed effects of an earthquake and, at the same time, look at the generated maps. Seismologists, on their side, can see each earthquake described in real time through its effects across the whole territory. In this way people, by giving local information, receive global information from the community, mediated and interpreted through seismological knowledge. The relationship among seismologists, mass media and civil society is thus deep and rich. The presence of almost 20,000 permanent subscribers distributed across the whole Italian territory, alerted in case of earthquake, has reinforced this participation: subscribers are informed by the seismologists, through e-mail, about events occurring in their area, even those of very small magnitude. The "alert" service provides a reminder that earthquakes are a continuously present phenomenon, while also showing that high-magnitude events are very rare. This kind of information is helpful as it is fully complementary to that given by the media. We analyze the effects of our activity on society and the mass media. Knowledge of seismic phenomena is present in every person, rooted in fear and in ideas of death and destruction, often with a deep belief that occurrence is very rare. This position feeds refusal and repression. When a strong earthquake occurs, surprise immediately changes into shock and desperation. A

  4. A 30-year history of earthquake crisis communication in California and lessons for the future

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, concerning the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the following 30 years, publication of aftershock advisories became routine, and formal statements about the probability of a larger event were developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting, with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience, starting from the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but a statement was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (the National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response, and compare these with social science research on successful crisis communication, to create recommendations for future advisories.

  5. Moment Magnitudes and Local Magnitudes for Small Earthquakes: Implications for Ground-Motion Prediction and b-values

    NASA Astrophysics Data System (ADS)

    Baltay, A.; Hanks, T. C.; Vernon, F.

    2016-12-01

    We illustrate two essential consequences of the systematic difference between moment magnitude and local magnitude for small earthquakes, illuminating the underlying earthquake physics. Moment magnitude, M ∝ (2/3) log M0, is uniformly valid for all earthquake sizes [Hanks and Kanamori, 1979]. However, the relationship between local magnitude ML and moment is itself magnitude dependent. For moderate events, 3 < M < 7, M and ML are coincident; for earthquakes smaller than M3, ML ∝ log M0 [Hanks and Boore, 1984]. This is a consequence of the saturation of the apparent corner frequency fc as it becomes greater than the largest observable frequency, fmax; in this regime, stress drop no longer controls ground motion. This implies that ML and M differ by a factor of 1.5 for these small events. While this idea is not new, its implications are important as more small-magnitude data are incorporated into earthquake hazard research. With a large dataset of M < 3 earthquakes recorded on the ANZA network, we demonstrate striking consequences of the difference between M and ML. ML scales as the log of peak ground motion (e.g., PGA or PGV) for these small earthquakes, which yields log PGA ∝ log M0 [Boore, 1986]. We plot nearly 15,000 records of PGA and PGV at close stations, adjusted for site conditions and for geometrical spreading to 10 km. The slope of the log of ground motion is 1.0*ML, or 1.5*M, confirming the relationship, and that fc >> fmax. Just as importantly, if this relation is overlooked, prediction of large-magnitude ground motion from small earthquakes will be misguided. We also consider the effect of this magnitude-scale difference on b-value. The oft-cited b-value of 1 should hold for small magnitudes, given M. Use of ML necessitates b = 2/3 for the same data set; use of mixed, or unknown, magnitudes complicates the matter further. This is of particular import when estimating the rate of large earthquakes when one has limited data on their recurrence, as is the case for
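The magnitude relations above can be made concrete with the standard moment-magnitude definition, M = (2/3)(log10 M0 - 9.1) for M0 in N·m [after Hanks and Kanamori, 1979]. The event size in this sketch is made up for illustration:

```python
import math

def moment_magnitude(m0: float) -> float:
    """Hanks & Kanamori (1979) moment magnitude; M0 in newton-meters."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# A tenfold increase in seismic moment raises M by only 2/3 of a unit,
# while for small (M < 3) events ML ~ log10(M0) rises by a full unit:
# this is the factor-of-1.5 difference discussed in the abstract.
m0_small = 1.0e12  # hypothetical small event, roughly M 1.9
dM = moment_magnitude(10 * m0_small) - moment_magnitude(m0_small)
print(round(dM, 3))  # 0.667
```

The same factor carries over to the b-value: a frequency-magnitude slope of b = 1 in M maps to b = 2/3 in ML for these small events, as the abstract notes.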

  6. VAN method of short-term earthquake prediction shows promise

    NASA Astrophysics Data System (ADS)

    Uyeda, Seiya

    Although optimism prevailed in the 1970s, the present consensus on earthquake prediction appears to be quite pessimistic. However, short-term prediction based on geoelectric potential monitoring has stood the test of time in Greece for more than a decade [Varotsos and Kulhanek, 1993; Lighthill, 1996]. The method used is called the VAN method. The geoelectric potential changes constantly due to causes such as magnetotelluric effects, lightning, rainfall, leakage from manmade sources, and electrochemical instabilities of electrodes. All of this noise must be eliminated before preseismic signals are identified, if they exist at all. The VAN group apparently accomplished this task for the first time. They installed multiple short (100-200 m) dipoles with different lengths in both north-south and east-west directions, and long (1-10 km) dipoles in appropriate orientations, at their stations (one of their mega-stations, Ioannina, for example, now has 137 dipoles in operation) and found that practically all of the noise could be eliminated by applying a set of criteria to the data.

  7. The effects of earthquake measurement concepts and magnitude anchoring on individuals' perceptions of earthquake risk

    USGS Publications Warehouse

    Celsi, R.; Wolfinbarger, M.; Wald, D.

    2005-01-01

    The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree that earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experiences of EQ events are generally overestimated relative to experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience towards the reported earthquake magnitude size. The anchoring effect is moderated by the degree that individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.

  8. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  9. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-02

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.

  10. Magnitude Estimation for the 2011 Tohoku-Oki Earthquake Based on Ground Motion Prediction Equations

    NASA Astrophysics Data System (ADS)

    Eshaghi, Attieh; Tiampo, Kristy F.; Ghofrani, Hadi; Atkinson, Gail M.

    2015-08-01

    This study investigates whether real-time strong ground motion data from seismic stations could have been used to provide an accurate estimate of the magnitude of the 2011 Tohoku-Oki earthquake in Japan. Ultimately, such an estimate could be used as input data for a tsunami forecast and would lead to more robust earthquake and tsunami early warning. We collected the strong motion accelerograms recorded by borehole and free-field (surface) Kiban Kyoshin network stations that registered this mega-thrust earthquake in order to perform an off-line test to estimate the magnitude based on ground motion prediction equations (GMPEs). GMPEs for peak ground acceleration (PGA) and peak ground velocity (PGV) from a previous study by Eshaghi et al. (Bulletin of the Seismological Society of America, 2013), derived using events with moment magnitude (M) ≥ 5.0 from 1998-2010, were used to estimate the magnitude of this event. We developed new GMPEs using a more complete database (1998-2011), which added only 1 year but approximately twice as much data to the initial catalog (including important large events), to improve the determination of attenuation parameters and magnitude scaling. These new GMPEs were used to estimate the magnitude of the Tohoku-Oki event. The estimates obtained were compared with real-time magnitude estimates provided by the existing earthquake early warning system in Japan. Unlike the current operational magnitude estimation methods, our method did not saturate and can provide robust estimates of moment magnitude within ~100 s after earthquake onset for both catalogs. It was found that correcting for average shear-wave velocity in the uppermost 30 m (VS30) improved the accuracy of magnitude estimates from surface recordings, particularly for magnitude estimates based on PGV (Mpgv). The new GMPEs also were used to estimate the magnitude of all earthquakes in the new catalog with at least 20 records. Results show that the magnitude estimate from PGV values using
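
    The off-line estimation idea amounts to inverting a GMPE for magnitude at each station and averaging. The functional form and coefficients below are invented for illustration, not those of the study's GMPEs:

```python
import math

# Illustrative GMPE of the form log10(PGA) = c1 + c2*M - c3*log10(R).
# These coefficients are made up for the sketch.
C1, C2, C3 = -1.5, 0.5, 1.3

def predicted_log_pga(mag, dist_km):
    return C1 + C2 * mag - C3 * math.log10(dist_km)

def estimate_magnitude(records):
    """Invert the GMPE for M at each station and average the per-station
    estimates. records: list of (pga, dist_km) tuples."""
    ests = [(math.log10(pga) - C1 + C3 * math.log10(r)) / C2 for pga, r in records]
    return sum(ests) / len(ests)

# Synthetic records generated from the same GMPE for an M 9.0 event;
# the inversion recovers the input magnitude and, unlike a saturating
# magnitude scale, has no built-in ceiling.
stations = [(10 ** predicted_log_pga(9.0, r), r) for r in (80.0, 120.0, 200.0)]
print(estimate_magnitude(stations))
```

    In practice the records would also be corrected for site conditions (e.g., VS30) before inversion, which is the correction the abstract reports as improving accuracy.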

  11. Statistical validation of earthquake related observations

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into inflicted estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, in prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate" whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions, and therefore the scorecard of any "established precursor/signal," represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
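
    Steps (i)-(iv) define a Monte Carlo baseline that any precursor must beat. A minimal sketch, assuming a uniform draw over catalog locations under the null hypothesis (sector counts and alarm fraction are arbitrary illustrations):

```python
import random

random.seed(42)

# (i) Roulette wheel: one sector per earthquake location in a sample catalog.
n_sectors = 1000
# (ii) The bet: sectors whose locations fall inside the alarm area
# (here an arbitrary 10% of the wheel).
alarm = set(random.sample(range(n_sectors), 100))

# (iii)-(iv) Nature turns the wheel repeatedly. Under the null hypothesis a
# target event falls on a uniformly chosen sector, so the expected hit rate
# equals the alarm fraction; a genuine precursor must beat this baseline
# with statistical significance.
n_events = 100_000
hits = sum(1 for _ in range(n_events) if random.randrange(n_sectors) in alarm)
hit_rate = hits / n_events
print(hit_rate)  # close to the alarm fraction, 0.10
```

    Because the wheel has one sector per catalog location, the null automatically reflects the empirical spatial distribution of seismicity rather than a uniform map area.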

  12. Laboratory constraints on models of earthquake recurrence

    USGS Publications Warehouse

    Beeler, Nicholas M.; Tullis, Terry; Junger, Jenni; Kilgore, Brian D.; Goldsby, David L.

    2014-01-01

    In this study, rock friction ‘stick-slip’ experiments are used to develop constraints on models of earthquake recurrence. Constant-rate loading of bare rock surfaces in high quality experiments produces stick-slip recurrence that is periodic at least to second order. When the loading rate is varied, recurrence is approximately inversely proportional to loading rate. These laboratory events initiate due to a slip rate-dependent process that also determines the size of the stress drop [Dieterich, 1979; Ruina, 1983] and as a consequence, stress drop varies weakly but systematically with loading rate [e.g., Gu and Wong, 1991; Karner and Marone, 2000; McLaskey et al., 2012]. This is especially evident in experiments where the loading rate is changed by orders of magnitude, as is thought to be the loading condition of naturally occurring, small repeating earthquakes driven by afterslip, or low-frequency earthquakes loaded by episodic slip. As follows from the previous studies referred to above, experimentally observed stress drops are well described by a logarithmic dependence on recurrence interval that can be cast as a non-linear slip-predictable model. The fault’s rate dependence of strength is the key physical parameter. Additionally, even at constant loading rate the most reproducible laboratory recurrence is not exactly periodic, unlike existing friction recurrence models. We present example laboratory catalogs that document the variance and show that in large catalogs, even at constant loading rate, stress drop and recurrence co-vary systematically. The origin of this covariance is largely consistent with variability of the dependence of fault strength on slip rate. Laboratory catalogs show aspects of both slip and time predictability and successive stress drops are strongly correlated indicating a ‘memory’ of prior slip history that extends over at least one recurrence cycle.

  13. Earthquake Predictability: Results From Aggregating Seismicity Data And Assessment Of Theoretical Individual Cases Via Synthetic Data

    NASA Astrophysics Data System (ADS)

    Adamaki, A.; Roberts, R.

    2016-12-01

    For many years an important aim in seismological studies has been forecasting the occurrence of large earthquakes. Despite some well-established statistical behavior of earthquake sequences, expressed by e.g. the Omori law for aftershock sequences and the Gutenberg-Richter distribution of event magnitudes, purely statistical approaches to short-term earthquake prediction have in general not been successful. It seems that better understanding of the processes leading to critical stress build-up prior to larger events is necessary to identify useful precursory activity, if this exists, and statistical analyses are an important tool in this context. There has been considerable debate on the usefulness or otherwise of foreshock studies for short-term earthquake prediction. We investigate generic patterns of foreshock activity using aggregated data and by studying not only strong but also moderate magnitude events. Aggregating empirical local seismicity time series prior to larger events observed in and around Greece reveals a statistically significant increasing rate of seismicity over 20 days prior to M>3.5 earthquakes. This increase cannot be explained by spatio-temporal clustering models such as ETAS, implying genuine changes in the mechanical situation just prior to larger events and thus the possible existence of useful precursory information. Because of spatio-temporal clustering, which includes aftershocks of foreshocks, even if such generic behavior exists it does not necessarily follow that foreshocks have the potential to provide useful precursory information for individual larger events. Using synthetic catalogs produced with different clustering models and different presumed system sensitivities, we are now investigating to what extent the apparently established generic foreshock rate acceleration may or may not imply that foreshocks have potential in the context of routine forecasting of larger events. 
    Preliminary results suggest that this is the case, but

  14. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    PubMed

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  15. Ground Motion Prediction for M7+ scenarios on the San Andreas Fault using the Virtual Earthquake Approach

    NASA Astrophysics Data System (ADS)

    Denolle, M.; Dunham, E. M.; Prieto, G.; Beroza, G. C.

    2013-05-01

    There is no clearer example of the increase in hazard due to prolonged and amplified shaking in sedimentary basins than the case of Mexico City in the 1985 Michoacan earthquake. It is critically important to identify what other cities might be susceptible to similar basin amplification effects. Physics-based simulations in 3D crustal structure can be used to model and anticipate those effects, but they rely on our knowledge of the complexity of the medium. We propose a parallel approach to validate ground motion simulations using the ambient seismic field. We compute the Earth's impulse response by combining the ambient seismic field and coda waves, enforcing causality and symmetry constraints. We correct the surface impulse responses to account for the source depth, mechanism, and duration using a 1D approximation of the local surface-wave excitation. We call the new responses virtual earthquakes. We validate the ground motion predicted from the virtual earthquakes against moderate earthquakes in southern California. We then combine temporary seismic stations on the southern San Andreas Fault and extend the point-source approximation of the Virtual Earthquake Approach to model finite kinematic ruptures. We confirm the coupling between source directivity and amplification in downtown Los Angeles seen in simulations.

  16. Report of Earthquake Drills with Experiences of Ground Motion in Childcare for Young Children, Japan

    NASA Astrophysics Data System (ADS)

    Yamada, N.

    2013-12-01

    After the Great East Japan Earthquake of 2011, the disaster became an opportunity to raise awareness of earthquake and tsunami disaster prevention, and improvement of disaster prevention education has been emphasized. These influences extend spatially across Japan, and it is also important to develop education that continues over time. Although fire and earthquake drills are common forms of disaster prevention education in Japan, the children and teachers often simply move from the school building to the outside, and the shortness of the time spent on the drill often attracts attention. Complementary practical education in cooperation with experts such as firefighters is practiced, but verification of its effects is insufficient, and research on it has not advanced either. Although improvement and development of disaster prevention education are expected in the future, many problems remain. Our target is the construction and use of material contributing to education about behavior during strong motion in an earthquake, which may be experienced anywhere in Japan. One of our products is a handmade shaking table used as a teaching tool for learning how to protect the body from injury during strong motion. This design emphasizes simplicity over high-fidelity reproduction of earthquake ground motions. We aim to support disaster prevention education not only for young children but also for school staff and parents. In this report, focusing on how to avoid injury during earthquake ground motion and adopting play activities, we show an example framework for earthquake disaster prevention childcare through virtual experience. 
    This presentation includes a discussion as a practice study with

  17. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  18. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    USGS Publications Warehouse

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20  km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
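
    The residual partitioning described above can be illustrated with synthetic data: subtracting repeatable group means (a crude stand-in for the mixed-effects regression used in practice) leaves a smaller aleatory standard deviation, mirroring the reported reduction from 0.97 to 0.44. All numbers below are invented for the sketch.

```python
import random
import statistics

random.seed(0)
n_events, n_sites = 50, 20

# Synthetic ln-residuals: a repeatable event term, a repeatable site term,
# and aleatory noise. The standard deviations (0.5, 0.5, 0.3) are arbitrary.
event_term = [random.gauss(0, 0.5) for _ in range(n_events)]
site_term = [random.gauss(0, 0.5) for _ in range(n_sites)]
records = [(e, s, event_term[e] + site_term[s] + random.gauss(0, 0.3))
           for e in range(n_events) for s in range(n_sites)]

# Estimate and remove the repeatable terms as group means.
ev_mean = {e: statistics.mean(r for ee, s, r in records if ee == e)
           for e in range(n_events)}
resid1 = [(e, s, r - ev_mean[e]) for e, s, r in records]
st_mean = {s: statistics.mean(r for e, ss, r in resid1 if ss == s)
           for s in range(n_sites)}
resid2 = [r - st_mean[s] for e, s, r in resid1]

total_sd = statistics.pstdev([r for _, _, r in records])
aleatory_sd = statistics.pstdev(resid2)
print(total_sd, aleatory_sd)  # remaining sd approaches the 0.3 noise level
```

    The removed group means are the "repeatable" terms of the abstract; in a nonergodic application they would be predicted a priori from the independent seismological observables discussed there.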

  19. Volunteers in the earthquake hazard reduction program

    USGS Publications Warehouse

    Ward, P.L.

    1978-01-01

    With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers. 

  20. Thermal Infrared Anomalies of Several Strong Earthquakes

    PubMed Central

    Wei, Congxin; Guo, Xiao; Qin, Manzhong

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes of magnitude Ms 7.0 and above, using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall behavior of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of “time-frequency relative power spectrum.” (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in anomaly duration, range, and morphology. In summary, earthquake thermal infrared anomalies may serve as useful earthquake precursors in earthquake prediction and forecasting. PMID:24222728

  1. Thermal infrared anomalies of several strong earthquakes.

    PubMed

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes of magnitude Ms 7.0 and above, using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall behavior of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in anomaly duration, range, and morphology. In summary, earthquake thermal infrared anomalies may serve as useful earthquake precursors in earthquake prediction and forecasting.

  2. Local earthquake interferometry of the IRIS Community Wavefield Experiment, Grant County, Oklahoma

    NASA Astrophysics Data System (ADS)

    Eddy, A. C.; Harder, S. H.

    2017-12-01

    The IRIS Community Wavefield Experiment was deployed in Grant County, located in north central Oklahoma, from June 21 to July 27, 2016. Data from all nodes were recorded at 250 samples per second between June 21 and July 20 along three lines. The main line was 12.5 km long, oriented east-west, and consisted of 129 nodes. The other two lines were 5.5 km long, oriented north-south, with 49 nodes each. During this time, approximately 150 earthquakes of magnitude 1.0 to 4.4 were recorded in the surrounding counties of Oklahoma and Kansas. Ideally, sources for local earthquake interferometry should be near-surface events that produce high frequency body waves. Unlike ambient noise seismic interferometry (ANSI), which uses days, weeks, or even months of continuously recorded seismic data, local earthquake interferometry uses only short segments (~2 min) of data. Interferometry in this case is based on the cross-correlation of body wave surface multiples, where the event source is translated to a reference station in the array, which acts as a virtual source. Multiples recorded between the reference station and all other stations can be cross-correlated to produce a clear seismic trace. This process will be repeated with every node acting as the reference station for all events. The resulting shot gather will then be processed and analyzed for quality and accuracy. Successful application of local earthquake interferometry will produce a crustal image with identifiable sedimentary and basement reflectors and possibly a Moho reflection. Economically, local earthquake interferometry could lower the time and resource cost of active and passive seismic surveys while improving subsurface image quality in urban settings or areas of limited access. The applications of this method can potentially be expanded with the inclusion of seismic events with a magnitude of 1.0 or lower.
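
    The virtual-source idea rests on cross-correlation lag picking: aligning two stations' records of the same arrival recovers the inter-station travel time. A minimal synthetic sketch at the experiment's 250 samples/s rate (the wavelet, delay, and geometry are invented):

```python
import math

# Two stations record the same transient wavelet, delayed 0.3 s at the far
# station; the cross-correlation peak recovers that delay, turning the
# reference station into a virtual source.
dt = 0.004            # 250 samples per second, as in the experiment
n = 500               # 2 s of data ("short segments")

def wavelet(t0):
    """Gaussian pulse centred at t0 seconds."""
    return [math.exp(-((i * dt - t0) / 0.05) ** 2) for i in range(n)]

ref = wavelet(0.4)    # reference (virtual-source) station
far = wavelet(0.7)    # station farther along the propagation path

def xcorr_lag(a, b):
    """Lag (in seconds) that maximises the cross-correlation of a and b."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-n + 1, n):
        v = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag * dt

print(xcorr_lag(ref, far))  # ~0.3 s inter-station travel time
```

    Repeating this with every node as the reference station, and stacking over events, is what builds the shot gathers described above.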

  3. Disaster nursing experiences of Chinese nurses responding to the Sichuan Ya'an earthquake.

    PubMed

    Li, Y H; Li, S J; Chen, S H; Xie, X P; Song, Y Q; Jin, Z H; Zheng, X Y

    2017-06-01

    The aim of this study was to investigate the disaster experiences of nurses called to assist survivors one month after the 2013 Ya'an earthquake. China has experienced an increasing number of earthquake disasters in the past four decades. Although a health and disaster management system was initiated after the 2008 Wenchuan earthquake, nurses' roles and experiences in a disaster have been overlooked. The researchers used a qualitative descriptive design with 16 participants. Data were collected using semi-structured interviews and observation notes, after which a qualitative content analysis was conducted. Three major themes emerged: the process of being dispatched from hospitals to the disaster zone, the effort involved in getting to and working in the affected site, and reflecting on the challenges they encountered. About half of the participants had received disaster nursing training before deploying to the disaster site, but they consistently expressed a lack of physical and psychological preparedness regarding the process of being dispatched from their hospitals to the disaster zone. This was a single-incident experience, and caution should be taken when trying to extend the findings to other parts of China. These findings highlight the need for disaster in-service training as well as for having disaster plans in place. Hospital and nursing leaders should provide disaster training opportunities that include topics such as compiling resource inventories, formulating disaster drills and simulations, managing emergencies, and using emergency communication methods. Health policy-makers should prioritize capacity-building training for front-line nurses as well as develop and implement disaster management plans to better prepare nurses for future disasters. © 2016 International Council of Nurses.

  4. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services. 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U. S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  5. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Lu, Yi

    2018-01-01

The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data along the Longmenshan thrust fault three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors at eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. The study concluded that researchers need to pay more attention to children and adolescents, and that the government should pay more attention to these people and provide more economic support.

  6. Schoolteachers' Traumatic Experiences and Responses in the Context of a Large-Scale Earthquake in China

    ERIC Educational Resources Information Center

    Lei, B.

    2017-01-01

    This article investigates the traumatic experience of teachers who experienced the 2008 earthquake in Sichuan, China. A survey measuring participants' personal experiences, professional demands, and psychological responses was distributed to 241 teachers in five selected schools. Although the status of schoolteachers' trauma in a postdisaster…

  7. Adaptive vibration control of structures under earthquakes

    NASA Astrophysics Data System (ADS)

    Lew, Jiann-Shiun; Juang, Jer-Nan; Loh, Chin-Hsiung

    2017-04-01

    techniques, for structural vibration suppression under earthquakes. Various control strategies have been developed to protect structures from natural hazards and improve the comfort of occupants in buildings. However, there has been little development of adaptive building control with the integration of real-time system identification and control design. Generalized predictive control, which combines the process of real-time system identification and the process of predictive control design, has received widespread acceptance and has been successfully applied to various test-beds. This paper presents a formulation of the predictive control scheme for adaptive vibration control of structures under earthquakes. Comprehensive simulations are performed to demonstrate and validate the proposed adaptive control technique for earthquake-induced vibration of a building.
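    As a concrete illustration of the identification half of such a scheme, the sketch below runs recursive least squares (RLS), the real-time system identification step that generalized predictive control builds on, against a simulated first-order structural response model. The model, excitation signal, and gains are illustrative assumptions, not taken from the paper.

    ```python
    # Minimal RLS sketch: identify y[k] = a*y[k-1] + b*u[k-1] online,
    # as the system-identification stage of a predictive controller would.
    # All parameters here are hypothetical for illustration.

    def rls_identify(y, u, lam=0.99):
        """Return RLS estimates of (a, b) from response y and input u."""
        theta = [0.0, 0.0]                    # parameter estimates [a, b]
        P = [[1e3, 0.0], [0.0, 1e3]]          # covariance matrix
        for k in range(1, len(y)):
            phi = [y[k - 1], u[k - 1]]        # regressor vector
            # gain K = P*phi / (lam + phi' P phi)
            Pphi = [P[0][0]*phi[0] + P[0][1]*phi[1],
                    P[1][0]*phi[0] + P[1][1]*phi[1]]
            denom = lam + phi[0]*Pphi[0] + phi[1]*Pphi[1]
            K = [Pphi[0]/denom, Pphi[1]/denom]
            err = y[k] - (theta[0]*phi[0] + theta[1]*phi[1])
            theta = [theta[0] + K[0]*err, theta[1] + K[1]*err]
            # covariance update: P = (P - K*phi'P) / lam
            P = [[(P[i][j] - K[i]*Pphi[j]) / lam for j in range(2)]
                 for i in range(2)]
        return theta

    # Simulate a known system and recover its parameters.
    a_true, b_true = 0.8, 0.5
    u = [1.0 if k % 7 < 3 else -1.0 for k in range(200)]  # exciting input
    y = [0.0]
    for k in range(1, 200):
        y.append(a_true * y[k - 1] + b_true * u[k - 1])

    a_est, b_est = rls_identify(y, u)
    ```

    In a full GPC loop, the estimates would be refreshed at each time step and fed into the predictive control design.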

  8. Magnitude Estimation for Large Earthquakes from Borehole Recordings

    NASA Astrophysics Data System (ADS)

    Eshaghi, A.; Tiampo, K. F.; Ghofrani, H.; Atkinson, G.

    2012-12-01

We present a simple and fast magnitude determination method for earthquake and tsunami early warning systems, based on strong ground motion prediction equations (GMPEs) in Japan. This method incorporates borehole strong motion records provided by the Kiban Kyoshin network (KiK-net) stations. We analyzed strong ground motion data from large magnitude earthquakes (5.0 ≤ M ≤ 8.1) with focal depths < 50 km and epicentral distances of up to 400 km from 1996 to 2010. Using both peak ground acceleration (PGA) and peak ground velocity (PGV) we derived GMPEs in Japan. These GMPEs are used as the basis for regional magnitude determination. Predicted magnitudes from PGA values (Mpga) and predicted magnitudes from PGV values (Mpgv) were defined. Mpga and Mpgv strongly correlate with the moment magnitude of the event, provided sufficient records for each event are available. The results show that Mpgv has a smaller standard deviation than Mpga when compared with the estimated magnitudes, and provides a more accurate early assessment of earthquake magnitude. We test this new method by estimating the magnitude of the 2011 Tohoku earthquake and present the results of this estimation. PGA and PGV from borehole recordings allow us to estimate the magnitude of this event 156 s and 105 s after the earthquake onset, respectively. We demonstrate that incorporating borehole strong ground-motion records immediately available after the occurrence of large earthquakes significantly increases the accuracy of earthquake magnitude estimation and thereby improves the performance of earthquake and tsunami early warning systems. (Figure: moment magnitude versus predicted magnitude, Mpga and Mpgv.)
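    A minimal sketch of the magnitude-inversion idea, assuming a generic GMPE of the form log10(PGV) = c0 + c1·M + c2·log10(R): each station's observed PGV and distance yield a single-station magnitude estimate, and estimates are averaged across stations. The coefficients below are placeholders, not the KiK-net-derived values from this study.

    ```python
    import math

    # Hypothetical GMPE coefficients for illustration only.
    C0, C1, C2 = -1.0, 0.5, -1.3

    def magnitude_from_pgv(pgv_cm_s: float, dist_km: float) -> float:
        """Single-station magnitude estimate (Mpgv) by inverting the GMPE."""
        return (math.log10(pgv_cm_s) - C0 - C2 * math.log10(dist_km)) / C1

    def event_magnitude(observations) -> float:
        """Average Mpgv over (pgv, distance) pairs from multiple stations."""
        estimates = [magnitude_from_pgv(pgv, r) for pgv, r in observations]
        return sum(estimates) / len(estimates)
    ```

    In an early-warning setting, `event_magnitude` would be re-evaluated as each new station report arrives, so the estimate sharpens with time after onset.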

  9. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Edward; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  10. a Collaborative Cyberinfrastructure for Earthquake Seismology

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure benefits from the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second-ranked global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and, in some cases, map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground motion records contributed by volunteers, and are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this
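    The flashsourcing idea, detecting felt earthquakes from a sudden surge of geolocated website visits, can be sketched as a simple rate-anomaly check. The thresholds and the region-keyed baseline below are assumptions for illustration, not EMSC's actual detection parameters.

    ```python
    from collections import Counter

    def detect_felt_event(visits, baseline_rate, factor=10, min_hits=20):
        """Flag regions with an anomalous surge of visits.

        visits: list of (region, timestamp) pairs in the current time window.
        baseline_rate: typical visits per window for each region.
        A region is flagged when its count exceeds `factor` times its
        baseline and an absolute floor of `min_hits` visits.
        """
        counts = Counter(region for region, _ in visits)
        return sorted(
            region for region, n in counts.items()
            if n >= min_hits and n > factor * baseline_rate.get(region, 0.0)
        )
    ```

    In production, the regions would come from IP geolocation and the window would slide in near-real time; the same counts, inverted, can hint at damaged areas through the loss of expected visitors.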

  11. Prediction of maximum earthquake intensities for the San Francisco Bay region

    USGS Publications Warehouse

    Borcherdt, Roger D.; Gibbs, James F.

    1975-01-01

The intensity data for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan Formation is: Intensity = 2.69 - 1.90 log (Distance) (km). For sites on other geologic units intensity increments, derived with respect to this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is: Intensity Increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan Formation, 0.64 for the Great Valley Sequence, 0.82 for Santa Clara Formation, 1.34 for alluvium, 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
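    The relations quoted above can be transcribed directly; the sketch below treats the published average intensity increments as additive corrections to the base 1906 relation, which is how the study uses them to map predicted maximum intensity. This is a literal transcription for illustration, not the authors' code.

    ```python
    import math

    # Average intensity increments by geologic unit (Borcherdt & Gibbs, 1975).
    INTENSITY_INCREMENTS = {
        "granite": -0.29,
        "Franciscan Formation": 0.19,
        "Great Valley Sequence": 0.64,
        "Santa Clara Formation": 0.82,
        "alluvium": 1.34,
        "bay mud": 2.43,
    }

    def predicted_intensity(distance_km: float, unit: str) -> float:
        """Base 1906 relation plus the average increment for the unit.

        Distance is measured perpendicular to the fault, in km.
        """
        base = 2.69 - 1.90 * math.log10(distance_km)
        return base + INTENSITY_INCREMENTS[unit]

    def increment_from_ahsa(ahsa: float) -> float:
        """Intensity increment from Average Horizontal Spectral Amplification."""
        return 0.27 + 2.70 * math.log10(ahsa)
    ```

    For example, at the same 10 km from the fault, a bay-mud site is predicted to experience roughly 2.7 intensity units more shaking than a granite site, illustrating how strongly the geologic unit controls the predicted intensity.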

  12. ShakeMap-based prediction of earthquake-induced mass movements in Switzerland calibrated on historical observations

    USGS Publications Warehouse

    Cauzzi, Carlo; Fah, Donat; Wald, David J.; Clinton, John; Losey, Stephane; Wiemer, Stefan

    2018-01-01

In Switzerland, nearly all historical Mw ~ 6 earthquakes have induced damaging landslides, rockslides and snow avalanches that, in some cases, also resulted in damage to infrastructure and loss of lives. We describe the customisation to Swiss conditions of a globally calibrated statistical approach originally developed to rapidly assess earthquake-induced landslide likelihoods worldwide. The probability of occurrence of such earthquake-induced effects is modelled through a set of geospatial susceptibility proxies and peak ground acceleration. The predictive model is tuned to capture the observations from past events and optimised for near-real-time estimates based on USGS-style ShakeMaps routinely produced by the Swiss Seismological Service. Our emphasis is on the use of high-resolution geospatial datasets along with additional local information on ground failure susceptibility. Although calibrated on historical events with moderate magnitudes, the methodology presented in this paper also yields sensible results for recent low-magnitude events. The model is integrated in the Swiss ShakeMap framework. This study has high practical relevance to many Swiss ShakeMap stakeholders, especially those managing lifeline systems, and to other global users interested in conducting a similar customisation for their region of interest.

  13. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Aci, M.

    2014-12-01

Many destructive earthquakes occurred in Indonesia during the last decade. These experiences are very important lessons for people around the world who live in earthquake and tsunami countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and to characterize the physical behavior of tsunamis near the coast. We studied two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken by people at some locations during the 2004 Indian Ocean tsunami disaster; nevertheless, these covered only a few restricted points, and tsunami behavior elsewhere remained unknown. In this study, we tried to collect extensive information about tsunami behavior, not only at many locations but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photographs. To collect detailed information about the evacuation process, we devised an interview method that involves making pictures of the tsunami experience from the scenes of victims' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena: because there had been no big earthquakes with tsunamis in the Sumatra region for one hundred years, the public had no knowledge of tsunamis. This situation had greatly improved by the 2010 Mentawai case; TV programs and NGO or governmental public education programs about tsunami evacuation are now widespread in Indonesia, and many people have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on victims' stories, with painted scenes of impressive moments from the two events, and used it in a disaster education event for a school committee in West Java. About 80% of students and teachers evaluated the contents of the drill book as useful for correct understanding.

  14. Geodetic Finite-Fault-based Earthquake Early Warning Performance for Great Earthquakes Worldwide

    NASA Astrophysics Data System (ADS)

    Ruhl, C. J.; Melgar, D.; Grapenthin, R.; Allen, R. M.

    2017-12-01

GNSS-based earthquake early warning (EEW) algorithms estimate fault finiteness and unsaturated moment magnitude for the largest, most damaging earthquakes. Because large events are infrequent, algorithms are not regularly exercised and are insufficiently tested on the few available datasets. The Geodetic Alarm System (G-larmS) is a GNSS-based finite-fault algorithm developed as part of the ShakeAlert EEW system in the western US. Performance evaluations using synthetic earthquakes offshore Cascadia showed that G-larmS satisfactorily recovers magnitude and fault length, providing useful alerts 30-40 s after origin time and timely warnings of ground motion for onshore urban areas. An end-to-end test of the ShakeAlert system demonstrated the need for GNSS data to accurately estimate ground motions in real time. We replay real data from several subduction-zone earthquakes worldwide to demonstrate the value of GNSS-based EEW for the largest, most damaging events. We compare peak ground acceleration (PGA) predicted from first-alert solutions with that recorded in major urban areas. In addition, where applicable, we compare observed tsunami heights to those predicted from the G-larmS solutions. We show that finite-fault inversion based on GNSS data is essential to achieving the goals of EEW.

  15. Bio-Mimetics of Disaster Anticipation-Learning Experience and Key-Challenges.

    PubMed

    Tributsch, Helmut

    2013-03-19

Anomalies in animal behavior and meteorological phenomena before major earthquakes have been reported throughout history. Bio-mimetics or bionics aims at learning disaster anticipation from animals. Since modern science is reluctant to address this problem, an effort has been made to track down the knowledge available to ancient natural philosophers. Starting with an archaeologically documented human sacrifice around 1700 B.C. during the Minoan civilization immediately before a large earthquake, which killed the participants, earthquake prediction knowledge throughout antiquity is evaluated. Major practical experience with this phenomenon was gained from a Chinese earthquake prediction initiative nearly half a century ago. Some quakes, like that of Haicheng, were recognized in advance. However, the destructive Tangshan earthquake was not predicted, which was interpreted as an inherent failure of prediction based on animal phenomena. This is contradicted on the basis of reliable Chinese documentation provided by the responsible earthquake study commission. The Tangshan earthquake was preceded by more than 2,000 reported animal anomalies, some of which were of a very dramatic nature. They are discussed here. Any physical phenomenon which may cause animal unrest must involve energy turnover before the main earthquake event. The final product of any energy turnover, however, is heat. Satellite-based infrared measurements have indeed identified significant thermal anomalies before major earthquakes. One of these cases, occurring during the 2001 Bhuj earthquake in Gujarat, India, is analyzed together with parallel animal anomalies observed in the Gir national park. It is suggested that the time window is identical and that both phenomena have the same geophysical origin. It therefore remains to be demonstrated that energy can be released locally before major earthquake events. It is shown that by considering appropriate geophysical feedback processes, this is

  16. Comparisons of ground motions from five aftershocks of the 1999 Chi-Chi, Taiwan, earthquake with empirical predictions largely based on data from California

    USGS Publications Warehouse

    Wang, G.-Q.; Boore, D.M.; Igel, H.; Zhou, X.-Y.

    2004-01-01

The observed ground motions from five large aftershocks of the 1999 Chi-Chi, Taiwan, earthquake are compared with predictions from four equations based primarily on data from California. The four equations for active tectonic regions are those developed by Abrahamson and Silva (1997), Boore et al. (1997), Campbell (1997, 2001), and Sadigh et al. (1997). Comparisons are made for horizontal-component peak ground accelerations and 5%-damped pseudoacceleration response spectra at periods between 0.02 sec and 5 sec. The observed motions are in reasonable agreement with the predictions, particularly for distances from 10 to 30 km. This is in marked contrast to the motions from the Chi-Chi mainshock, which are much lower than the predicted motions for periods less than about 1 sec. The results indicate that the low motions in the mainshock are not due to unusual, localized absorption of seismic energy, because waves from the mainshock and the aftershocks generally traverse the same section of the crust and are recorded at the same stations. The aftershock motions at distances of 30-60 km are somewhat lower than the predictions (though not by nearly as large a factor as the mainshock motions), suggesting that ground motion attenuates more rapidly in this region of Taiwan than in the areas we compare with it. We provide equations for the regional attenuation of response spectra, which show increasing decay of motion with distance for decreasing oscillator periods. This observational study also demonstrates that ground motions have large earthquake-location-dependent variability for a specific site. This variability reduces the accuracy with which site response can be predicted for a specific earthquake. Online Material: PGAs and PSAs from the 1999 Chi-Chi earthquake and five aftershocks.

  17. Statistical tests of simple earthquake cycle models

    NASA Astrophysics Data System (ADS)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM < 4.0 × 1019 Pa s and ηM > 4.6 × 1020 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
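    The two-sample Kolmogorov-Smirnov statistic at the heart of such a test, the maximum distance between two empirical CDFs, can be computed in a few lines. This is a generic illustration, not the study's code; a real analysis would also compute the p-value (e.g. with scipy.stats.ks_2samp) to test at α = 0.05.

    ```python
    import bisect

    def ks_statistic(sample1, sample2):
        """Maximum distance between the two empirical CDFs."""
        s1, s2 = sorted(sample1), sorted(sample2)

        def ecdf(sorted_sample, x):
            # fraction of sample points <= x
            return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

        # the supremum is attained at one of the observed data points
        return max(abs(ecdf(s1, x) - ecdf(s2, x))
                   for x in set(sample1) | set(sample2))
    ```

    Here one sample would be, say, observed geodetic-to-geologic slip-rate ratios across the faults, and the other the ratios predicted by a candidate viscoelastic cycle model; a large statistic flags the model as inconsistent with the observations.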

  18. Distribution and Characteristics of Repeating Earthquakes in Northern California

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Schaff, D. P.; Zechar, J. D.; Shaw, B. E.

    2012-12-01

    show burst-like behavior with mean recurrence times smaller than one month. 5% of the RES have mean recurrence times greater than one year and include more than 10 earthquakes. Earthquakes in the 50 most periodic sequences (CV<0.2) do not appear to be predictable by either time- or slip-predictable models, consistent with previous findings. We demonstrate that changes in recurrence intervals of repeating earthquakes can be routinely monitored. This is especially important for sequences with CV~0, as they may indicate changes in the loading rate. We also present results from retrospective forecast experiments based on near-real time hazard functions.

  19. CyberShake-derived ground-motion prediction models for the Los Angeles region with application to earthquake early warning

    USGS Publications Warehouse

    Bose, Maren; Graves, Robert; Gill, David; Callaghan, Scott; Maechling, Phillip J.

    2014-01-01

    Real-time applications such as earthquake early warning (EEW) typically use empirical ground-motion prediction equations (GMPEs) along with event magnitude and source-to-site distances to estimate expected shaking levels. In this simplified approach, effects due to finite-fault geometry, directivity and site and basin response are often generalized, which may lead to a significant under- or overestimation of shaking from large earthquakes (M > 6.5) in some locations. For enhanced site-specific ground-motion predictions considering 3-D wave-propagation effects, we develop support vector regression (SVR) models from the SCEC CyberShake low-frequency (<0.5 Hz) and broad-band (0–10 Hz) data sets. CyberShake encompasses 3-D wave-propagation simulations of >415 000 finite-fault rupture scenarios (6.5 ≤ M ≤ 8.5) for southern California defined in UCERF 2.0. We use CyberShake to demonstrate the application of synthetic waveform data to EEW as a ‘proof of concept’, being aware that these simulations are not yet fully validated and might not appropriately sample the range of rupture uncertainty. Our regression models predict the maximum and the temporal evolution of instrumental intensity (MMI) at 71 selected test sites using only the hypocentre, magnitude and rupture ratio, which characterizes uni- and bilateral rupture propagation. Our regression approach is completely data-driven (where here the CyberShake simulations are considered data) and does not enforce pre-defined functional forms or dependencies among input parameters. The models were established from a subset (∼20 per cent) of CyberShake simulations, but can explain MMI values of all >400 k rupture scenarios with a standard deviation of about 0.4 intensity units. We apply our models to determine threshold magnitudes (and warning times) for various active faults in southern California that earthquakes need to exceed to cause at least ‘moderate’, ‘strong’ or ‘very strong’ shaking

  20. Space technologies for short-term earthquake warning

    NASA Astrophysics Data System (ADS)

    Pulinets, S.

Recent theoretical and experimental studies have explicitly demonstrated the ability of space technologies to identify and monitor specific variations in near-earth space plasma, the atmosphere, and the ground surface associated with approaching severe earthquakes (named earthquake precursors), appearing several days (from 1 to 5) before the seismic shock over seismically active areas. Several countries and private companies are preparing (or have already launched) dedicated spacecraft for monitoring earthquake precursors from space and for short-term earthquake prediction. The present paper intends to outline the optimal algorithm for creation of a space-borne system for earthquake precursor monitoring and short-term earthquake prediction. It takes into account the following considerations: selection of the precursors in terms of priority, taking into account their statistical and physical parameters; configuration of the spacecraft payload; configuration of the satellite constellation (orbit selection, satellite distribution, operation schedule); and proposal of different options (cheap microsatellite or comprehensive multisatellite constellation). Given that the most promising precursors are ionospheric, special attention will be devoted to radiophysical techniques of ionosphere monitoring. The advantages and disadvantages of technologies such as vertical sounding, in-situ probes, ionosphere tomography, GPS TEC and GPS MET will be considered.

  1. Space technologies for short-term earthquake warning

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.

Recent theoretical and experimental studies have explicitly demonstrated the ability of space technologies to identify and monitor specific variations in near-earth space plasma, the atmosphere, and the ground surface associated with approaching severe earthquakes (named earthquake precursors), which appear several days (from 1 to 5) before the seismic shock over seismically active areas. Several countries and private companies are preparing (or have already launched) dedicated spacecraft for monitoring earthquake precursors from space and for short-term earthquake prediction. The present paper intends to outline the optimal algorithm for creation of a space-borne system for earthquake precursor monitoring and short-term earthquake prediction. It takes into account the following: selection of the precursors in terms of priority, considering their statistical and physical parameters; configuration of the spacecraft payload; configuration of the satellite constellation (orbit selection, satellite distribution, operation schedule); and different options for the satellite systems (cheap microsatellite or comprehensive multisatellite constellation). Given that the most promising precursors are ionospheric, special attention is devoted to radiophysical techniques of ionosphere monitoring. The advantages and disadvantages of technologies such as vertical sounding, in-situ probes, ionosphere tomography, GPS TEC and GPS MET are considered.

  2. Modified Mercalli Intensity for scenario earthquakes in Evansville, Indiana

    USGS Publications Warehouse

    Cramer, Chris; Haase, Jennifer; Boyd, Oliver

    2012-01-01

    Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the fact that Evansville is close to the Wabash Valley and New Madrid seismic zones, there is concern about the hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake. Earthquake-hazard maps provide one way of conveying such estimates of strong ground shaking and will help the region prepare for future earthquakes and reduce earthquake-caused losses.

  3. Coping with the challenges of early disaster response: 24 years of field hospital experience after earthquakes.

    PubMed

    Bar-On, Elhanan; Abargel, Avi; Peleg, Kobi; Kreiss, Yitshak

    2013-10-01

    To propose strategies and recommendations for future planning and deployment of field hospitals after earthquakes by comparing the experience of 4 field hospitals deployed by The Israel Defense Forces (IDF) Medical Corps in Armenia, Turkey, India and Haiti. Quantitative data regarding the earthquakes were collected from published sources; data regarding hospital activity were collected from IDF records; and qualitative information was obtained from structured interviews with key figures involved in the missions. The hospitals started operating between 89 and 262 hours after the earthquakes. Their sizes ranged from 25 to 72 beds, and their personnel numbered between 34 and 100. The number of patients treated varied from 1111 to 2400. The proportion of earthquake-related diagnoses ranged from 28% to 67% (P < .001), with hospitalization rates between 3% and 66% (P < .001) and surgical rates from 1% to 24% (P < .001). In spite of characteristic scenarios and injury patterns after earthquakes, patient caseload and treatment requirements varied widely. The variables affecting the patient profile most significantly were time until deployment, total number of injured, availability of adjacent medical facilities, and possibility of evacuation from the disaster area. When deploying a field hospital in the early phase after an earthquake, a wide variability in patient caseload should be anticipated. Customization is difficult due to the paucity of information. Therefore, early deployment necessitates full logistic self-sufficiency and operational versatility. Also, collaboration with local and international medical teams can greatly enhance treatment capabilities.

  4. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

The possibility of earthquake prediction on the time frame of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified some pre-earthquake events which are now recognized as confirmed warning signs (precursors) of past known earthquakes. With advances in in-situ measurement devices and data analysis capabilities, and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine work. Data providers are supplying researchers all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere, has frequently been reported as an earthquake precursor in past years. The stress accumulated in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, the leakage of radon gas, which occurs as rocks break during earthquake preparation, causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising method for future successful earthquake prediction, without proper and thorough knowledge of the geological setting, atmospheric factors, and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust.
Active faulting is a key factor in identification of the

  5. Variations in rupture process with recurrence interval in a repeated small earthquake

    USGS Publications Warehouse

    Vidale, J.E.; Ellsworth, W.L.; Cole, A.; Marone, Chris

    1994-01-01

In theory and in laboratory experiments, friction on sliding surfaces such as rock, glass and metal increases with time since the previous episode of slip. This time dependence is a central pillar of the friction laws widely used to model earthquake phenomena. On natural faults, other properties, such as rupture velocity, porosity and fluid pressure, may also vary with the recurrence interval. Eighteen repetitions of the same small earthquake, separated by intervals ranging from a few days to several years, allow us to test these laboratory predictions in situ. The events with the longest time since the previous earthquake tend to have about 15% larger seismic moment than those with the shortest intervals, although this trend is weak. In addition, the rupture durations of the events with the longest recurrence intervals are more than a factor of two shorter than for the events with the shortest intervals. Both decreased duration and increased friction are consistent with progressive fault healing during the time of stationary contact.

  6. From integrated observation of pre-earthquake signals towards physical-based forecasting: A prospective test experiment

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Pulinets, S. A.; Tramutoli, V.; Lee, L.; Liu, J. G.; Hattori, K.; Kafatos, M.

    2013-12-01

    We are conducting an integrated study involving multi-parameter observations over different seismo-tectonic regions in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several selected parameters, namely gas discharge, thermal infrared radiation, ionospheric electron concentration, and atmospheric temperature and humidity, which we suppose are associated with the earthquake preparation phase. We intended to test, in prospective mode, this set of geophysical measurements over different regions of active earthquakes and volcanoes. In 2012-13 we established a collaborative framework with the leading projects PRE-EARTHQUAKE (EU) and iSTEP3 (Taiwan) for coordinated measurements and prospective validation over seven test regions: Southern California (USA), Eastern Honshu (Japan), Italy, Turkey, Greece, Taiwan (ROC), Kamchatka and Sakhalin (Russia). The current experiment provided a 'stress test' opportunity to validate the physics-based approach in real time over regions of high seismicity. Our initial results are: (1) prospective tests have shown the presence, in real time, of anomalies in the atmosphere before most of the significant (M>5.5) earthquakes in all regions; (2) the false-alarm rate differs for each region, varying between 50% (Italy, Kamchatka and California) and 25% (Taiwan and Japan), with a significant reduction of false positives when at least two parameters are used together; (3) one of the most complex problems, which is still open, is the systematic collection and real-time integration of pre-earthquake observations. Our findings suggest that physics-based short-term forecasting is feasible, and more tests are needed. We discuss the physical concept we used, the future integration of data observations, and related developments.

  7. Long aftershock sequences within continents and implications for earthquake hazard assessment.

    PubMed

    Stein, Seth; Liu, Mian

    2009-11-05

    One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere.

  8. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake"

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses, and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would be calculated directly on the basis of indexed ground-motion levels and damage. The immediate improvement of a parametric insurance model over the existing one would be the elimination of claim processing.

  9. Earthquake warning system for Japan Railways’ bullet train; implications for disaster prevention in California

    USGS Publications Warehouse

    Nakamura, Y.; Tucker, B. E.

    1988-01-01

    Today, Japanese society is well aware of the prediction of the Tokai earthquake. It is estimated by the Tokyo municipal government that this predicted earthquake could kill 30,000 people. (This estimate is viewed by many as conservative; other Japanese government agencies have made estimates, but they have not been published.) The reduction in the number of deaths from 120,000 to 30,000 between the Kanto earthquake and the predicted Tokai earthquake is due in large part to the reduction in the proportion of wooden construction (houses).

  10. Weather Satellite Thermal IR Responses Prior to Earthquakes

    NASA Technical Reports Server (NTRS)

    O'Connor, Daniel P.

    2005-01-01

    A number of observers claim to have seen thermal anomalies prior to earthquakes, but subsequent analysis by others has failed to produce similar findings. What exactly are these anomalies? Might they be useful for earthquake prediction? It is the purpose of this study to determine if thermal anomalies can be found in association with known earthquakes by systematically co-registering weather satellite images at the sub-pixel level and then determining if statistically significant responses occurred prior to the earthquake event. A new set of automatic co-registration procedures was developed for this task to accommodate all properties particular to weather satellite observations taken at night, and it relies on the general condition that the ground cools after sunset. Using these procedures, we can produce a set of temperature-sensitive satellite images for each of five selected earthquakes (Algeria 2003; Bhuj, India 2001; Izmit, Turkey 1999; Kunlun Shan, Tibet 2001; Turkmenistan 2000) and thus more effectively investigate heating trends close to the epicenters a few hours prior to the earthquake events. This study will lay tracks for further work in earthquake prediction and provoke the question of the exact nature of the thermal anomalies.

  11. VLF/LF Radio Sounding of Ionospheric Perturbations Associated with Earthquakes

    PubMed Central

    Hayakawa, Masashi

    2007-01-01

    It has recently been recognized that the ionosphere is very sensitive to seismic effects, and the detection of ionospheric perturbations associated with earthquakes seems very promising for short-term earthquake prediction. We have proposed a possible use of VLF/LF (very low frequency (3-30 kHz)/low frequency (30-300 kHz)) radio sounding of seismo-ionospheric perturbations. A brief history of the use of subionospheric VLF/LF propagation for short-term earthquake prediction is given, followed by a significant finding of ionospheric perturbation for the Kobe earthquake in 1995. After reviewing previous VLF/LF results, we present the latest VLF/LF findings: one is the statistical correlation of ionospheric perturbations with earthquakes, and the second is a case study of the Sumatra earthquake of December 2004, indicating the spatial scale and dynamics of the ionospheric perturbation for this earthquake.

  12. Predicted Surface Displacements for Scenario Earthquakes in the San Francisco Bay Region

    USGS Publications Warehouse

    Murray-Moraleda, Jessica R.

    2008-01-01

    In the immediate aftermath of a major earthquake, the U.S. Geological Survey (USGS) will be called upon to provide information on the characteristics of the event to emergency responders and the media. One such piece of information is the expected surface displacement due to the earthquake. In conducting probabilistic hazard analyses for the San Francisco Bay Region, the Working Group on California Earthquake Probabilities (WGCEP) identified a series of scenario earthquakes involving the major faults of the region, and these were used in their 2003 report (hereafter referred to as WG03) and the recently released 2008 Uniform California Earthquake Rupture Forecast (UCERF). Here I present a collection of maps depicting the expected surface displacement resulting from those scenario earthquakes. The USGS has conducted frequent Global Positioning System (GPS) surveys throughout northern California for nearly two decades, generating a solid baseline of interseismic measurements. Following an earthquake, temporary GPS deployments at these sites will be important to augment the spatial coverage provided by continuous GPS sites for recording postseismic deformation, as will the acquisition of Interferometric Synthetic Aperture Radar (InSAR) scenes. The information provided in this report allows one to anticipate, for a given event, where the largest displacements are likely to occur. This information is valuable both for assessing the need for further spatial densification of GPS coverage before an event and prioritizing sites to resurvey and InSAR data to acquire in the immediate aftermath of the earthquake. In addition, these maps are envisioned to be a resource for scientists in communicating with emergency responders and members of the press, particularly during the time immediately after a major earthquake before displacements recorded by continuous GPS stations are available.

  13. Assessing the capability of numerical methods to predict earthquake ground motion: the Euroseistest verification and validation project

    NASA Astrophysics Data System (ADS)

    Chaljub, E. O.; Bard, P.; Tsuno, S.; Kristek, J.; Moczo, P.; Franek, P.; Hollender, F.; Manakou, M.; Raptakis, D.; Pitilakis, K.

    2009-12-01

    Over the last decades, considerable effort has been dedicated to developing accurate and computationally efficient numerical methods to predict earthquake ground motion in heterogeneous 3D media. Progress in methods and the increasing capability of computers have made it technically feasible to calculate realistic seismograms for frequencies of interest in seismic design applications. In order to foster the use of numerical simulation in practical prediction, it is important to (1) evaluate the accuracy of current numerical methods when applied to realistic 3D applications where no reference solution exists (verification) and (2) quantify the agreement between recorded and numerically simulated earthquake ground motion (validation). Here we report the results of the Euroseistest verification and validation project - an ongoing international collaborative work organized jointly by the Aristotle University of Thessaloniki, Greece, the Cashima research project (supported by the French nuclear agency, CEA, and the Laue-Langevin institute, ILL, Grenoble), and the Joseph Fourier University, Grenoble, France. The project involves more than 10 international teams from Europe, Japan and the USA. The teams employ the Finite Difference Method (FDM), the Finite Element Method (FEM), the Global Pseudospectral Method (GPSM), the Spectral Element Method (SEM) and the Discrete Element Method (DEM). The project makes use of a new detailed 3D model of the Mygdonian basin (about 5 km wide, 15 km long, sediments reaching about 400 m depth, surface S-wave velocity of 200 m/s). The prime target is to simulate 8 local earthquakes with magnitudes from 3 to 5. In the verification, numerical predictions for frequencies up to 4 Hz for a series of models with increasing structural and rheological complexity are analyzed and compared using quantitative time-frequency goodness-of-fit criteria. Predictions obtained by one FDM team and the SEM team are close and different from other predictions

  14. Oklahoma experiences largest earthquake during ongoing regional wastewater injection hazard mitigation efforts

    USGS Publications Warehouse

    Yeck, William; Hayes, Gavin; McNamara, Daniel E.; Rubinstein, Justin L.; Barnhart, William; Earle, Paul; Benz, Harley M.

    2017-01-01

    The 3 September 2016, Mw 5.8 Pawnee earthquake was the largest recorded earthquake in the state of Oklahoma. Seismic and geodetic observations of the Pawnee sequence, including precise hypocenter locations and moment tensor modeling, show that the Pawnee earthquake occurred on a previously unknown left-lateral strike-slip basement fault that intersects the mapped right-lateral Labette fault zone. The Pawnee earthquake is part of an unprecedented increase in the earthquake rate in Oklahoma that is largely considered the result of the deep injection of waste fluids from oil and gas production. If this is, indeed, the case for the M5.8 Pawnee earthquake, then this would be the largest event to have been induced by fluid injection. Since 2015, Oklahoma has undergone wide-scale mitigation efforts primarily aimed at reducing injection volumes. Thus far in 2016, the rate of M3 and greater earthquakes has decreased as compared to 2015, while the cumulative moment—or energy released from earthquakes—has increased. This highlights the difficulty in earthquake hazard mitigation efforts given the poorly understood long-term diffusive effects of wastewater injection and their connection to seismicity.

  15. Statistical tests of simple earthquake cycle models

    USGS Publications Warehouse

    Devries, Phoebe M. R.; Evans, Eileen

    2016-01-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ≲ 4.0 × 10^19 Pa s and ηM ≳ 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
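
    The Kolmogorov-Smirnov rejection step described in this abstract can be sketched numerically. The following is an illustrative reconstruction, not the authors' code: the sample values, sample sizes, and the asymptotic critical-value formula are assumptions chosen for demonstration only.

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the empirical CDFs of samples a and b."""
    a, b = np.sort(np.asarray(a, float)), np.sort(np.asarray(b, float))
    pooled = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, pooled, side="right") / a.size
    cdf_b = np.searchsorted(b, pooled, side="right") / b.size
    return float(np.max(np.abs(cdf_a - cdf_b)))

rng = np.random.default_rng(42)
observed = rng.normal(1.0, 0.15, size=15)     # hypothetical slip-rate ratios, 15 faults
predicted = rng.normal(1.3, 0.15, size=1000)  # hypothetical model-predicted ratios

d = ks_statistic(observed, predicted)
# Large-sample critical value at alpha = 0.05 for a two-sample test
n, m = observed.size, predicted.size
d_crit = 1.358 * np.sqrt((n + m) / (n * m))
model_rejected = d > d_crit  # reject the candidate cycle model if True
```

    A candidate viscoelastic model whose predicted distribution is too far from the observed one (d exceeding the critical value) is rejected; sweeping this over a grid of Maxwell viscosities yields the kind of rejected parameter ranges quoted above.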

  16. Ground-motion parameters of the southwestern Indiana earthquake of 18 June 2002 and the disparity between the observed and predicted values

    USGS Publications Warehouse

    Street, R.; Wiegand, J.; Woolery, E.W.; Hart, P.

    2005-01-01

    The M 4.5 southwestern Indiana earthquake of 18 June 2002 triggered 46 blast monitors in Indiana, Illinois, and Kentucky. The resulting free-field particle velocity records, along with similar data from previous earthquakes in the study area, provide a clear standard for judging the reliability of current maps for predicting ground motions at frequencies greater than 2 Hz in southwestern Indiana and southeastern Illinois. Peak horizontal accelerations and velocities, and 5% damped pseudo-accelerations for the earthquake, generally exceeded ground motions predicted for the top of the bedrock by factors of 2 or more, even after soil amplifications were taken into consideration. It is suggested, but not proven, that the low shear-wave velocity and weathered bedrock in the area are also amplifying the higher-frequency ground motions that have been repeatedly recorded by the blast monitors in the study area. It is also shown that there is a good correlation between the peak ground motions and 5% pseudo-accelerations recorded for the event, and the Modified Mercalli intensities interpreted for the event by the U.S. Geological Survey.

  17. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    PubMed Central

    Kanamori, H

    1996-01-01

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding. PMID:11607657

  18. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    PubMed

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  19. Surgical Management of Musculoskeletal Injuries after 2015 Nepal Earthquake: Our Experience

    PubMed Central

    Vaishya, Raju; Vijay, Vipul; Hussaini, Mustafa; Singh, Harsh

    2015-01-01

    We report our experience of handling 80 major musculoskeletal injuries in a brief span of three days immediately after the major earthquake of Nepal in April 2015. Planning, proper utilization of resources, and prioritizing the patients for surgical intervention is highlighted. The value of damage control by orthopaedics in these disasters is discussed. Timely and appropriate surgical treatment by a skilled orthopaedic team not only can save these injured limbs but also the lives of the victims of a major disaster. PMID:26430580

  20. Earthquake mechanism and predictability shown by a laboratory fault

    USGS Publications Warehouse

    King, C.-Y.

    1994-01-01

    Slip events generated in a laboratory fault model consisting of a circular chain of eight spring-connected blocks of approximately equal weight elastically driven to slide on a frictional surface are studied. It is found that most of the input strain energy is released by a relatively few large events, which are approximately time predictable. A large event tends to roughen the stress distribution along the fault, whereas the subsequent smaller events tend to smooth the stress distribution and prepare a condition of simultaneous criticality for the occurrence of the next large event. The frequency-size distribution resembles the Gutenberg-Richter relation for earthquakes, except for a falloff for the largest events due to the finite energy-storage capacity of the fault system. Slip distributions in different events are commonly dissimilar. Stress drop, slip velocity, and rupture velocity all tend to increase with event size. Rupture-initiation locations are usually not close to the maximum-slip locations. © 1994 Birkhäuser Verlag.
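
    The spring-connected block chain described in this abstract is closely related to Burridge-Knopoff slider-block models, and its qualitative behavior can be sketched with a toy cellular-automaton version. Everything below (thresholds, coupling constant, the complete stress drop, the uniform-loading rule) is an illustrative assumption, not the paper's apparatus.

```python
import numpy as np

rng = np.random.default_rng(0)
n_blocks = 8
static_threshold = 1.0   # force needed to start a block slipping
coupling = 0.25          # fraction of a dropped force passed to each neighbour
force = rng.uniform(0.0, 0.9, n_blocks)   # initial stress on each block

def drive_until_event(force):
    """Load all blocks uniformly until one fails, then cascade the slip."""
    force += static_threshold - force.max()          # uniform tectonic loading
    event_size = 0
    while (unstable := np.flatnonzero(force >= static_threshold)).size:
        for i in unstable:
            drop = force[i]                          # complete stress drop
            force[i] = 0.0
            for j in (i - 1, i + 1):                 # circular chain of blocks
                force[j % n_blocks] += coupling * drop
            event_size += 1
    return event_size

# Generate a catalogue of events; in models of this family, many small
# events and a few large ones echo a Gutenberg-Richter-like falloff.
sizes = [drive_until_event(force) for _ in range(200)]
```

    Because only half of each dropped force is redistributed (the rest is dissipated), every cascade terminates, and the large events release most of the accumulated force, loosely mirroring the time-predictable large events reported above.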

  1. High Attenuation Rate for Shallow, Small Earthquakes in Japan

    NASA Astrophysics Data System (ADS)

    Si, Hongjun; Koketsu, Kazuki; Miyake, Hiroe

    2017-09-01

    We compared the attenuation characteristics of peak ground accelerations (PGAs) and velocities (PGVs) of strong motion from shallow, small earthquakes that occurred in Japan with those predicted by the equations of Si and Midorikawa (J Struct Constr Eng 523:63-70, 1999). The observed PGAs and PGVs at stations far from the seismic source decayed more rapidly than the predicted ones. The same tendencies have been reported for deep moderate and large earthquakes, but not for shallow moderate and large earthquakes. This indicates that the peak values of ground motion from shallow, small earthquakes attenuate more steeply than those from shallow moderate or large earthquakes. To investigate the reason for this difference, we numerically simulated strong ground motion for point sources of Mw 4 and 6 earthquakes using a 2D finite difference method. The analyses of the synthetic waveforms suggested that the above differences are caused by surface waves, which are predominant at stations far from the seismic source for shallow moderate earthquakes but not for shallow, small earthquakes. Thus, although loss due to reflection at the boundaries of the discontinuous Earth structure occurs in all shallow earthquakes, the apparent attenuation rate for a moderate or large earthquake is essentially the same as that of body waves propagating in a homogeneous medium, due to the dominance of surface waves.

  2. Geoethical suggestions for reducing risk of next (not only strong) earthquakes

    NASA Astrophysics Data System (ADS)

    Nemec, Vaclav

    2013-04-01

    deaths (incomparably lower than in the tragic events of 1923), the tsunami broke all known records. The existing anti-tsunami measures turned out to reflect expectations based on unsatisfactory safety limits restricted to human memory and experience. Conclusions of geoethics: a) a new legal interpretation of "false alarms" and of reasonable risk and danger levels is to be established (updating internationally acceptable definitions and protection measures); b) any positive prediction of any known real natural disaster (whoever made it) is to be precisely analysed by competent institutes, avoiding any underestimation of "incompetent" researchers and amateurs and respecting the diversity of scientific research "schools"; c) reciprocal respect between scientists and the population is to be based on the use of a mutually understandable language; d) scientists as well as the media are obliged to respect and publish the complete truth about facts, with clearly defined words, to avoid any misinterpretation of results; e) the consequences of relatively "minor" earthquakes are no longer limited to an adjacent local area; f) programs for computerized predictions are to be kept under permanent validity control (using alternative parameters and incorporating verified or supposed timetables of events from the past); g) any scientist accepting a function in a State organ has to accept this role with high personal responsibility for, and respect to, the goals, work and results of such a commission; h) any effective protection of the population is to be based on a mutual consensus preferring, at any stage, the common good over particular or personal interests and respecting human lives as the top priority.

  3. Bio-Mimetics of Disaster Anticipation—Learning Experience and Key-Challenges

    PubMed Central

    Tributsch, Helmut

    2013-01-01

    Simple Summary Starting from 1700 B.C. in the old world and up to recent times in China, there is evidence of earthquake prediction based on unusual meteorological phenomena and animal behavior. The review tries to explore the credibility and to pin down the nature of the geophysical phenomena involved. It appears that the concept of the ancient Greek philosophers, that a dry gas, pneuma, is correlated with earthquakes, is relevant. It is not the cause of earthquakes, as originally thought, but may be an accompanying phenomenon and occasional precursor. This would explain unusual animal behavior as well as thermal anomalies detected from satellites. Abstract Anomalies in animal behavior and meteorological phenomena before major earthquakes have been reported throughout history. Bio-mimetics or bionics aims at learning disaster anticipation from animals. Since modern science is reluctant to address this problem, an effort has been made to track down the knowledge available to ancient natural philosophers. Starting with an archaeologically documented human sacrifice around 1700 B.C. during the Minoan civilization immediately before a large earthquake, which killed the participants, earthquake prediction knowledge throughout antiquity is evaluated. Major practical experience with this phenomenon has been gained from a Chinese earthquake prediction initiative nearly half a century ago. Some quakes, like that of Haicheng, were recognized in advance. However, the destructive Tangshan earthquake was not predicted, which was interpreted as an inherent failure of prediction based on animal phenomena. This is contradicted on the basis of reliable Chinese documentation provided by the responsible earthquake study commission. The Tangshan earthquake was preceded by more than 2,000 reported animal anomalies, some of which were of a very dramatic nature. They are discussed here. Any physical phenomenon, which may cause animal unrest, must involve energy turnover before the main earthquake

  4. Global earthquake fatalities and population

    USGS Publications Warehouse

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
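
    The scaling argument in this abstract (earthquake rate proportional to population) can be checked with back-of-the-envelope arithmetic. The population figures below are rough round numbers assumed for illustration, not the authors' data; the paper's own calibrated calculation reports 8.7±3.3 expected events.

```python
import numpy as np

# Crude linear world-population models in billions (illustrative values):
# roughly 1.6 -> 6.1 over the 20th century, 6.1 -> 10.1 over the 21st
years_20 = np.linspace(1900, 2000, 101)
years_21 = np.linspace(2000, 2100, 101)
pop_20 = np.interp(years_20, [1900, 2000], [1.6, 6.1])
pop_21 = np.interp(years_21, [2000, 2100], [6.1, 10.1])

observed_20 = 4  # earthquakes with >100,000 deaths observed in the 20th century

# For a Poisson process with rate proportional to population, expected
# counts over equal-length centuries scale as the time-averaged population
expected_21 = observed_20 * pop_21.mean() / pop_20.mean()
```

    With these assumed population curves the expectation comes out near 8.4 events, in the same range as the paper's 8.7±3.3 prediction for death tolls above 100,000.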

  5. Laboratory investigations of earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Xia, Kaiwen

    In this thesis, this will be attempted through controlled laboratory experiments that are designed to mimic natural earthquake scenarios. The earthquake dynamic rupturing process itself is a complicated phenomenon, involving dynamic friction, wave propagation, and heat production. Because controlled experiments can produce results without the assumptions needed in theoretical and numerical analysis, the experimental method is thus advantageous over theoretical and numerical methods. Our laboratory fault is composed of carefully cut photoelastic polymer plates (Homalite-100, Polycarbonate) held together by uniaxial compression. As a unique unit of the experimental design, a controlled exploding wire technique provides the triggering mechanism of laboratory earthquakes. Three important components of real earthquakes (i.e., pre-existing fault, tectonic loading, and triggering mechanism) correspond to and are simulated by frictional contact, uniaxial compression, and the exploding wire technique. Dynamic rupturing processes are visualized using the photoelastic method and are recorded via a high-speed camera. Our experimental methodology, which is full-field, in situ, and non-intrusive, has better control and diagnostic capacity compared to other existing experimental methods. Using this experimental approach, we have investigated several problems: dynamics of earthquake faulting occurring along homogeneous faults separating identical materials, earthquake faulting along inhomogeneous faults separating materials with different wave speeds, and earthquake faulting along faults with a finite low wave speed fault core. We have observed supershear ruptures, sub-Rayleigh to supershear rupture transition, crack-like to pulse-like rupture transition, self-healing (Heaton) pulse, and rupture directionality.

  6. A Controllable Earthquake Rupture Experiment on the Homestake Fault

    NASA Astrophysics Data System (ADS)

    Germanovich, L. N.; Murdoch, L. C.; Garagash, D.; Reches, Z.; Martel, S. J.; Gwaba, D.; Elsworth, D.; Lowell, R. P.; Onstott, T. C.

    2010-12-01

    Fault slip is typically simulated in the laboratory at the cm-to-dm scale. Laboratory results are then up-scaled by orders of magnitude to understand faulting and earthquake processes. We suggest an experimental approach to reactivate faults in situ at scales of ~10-100 m, using thermal techniques and fluid injection to modify the in situ stresses and the fault strength to the point where the rock fails. Mines where the modified in situ stresses are sufficient to drive faulting present an opportunity to conduct such experiments. During our recent field work in the former Homestake gold mine in the northern Black Hills, South Dakota, we found a large fault present on multiple mine levels. The fault is subparallel to the local foliation in the Poorman Formation, a Proterozoic metamorphic rock deformed into regional-scale folds with axes plunging ~40° to the SSE. The fault extends at least 1.5 km along strike and dip, with a center ~1.5 km deep. It strikes ~320-340° N, dips ~45-70° NE, and is recognized by a distinct, ~0.3-0.5 m thick gouge that contains crushed host rock and black material that appears to be graphite. Although we could not find clear evidence of fault displacement, secondary features suggest that it is a normal fault. The size and distinct structure of this fault make it a promising target for in situ experimentation on fault strength, hydrological properties, and slip nucleation processes. Most earthquakes are thought to be the result of unstable slip on existing faults. Activation of the Homestake fault in response to controlled fluid injection and thermally changing background stresses is likely to be localized on a crack-like patch. Slow patch propagation, moderated by the injection rate and the rate of change of the background stresses, may become unstable, leading to the nucleation of a small earthquake (dynamic) rupture. This controlled instability is intimately related to the dependence of the fault strength on the slip process and has been

  7. Earthquake number forecasts testing

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are needed in particular to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for the Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrence, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of the parameters of both the Poisson and NBD distributions on the catalogue magnitude threshold and on the temporal subdivision of the catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be rejected at a high significance level in favour of the NBD. Thereafter, we investigate how well these distributions fit the observed distributions of seismicity. For this purpose, we study higher statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values of the skewness and the kurtosis increase for smaller magnitude thresholds, and increase even more strongly for finer temporal subdivisions of the catalogues.
The Poisson distribution for large rate values approaches the Gaussian law; therefore its skewness
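
    The Poisson-versus-NBD moment comparison can be made concrete with the standard closed-form moments of the two distributions (textbook formulas, not values from the paper; the NBD parameters below are arbitrary illustrations):

```python
import math

def poisson_moments(lam):
    """Mean, variance, skewness, excess kurtosis of Poisson(lambda)."""
    return lam, lam, 1.0 / math.sqrt(lam), 1.0 / lam

def nbd_moments(r, p):
    """Moments of the negative-binomial distribution, parameterized as the
    number of failures before r successes, each with success probability p."""
    mean = r * (1 - p) / p
    var = r * (1 - p) / p**2
    skew = (2 - p) / math.sqrt(r * (1 - p))
    exkurt = 6.0 / r + p**2 / (r * (1 - p))
    return mean, var, skew, exkurt

# Match the means (lambda = r(1-p)/p), then compare higher moments.
r, p = 2.0, 0.1
mean, var, skew, kurt = nbd_moments(r, p)
_, pvar, pskew, pkurt = poisson_moments(mean)

print(f"mean {mean:.1f}: NBD variance {var:.1f} vs Poisson variance {pvar:.1f}")
print(f"NBD skewness {skew:.2f} vs Poisson skewness {pskew:.2f}")
```

    At an equal mean, the NBD is overdispersed (variance exceeds the mean) and has larger skewness and kurtosis, which is the behaviour the abstract uses to reject the Poisson assumption.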

  8. Measuring the effectiveness of earthquake forecasting in insurance strategies

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular earthquake forecasting methodology is offset by its inevitable false alarms and missed predictions, it is important to find a means of weighing successes and failures in a common currency. Rather than judge the relative costs and benefits of predictions subjectively, we develop a simple method to determine whether the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. In each case, premiums are collected based on modelled technical risk costs, and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follows from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long-term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

  9. Long-Term Impact of Earthquakes on Sleep Quality

    PubMed Central

    Tempesta, Daniela; Curcio, Giuseppe; De Gennaro, Luigi; Ferrara, Michele

    2013-01-01

    Purpose We investigated the impact of the 6.3 magnitude 2009 L'Aquila (Italy) earthquake on standardized self-report measures of sleep quality (Pittsburgh Sleep Quality Index, PSQI) and the frequency of disruptive nocturnal behaviours (Pittsburgh Sleep Quality Index-Addendum, PSQI-A) two years after the natural disaster. Methods Self-reported sleep quality was assessed in 665 L'Aquila citizens exposed to the earthquake and compared with a different sample (n = 754) of L'Aquila citizens tested 24 months before the earthquake. In addition, the sleep quality and disruptive nocturnal behaviours (DNB) of people exposed to the traumatic experience were compared with those of people who, in the same period, lived in areas between 40 and 115 km from the earthquake epicenter (n = 3574). Results The comparison between L'Aquila citizens before and after the earthquake showed a significant deterioration of sleep quality after exposure to the trauma. In addition, two years after the earthquake, L'Aquila citizens showed the highest PSQI scores and the highest incidence of DNB compared with subjects living in the surroundings. Interestingly, above-threshold PSQI scores were found in participants living within 70 km of the epicenter, while trauma-related DNBs were found in people living within 40 km. Multiple regressions confirmed that proximity to the epicenter is predictive of sleep disturbances and DNB, and also suggested a possible mediating effect of depression on PSQI scores. Conclusions The psychological effects of an earthquake may be much more pervasive and long-lasting than its building destruction, lasting for years and involving a much larger population. Reduced sleep quality and an increased frequency of DNB after two years may be risk factors for the development of depression and posttraumatic stress disorder. PMID:23418478

  10. Long-term impact of earthquakes on sleep quality.

    PubMed

    Tempesta, Daniela; Curcio, Giuseppe; De Gennaro, Luigi; Ferrara, Michele

    2013-01-01

    We investigated the impact of the 6.3 magnitude 2009 L'Aquila (Italy) earthquake on standardized self-report measures of sleep quality (Pittsburgh Sleep Quality Index, PSQI) and the frequency of disruptive nocturnal behaviours (Pittsburgh Sleep Quality Index-Addendum, PSQI-A) two years after the natural disaster. Self-reported sleep quality was assessed in 665 L'Aquila citizens exposed to the earthquake and compared with a different sample (n = 754) of L'Aquila citizens tested 24 months before the earthquake. In addition, the sleep quality and disruptive nocturnal behaviours (DNB) of people exposed to the traumatic experience were compared with those of people who, in the same period, lived in areas between 40 and 115 km from the earthquake epicenter (n = 3574). The comparison between L'Aquila citizens before and after the earthquake showed a significant deterioration of sleep quality after exposure to the trauma. In addition, two years after the earthquake, L'Aquila citizens showed the highest PSQI scores and the highest incidence of DNB compared with subjects living in the surroundings. Interestingly, above-threshold PSQI scores were found in participants living within 70 km of the epicenter, while trauma-related DNBs were found in people living within 40 km. Multiple regressions confirmed that proximity to the epicenter is predictive of sleep disturbances and DNB, and also suggested a possible mediating effect of depression on PSQI scores. The psychological effects of an earthquake may be much more pervasive and long-lasting than its building destruction, lasting for years and involving a much larger population. Reduced sleep quality and an increased frequency of DNB after two years may be risk factors for the development of depression and posttraumatic stress disorder.

  11. Testing new methodologies for short -term earthquake forecasting: Multi-parameters precursors

    NASA Astrophysics Data System (ADS)

    Ouzounov, Dimitar; Pulinets, Sergey; Tramutoli, Valerio; Lee, Lou; Liu, Tiger; Hattori, Katsumi; Kafatos, Menas

    2014-05-01

    We are conducting real-time tests involving multi-parameter observations over different seismo-tectonic regions in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several selected parameters, namely gas discharge, thermal infrared radiation, ionospheric electron density, and atmospheric temperature and humidity, which we believe are all associated with the earthquake preparation phase. We are testing a methodology capable of producing alerts in advance of major earthquakes (M > 5.5) in different regions of active earthquakes and volcanoes. During 2012-2013 we established a collaborative framework with the PRE-EARTHQUAKE (EU) and iSTEP3 (Taiwan) projects for coordinated measurements and prospective validation over seven testing regions: Southern California (USA), Eastern Honshu (Japan), Italy, Greece, Turkey, Taiwan (ROC), and Kamchatka and Sakhalin (Russia). The current experiment provided a "stress test" opportunity to validate the physically based earthquake precursor approach over regions of high seismicity. Our initial results are: (1) real-time tests have shown the presence of anomalies in the atmosphere and ionosphere before most of the significant (M > 5.5) earthquakes; (2) false positives exist, and their ratios differ for each region, varying from 50% (Southern Italy) and 35% (California) down to 25% (Taiwan, Kamchatka and Japan), with a significant reduction of false positives as soon as at least two geophysical parameters are used jointly; (3) the main remaining problems concern the systematic collection and real-time integration of pre-earthquake observations. Our findings suggest that real-time testing of physically based pre-earthquake signals provides short-term predictive power (in all three important parameters, namely location, time and magnitude) for the occurrence of major earthquakes in the tested regions, and this result encourages continued testing with a more detailed analysis of

  12. Real data assimilation for optimization of frictional parameters and prediction of afterslip in the 2003 Tokachi-oki earthquake inferred from slip velocity by an adjoint method

    NASA Astrophysics Data System (ADS)

    Kano, Masayuki; Miyazaki, Shin'ichi; Ishikawa, Yoichi; Hiyoshi, Yoshihisa; Ito, Kosuke; Hirahara, Kazuro

    2015-10-01

    Data assimilation is a technique that optimizes the parameters used in a numerical model, under the constraint of the model dynamics, to achieve a better fit to observations. Optimized parameters can be utilized for subsequent prediction with a numerical model, and the predicted physical variables are presumably closer to the observations that will become available in the future, at least compared to those obtained without optimization through data assimilation. In this work, an adjoint data assimilation system is developed for optimizing a relatively large number of spatially inhomogeneous frictional parameters during the afterslip period, in which the physical constraints are a quasi-dynamic equation of motion and a laboratory-derived rate- and state-dependent friction law that describe the temporal evolution of slip velocity at subduction zones. The observed variable is the estimated slip velocity on the plate interface. Before applying this method to real data assimilation for the afterslip of the 2003 Tokachi-oki earthquake, a synthetic data assimilation experiment is conducted to examine the feasibility of optimizing the frictional parameters in the afterslip area. It is confirmed that the current system is capable of optimizing the frictional parameters A-B, A and L by adopting the physical constraint based on a numerical model, provided observations capture the acceleration and decay phases of slip on the plate interface. On the other hand, it is unlikely that the frictional parameters can be constrained in regions where the amplitude of afterslip is less than 1.0 cm/day. Next, real data assimilation for the 2003 Tokachi-oki earthquake is conducted to incorporate slip velocity data inferred from time-dependent inversion of Global Navigation Satellite System time series. The optimized values of A-B, A and L are O(10 kPa), O(10^2 kPa) and O(10 mm), respectively. The optimized frictional parameters yield a better fit to the observations and better prediction skill of slip

  13. Two grave issues concerning the expected Tokai Earthquake

    NASA Astrophysics Data System (ADS)

    Mogi, K.

    2004-08-01

    The possibility of a great shallow earthquake (M 8) in the Tokai region, central Honshu, in the near future was pointed out by Mogi in 1969 and by the Coordinating Committee for Earthquake Prediction (CCEP), Japan (1970). In 1978, the government enacted the Large-Scale Earthquake Countermeasures Law and began to set up intensified observations in this region for short-term prediction of the expected Tokai earthquake. In this paper, two serious issues are pointed out, which may contribute to catastrophic effects in connection with the Tokai earthquake: 1. The danger of black-and-white predictions: According to the scenario based on the Large-Scale Earthquake Countermeasures Law, if abnormal crustal changes are observed, the Earthquake Assessment Committee (EAC) will determine whether or not there is an imminent danger. The findings are reported to the Prime Minister who decides whether to issue an official warning statement. Administrative policy clearly stipulates the measures to be taken in response to such a warning, and because the law presupposes the ability to predict a large earthquake accurately, there are drastic measures appropriate to the situation. The Tokai region is a densely populated region with high social and economic activity, and it is traversed by several vital transportation arteries. When a warning statement is issued, all transportation is to be halted. The Tokyo capital region would be cut off from the Nagoya and Osaka regions, and there would be a great impact on all of Japan. I (the former chairman of EAC) maintained that in view of the variety and complexity of precursory phenomena, it was inadvisable to attempt a black-and-white judgment as the basis for a "warning statement". I urged that the government adopt a "soft warning" system that acknowledges the uncertainty factor and that countermeasures be designed with that uncertainty in mind. 2. The danger of nuclear power plants in the focal region: Although the possibility of the

  14. Application of a long-range forecasting model to earthquakes in the Japan mainland testing region

    NASA Astrophysics Data System (ADS)

    Rhoades, David A.

    2011-03-01

    The Every Earthquake a Precursor According to Scale (EEPAS) model is a long-range forecasting method which has been previously applied to a number of regions, including Japan. The Collaboratory for the Study of Earthquake Predictability (CSEP) forecasting experiment in Japan provides an opportunity to test the model at lower magnitudes than previously and to compare it with other competing models. The model sums contributions to the rate density from past earthquakes based on predictive scaling relations derived from the precursory scale increase phenomenon. Two features of the earthquake catalogue in the Japan mainland region create difficulties in applying the model, namely magnitude-dependence in the proportion of aftershocks and in the Gutenberg-Richter b-value. To accommodate these features, the model was fitted separately to earthquakes in three different target magnitude classes over the period 2000-2009. There are some substantial unexplained differences in parameters between classes, but the time and magnitude distributions of the individual earthquake contributions are such that the model is suitable for three-month testing at M ≥ 4 and for one-year testing at M ≥ 5. In retrospective analyses, the mean probability gain of the EEPAS model over a spatially smoothed seismicity model increases with magnitude. The same trend is expected in prospective testing. The Proximity to Past Earthquakes (PPE) model has been submitted to the same testing classes as the EEPAS model. Its role is that of a spatially-smoothed reference model, against which the performance of time-varying models can be compared.

  15. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  16. A new scoring method for evaluating the performance of earthquake forecasts and predictions

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2009-12-01

    This study presents a new method, the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is commensurate with the risk taken. Suppose that we have a reference model, usually the Poisson model in ordinary cases or the Omori-Utsu formula for forecasting aftershocks, which gives probability p0 that at least 1 event occurs in a given space-time-magnitude window. The forecaster, like a gambler who starts with a certain number of reputation points, bets 1 reputation point on ``Yes'' or ``No'' according to his forecast, or bets nothing if he performs an NA-prediction. If the forecaster bets 1 reputation point on ``Yes'' and loses, the number of his reputation points is reduced by 1; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on ``Yes''. In this way, if the reference model is correct, the expected return that he gains from this bet is 0. This rule also applies to probability forecasts. Suppose that p is the occurrence probability of an earthquake given by the forecaster. We can regard the forecaster as splitting 1 reputation point by betting p on ``Yes'' and 1-p on ``No''. In this way, the forecaster's expected pay-off based on the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest.
We also calculate the upper bound of the gambling score when
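
    The reward rule described above can be sketched as follows; this is a minimal illustration of the scoring arithmetic, not Zhuang's implementation, and the return ratio for ``No'' bets is inferred by symmetry:

```python
def gambling_score(p0, forecast_yes, event_occurred):
    """Reputation-point change for a binary bet of 1 point.

    p0: reference-model probability that at least one event occurs.
    forecast_yes: True if the forecaster bets on "Yes".
    event_occurred: True if an event actually occurred.
    """
    if forecast_yes:
        # Win (1-p0)/p0 on a correct "Yes"; lose the 1-point stake otherwise.
        return (1 - p0) / p0 if event_occurred else -1.0
    # Betting on "No": by symmetry, the return ratio is p0/(1-p0).
    return p0 / (1 - p0) if not event_occurred else -1.0

def expected_score(p0, forecast_yes):
    """Expected pay-off when the reference model is correct (should be 0)."""
    return (p0 * gambling_score(p0, forecast_yes, True)
            + (1 - p0) * gambling_score(p0, forecast_yes, False))

def prob_forecast_score(p0, p, event_occurred):
    """Probabilistic forecast: split 1 point as p on "Yes", 1-p on "No"."""
    return (p * gambling_score(p0, True, event_occurred)
            + (1 - p) * gambling_score(p0, False, event_occurred))

print(expected_score(0.1, True))  # ~0 up to floating-point error: a fair bet
```

    Because every expected pay-off under the reference model is zero, any systematic accumulation of reputation points reflects genuine forecasting skill beyond that model.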

  17. Earthquakes triggered by fluid extraction

    USGS Publications Warehouse

    Segall, P.

    1989-01-01

    Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics. -Author

  18. FORECAST MODEL FOR MODERATE EARTHQUAKES NEAR PARKFIELD, CALIFORNIA.

    USGS Publications Warehouse

    Stuart, William D.; Archuleta, Ralph J.; Lindh, Allan G.

    1985-01-01

    The paper outlines a procedure for using an earthquake instability model and repeated geodetic measurements to attempt an earthquake forecast. The procedure differs from other prediction methods, such as recognizing trends in data or assuming failure at a critical stress level, by using a self-contained instability model that simulates both preseismic and coseismic faulting in a natural way. In short, physical theory supplies a family of curves, and the field data select the member curves whose continuation into the future constitutes a prediction. Model inaccuracy and resolving power of the data determine the uncertainty of the selected curves and hence the uncertainty of the earthquake time.

  19. Earthquake chemical precursors in groundwater: a review

    NASA Astrophysics Data System (ADS)

    Paudel, Shukra Raj; Banjara, Sushant Prasad; Wagle, Amrita; Freund, Friedemann T.

    2018-03-01

    We review changes in groundwater chemistry as precursory signs of earthquakes. In particular, we discuss pH, total dissolved solids (TDS), electrical conductivity, and dissolved gases in relation to their significance for earthquake prediction or forecasting. These parameters are widely believed to vary in response to seismic and pre-seismic activity. However, the same parameters also vary in response to non-seismic processes. The inability to reliably distinguish changes caused by seismic or pre-seismic activity from changes caused by non-seismic activity has impeded progress in earthquake science. Short-term earthquake prediction is therefore unlikely to be achieved by pH, TDS, electrical conductivity, and dissolved gas measurements alone. On the other hand, the production of free hydroxyl radicals (•OH) and subsequent reactions, such as the formation of H2O2 and the oxidation of As(III) to As(V) in groundwater, have distinctive precursory characteristics. This study deviates from the prevailing mechanical mantra by addressing earthquake-related non-seismic mechanisms, focusing on the stress-induced electrification of rocks, the generation of positive hole charge carriers and their long-distance propagation through the rock column, and on electrochemical processes at the rock-water interface.

  20. A Comparison of Geodetic and Geologic Rates Prior to Large Strike-Slip Earthquakes: A Diversity of Earthquake-Cycle Behaviors?

    NASA Astrophysics Data System (ADS)

    Dolan, James F.; Meade, Brendan J.

    2017-12-01

    Comparison of pre-event geodetic and geologic rates in three large-magnitude (Mw = 7.6-7.9) strike-slip earthquakes reveals a wide range of behaviors. Specifically, geodetic rates of 26-28 mm/yr for the North Anatolian fault along the 1999 Mw = 7.6 Izmit rupture are ~40% faster than Holocene geologic rates. In contrast, geodetic rates of ~6-8 mm/yr along the Denali fault prior to the 2002 Mw = 7.9 Denali earthquake are only approximately half as fast as the latest Pleistocene-Holocene geologic rate of ~12 mm/yr. In the third example, where a sufficiently long pre-earthquake geodetic time series exists, the geodetic and geologic rates along the 2001 Mw = 7.8 Kokoxili rupture on the Kunlun fault are approximately equal at ~11 mm/yr. These results are not readily explicable with extant earthquake-cycle modeling, suggesting that they may instead be due to some combination of regional kinematic fault interactions, temporal variations in the strength of lithospheric-scale shear zones, and/or variations in the local relative plate motion rate. Whatever the exact causes of these variable behaviors, these observations indicate that either the ratio of geodetic to geologic rates before an earthquake may not be diagnostic of the time to the next earthquake, as predicted by many rheologically based geodynamic models of earthquake-cycle behavior, or different behaviors characterize different fault systems in a manner that is not yet understood or predictable.

  1. Statistical aspects and risks of human-caused earthquakes

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2013-12-01

    The seismological community invests ample human capital and financial resources to research and predict risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risks. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth or for environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen. The public and the private non-extractive industry need access to information about earthquake risks in connection with sub/urban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining, and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes with respect to different geologic environments. Statistical findings are based on the first catalog of human-caused earthquakes (Klose 2013). The findings discussed include the odds of dying during a medium-size earthquake set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

  2. The physics of an earthquake

    NASA Astrophysics Data System (ADS)

    McCloskey, John

    2008-03-01

    The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.

  3. Possible seasonality in large deep-focus earthquakes

    NASA Astrophysics Data System (ADS)

    Zhan, Zhongwen; Shearer, Peter M.

    2015-09-01

    Large deep-focus earthquakes (magnitude > 7.0, depth > 500 km) have exhibited strong seasonality in their occurrence times since the beginning of global earthquake catalogs. Of 60 such events from 1900 to the present, 42 have occurred in the middle half of each year. The seasonality appears strongest in the northwest Pacific subduction zones and weakest in the Tonga region. Taken at face value, the surplus of northern hemisphere summer events is statistically significant, but due to the ex post facto hypothesis testing, the absence of seasonality in smaller deep earthquakes, and the lack of a known physical triggering mechanism, we cannot rule out that the observed seasonality is just random chance. However, we can make a testable prediction of seasonality in future large deep-focus earthquakes, which, given likely earthquake occurrence rates, should be verified or falsified within a few decades. If confirmed, deep earthquake seasonality would challenge our current understanding of deep earthquakes.
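
    The face-value significance of 42 out of 60 events falling in the middle half of the year can be checked with a one-sided exact binomial test; the test choice here is ours, not necessarily the authors':

```python
import math

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance that k or more of n
    events fall in the middle half of the year if occurrence is uniform."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

# 42 of 60 large deep-focus earthquakes occurred in the middle half of each
# year; with no seasonality each event lands there with probability 0.5.
p_value = binom_tail(60, 42)
print(f"one-sided p-value: {p_value:.4f}")
```

    The resulting p-value is on the order of 10^-3, consistent with the abstract's statement that the surplus is statistically significant at face value, before accounting for the ex post facto nature of the hypothesis.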

  4. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10⁻⁹/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than relaxing loading by plate-tectonic forces. The latter model generally underlies a basic assumption of earthquake hazard assessment: that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.

  5. The influence of one earthquake on another

    NASA Astrophysics Data System (ADS)

    Kilb, Deborah Lyman

    1999-12-01

    Part one of my dissertation examines the initiation of earthquake rupture. We study the initial subevent (ISE) of the Mw 6.7 1994 Northridge, California earthquake to distinguish between two end-member hypotheses of an organized and predictable earthquake rupture initiation process or, alternatively, a random process. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both end-member models, and do not allow us to distinguish between them. However, further tests show the ISE's waveform characteristics are similar to those of typical nearby small earthquakes (i.e., dynamic ruptures). The second part of my dissertation examines aftershocks of the M 7.1 1989 Loma Prieta, California earthquake to determine if theoretical models of static Coulomb stress changes correctly predict the fault plane geometries and slip directions of Loma Prieta aftershocks. Our work shows individual aftershock mechanisms cannot be successfully predicted because a similar degree of predictability can be obtained using a randomized catalogue. This result is probably a function of combined errors in the models of mainshock slip distribution, background stress field, and aftershock locations. In the final part of my dissertation, we test the idea that earthquake triggering occurs when properties of a fault and/or its loading are modified by Coulomb failure stress changes that may be transient and oscillatory (i.e., dynamic) or permanent (i.e., static). We propose a triggering threshold failure stress change exists, above which the earthquake nucleation process begins although failure need not occur instantaneously. We test these ideas using data from the 1992 M 7.4 Landers earthquake and its aftershocks. 
Stress changes can be categorized as either dynamic (generated during the passage of seismic waves) or static (associated with permanent fault offsets

  6. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy on countermeasures to earthquake disasters, including earthquake hazard evaluation. Before March 11, Japanese earthquake hazard evaluation was based on the history of repeating earthquakes and the characteristic earthquake model: the source region of an earthquake was identified, its occurrence history was revealed, and the conditional probability was estimated using a renewal model. After the 2011 megathrust earthquake, however, the authorities changed the policy so that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. Under this policy, three important reports were issued during these two years. First, the Central Disaster Management Council (CDMC) issued, during 2011 and 2012, a new estimate of the damage from a hypothetical Mw 9 earthquake along the Nankai trough. The model predicts, at maximum, a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale over most of Southwest Japan. Next, in May 2013 the Earthquake Research Council revised the long-term hazard evaluation of earthquakes along the Nankai trough; it discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, given the diversity of earthquake phenomena and current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough with present techniques. These reports created a sensation throughout the country, and local governments are struggling to prepare countermeasures. The reports note the large uncertainty of their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including authors, are involved in

  7. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
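
    The empirical model's calibrated casualty rates follow Jaiswal and Wald's lognormal-CDF formulation: the fatality rate is a lognormal cumulative distribution function of shaking intensity. A sketch with made-up parameters and exposure (the published country-specific θ and β values are not reproduced here):

    ```python
    from math import erf, log, sqrt

    def fatality_rate(shaking, theta, beta):
        """Lognormal-CDF fatality-rate curve: nu(S) = Phi(ln(S/theta) / beta)."""
        return 0.5 * (1.0 + erf(log(shaking / theta) / (beta * sqrt(2.0))))

    def expected_deaths(exposure_by_intensity, theta, beta):
        """Population exposed at each shaking level times the fatality rate."""
        return sum(pop * fatality_rate(s, theta, beta)
                   for s, pop in exposure_by_intensity.items())

    # Hypothetical country parameters and exposure {intensity: people exposed}.
    deaths = expected_deaths({6.0: 500_000, 7.0: 120_000, 8.0: 20_000},
                             theta=11.0, beta=0.2)
    ```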

  8. Frictional stability and earthquake triggering during fluid pressure stimulation of an experimental fault

    NASA Astrophysics Data System (ADS)

    Scuderi, M. M.; Collettini, C.; Marone, C.

    2017-11-01

    It is widely recognized that the significant increase of M > 3.0 earthquakes in Western Canada and the Central United States is related to underground fluid injection. Following injection, fluid overpressure lubricates the fault and reduces the effective normal stress that holds the fault in place, promoting slip. Although this basic physical mechanism for earthquake triggering and fault slip is well understood, there are many open questions related to induced seismicity. Models of earthquake nucleation based on rate- and state-friction predict that fluid overpressure should stabilize fault slip rather than trigger earthquakes. To address this controversy, we conducted laboratory creep experiments to monitor fault slip evolution at constant shear stress while the effective normal stress was systematically reduced via increasing fluid pressure. We sheared layers of carbonate-bearing fault gouge in a double direct shear configuration within a true-triaxial pressure vessel. We show that fault slip evolution is controlled by the stress state acting on the fault and that fluid pressurization can trigger dynamic instability even in cases of rate-strengthening friction, which should favor aseismic creep. During fluid pressurization, when shear and effective normal stresses reach the failure condition, accelerated creep occurs in association with fault dilation; further pressurization leads to an exponential acceleration with fault compaction and slip localization. Our work indicates that fault weakening induced by fluid pressurization can overcome rate-strengthening friction, resulting in fast acceleration and earthquake slip, and it points to modifications of the standard model for earthquake nucleation to account for the effect of fluid overpressure and to accurately predict the seismic risk associated with fluid injection.
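
    The mechanism in the first paragraph is the Coulomb failure criterion with pore pressure reducing the effective normal stress. A minimal sketch (illustrative stress values and a generic friction coefficient of 0.6, not the experimental conditions) of the pore pressure at which a fault held at constant shear stress reaches failure:

    ```python
    def coulomb_margin(tau, sigma_n, pore_p, mu=0.6, cohesion=0.0):
        """Shear stress minus Coulomb strength (MPa); >= 0 means at/past failure.

        Effective normal stress = sigma_n - pore_p (all stresses in MPa).
        """
        return tau - (cohesion + mu * (sigma_n - pore_p))

    def critical_pressure(tau, sigma_n, mu=0.6, cohesion=0.0):
        """Pore pressure at which the fault reaches the failure condition."""
        return sigma_n - (tau - cohesion) / mu

    # Constant shear stress, as in the creep experiments: raise pore pressure
    # until the Coulomb failure condition is met.
    tau, sigma_n = 5.0, 15.0                    # illustrative values, MPa
    p_star = critical_pressure(tau, sigma_n)    # 15 - 5/0.6, about 6.7 MPa
    ```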

  9. The next new Madrid earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, W.

    1988-01-01

    Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.

  10. If pandas scream, an earthquake is coming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magida, P.

    Feature article: Use of the behavior of animals to predict weather has spanned several ages and dozens of countries. While animals may behave in diverse ways to indicate weather changes, they all tend to behave in more or less the same way before earthquakes. The geophysical community in the U.S. has begun testing animal behavior before earthquakes. It has been determined that animals have the potential to act as accurate geosensors to detect earthquakes before they occur. (5 drawings)

  11. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.
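
    The claim that a 545 m surface grid supports simulations accurate up to 1.0 Hz is consistent with the usual rule of thumb of several grid points per minimum wavelength. A back-of-the-envelope check (the minimum shear-wave speed and points-per-wavelength count here are assumptions, not values from the paper):

    ```python
    def max_resolved_frequency(v_min, dx, pts_per_wavelength=5.0):
        """Rule-of-thumb highest resolvable frequency: the minimum wavelength
        lambda_min = v_min / f_max must span several grid points."""
        return v_min / (pts_per_wavelength * dx)

    # Assumed minimum shear-wave speed of 2700 m/s near the surface.
    f_max = max_resolved_frequency(v_min=2700.0, dx=545.0)   # close to 1 Hz
    ```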

  12. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  13. Parallelization of the Coupled Earthquake Model

    NASA Technical Reports Server (NTRS)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, tsunami prediction over the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot determine whether a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  14. The California Earthquake Advisory Plan: A history

    USGS Publications Warehouse

    Roeloffs, Evelyn A.; Goltz, James D.

    2017-01-01

    Since 1985, the California Office of Emergency Services (Cal OES) has issued advisory statements to local jurisdictions and the public following seismic activity that scientists on the California Earthquake Prediction Evaluation Council view as indicating elevated probability of a larger earthquake in the same area during the next several days. These advisory statements are motivated by statistical studies showing that about 5% of moderate earthquakes in California are followed by larger events within a 10-km, five-day space-time window (Jones, 1985; Agnew and Jones, 1991; Reasenberg and Jones, 1994). Cal OES issued four earthquake advisories from 1985 to 1989. In October, 1990, the California Earthquake Advisory Plan formalized this practice, and six Cal OES Advisories have been issued since then. This article describes that protocol’s scientific basis and evolution.
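
    The 10-km, five-day space-time window behind these statistics is straightforward to operationalize: scan a catalog for events followed by a larger event inside the window. A sketch with a toy catalog (haversine distances; the cited studies use more careful catalog treatment):

    ```python
    from datetime import datetime, timedelta
    from math import asin, cos, radians, sin, sqrt

    def km_between(lat1, lon1, lat2, lon2):
        """Great-circle (haversine) distance in km."""
        la1, lo1, la2, lo2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((la2 - la1) / 2) ** 2 + cos(la1) * cos(la2) * sin((lo2 - lo1) / 2) ** 2
        return 2.0 * 6371.0 * asin(sqrt(a))

    def flag_foreshocks(catalog, dist_km=10.0, days=5.0):
        """Return events followed by a larger event within the space-time window.

        catalog: iterable of (time, lat, lon, mag) tuples, any order.
        """
        evts = sorted(catalog)
        flagged = []
        for i, (t1, la1, lo1, m1) in enumerate(evts):
            for t2, la2, lo2, m2 in evts[i + 1:]:
                if (t2 - t1) > timedelta(days=days):
                    break  # catalog is time-sorted; later events are too late
                if m2 > m1 and km_between(la1, lo1, la2, lo2) <= dist_km:
                    flagged.append((t1, la1, lo1, m1))
                    break
        return flagged

    # Toy catalog: an M 5.1 followed two days later, a few km away, by an M 6.0.
    cat = [
        (datetime(1989, 8, 1), 37.0, -122.0, 5.1),
        (datetime(1989, 8, 3), 37.03, -122.03, 6.0),
        (datetime(1989, 9, 1), 36.0, -121.0, 4.8),   # isolated event
    ]
    flagged = flag_foreshocks(cat)
    ```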

  15. Implications of next generation attenuation ground motion prediction equations for site coefficients used in earthquake resistant design

    USGS Publications Warehouse

    Borcherdt, Roger D.

    2014-01-01

    Proposals are developed to update Tables 11.4-1 and 11.4-2 of Minimum Design Loads for Buildings and Other Structures published as American Society of Civil Engineers Structural Engineering Institute standard 7-10 (ASCE/SEI 7–10). The updates are mean next generation attenuation (NGA) site coefficients inferred directly from the four NGA ground motion prediction equations used to derive the maximum considered earthquake response maps adopted in ASCE/SEI 7–10. Proposals include the recommendation to use straight-line interpolation to infer site coefficients at intermediate values of VS30 (average shear velocity to 30-m depth). The NGA coefficients are shown to agree well with adopted site coefficients at low levels of input motion (0.1 g) and those observed from the Loma Prieta earthquake. For higher levels of input motion, the majority of the adopted values are within the 95% epistemic-uncertainty limits implied by the NGA estimates, with the exceptions being the mid-period site coefficient, Fv, for site class D and the short-period coefficient, Fa, for site class C, both of which are slightly less than the corresponding 95% limit. The NGA database shows that the median VS30 value of 913 m/s for site class B is more typical than 760 m/s as a value to characterize firm to hard rock sites as the uniform ground condition for future maximum considered earthquake response ground motion estimates. Future updates of NGA ground motion prediction equations can be incorporated easily into future adjustments of adopted site coefficients using procedures presented herein.
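
    The recommended straight-line interpolation of site coefficients at intermediate VS30 values can be sketched directly; the tabulated pairs below are illustrative placeholders, not the actual ASCE/SEI 7-10 or NGA values:

    ```python
    def interp_site_coeff(vs30, table):
        """Straight-line interpolation of a site coefficient at vs30 (m/s).

        table: (vs30, coefficient) pairs sorted by increasing vs30; queries
        outside the tabulated range are clamped to the end values.
        """
        if vs30 <= table[0][0]:
            return table[0][1]
        if vs30 >= table[-1][0]:
            return table[-1][1]
        for (v0, f0), (v1, f1) in zip(table, table[1:]):
            if v0 <= vs30 <= v1:
                return f0 + (f1 - f0) * (vs30 - v0) / (v1 - v0)

    # Illustrative short-period coefficients keyed to nominal vs30 values;
    # softer (lower-vs30) sites amplify more at low input motion.
    fa_table = [(180.0, 1.6), (360.0, 1.2), (760.0, 1.0)]
    fa = interp_site_coeff(270.0, fa_table)   # midway between 180 and 360 m/s
    ```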

  16. Laboratory generated M -6 earthquakes

    USGS Publications Warehouse

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
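
    The consistency of mm-scale sources with magnitudes near M -6 and stress drops of a few MPa can be checked with two standard relations: the Hanks-Kanamori moment-magnitude definition and the circular-crack stress-drop formula. A quick sketch:

    ```python
    def moment_from_mw(mw):
        """Seismic moment M0 (N*m) from moment magnitude (Hanks & Kanamori)."""
        return 10 ** (1.5 * mw + 9.1)

    def source_radius(m0, stress_drop_pa):
        """Circular-crack radius (m) from dsigma = 7 * M0 / (16 * r**3)."""
        return (7.0 * m0 / (16.0 * stress_drop_pa)) ** (1.0 / 3.0)

    # An M -6 event with a 3 MPa stress drop implies a source a few mm across,
    # consistent with the mm-scale patches reported above.
    r = source_radius(moment_from_mw(-6.0), 3e6)   # roughly 6 mm
    ```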

  17. Reproductive health and access to healthcare facilities: risk factors for depression and anxiety in women with an earthquake experience

    PubMed Central

    2011-01-01

    Background The reproductive and mental health of women contributes significantly to their overall well-being. Three of the eight Millennium Development Goals are directly related to reproductive and sexual health while mental disorders make up three of the ten leading causes of disease burden in low and middle-income countries. Among mental disorders, depression and anxiety are two of the most prevalent. In the context of slower progress in achieving Millennium Development Goals in developing countries and the ever-increasing man-made and natural disasters in these areas, it is important to understand the association between reproductive health and mental health among women with post-disaster experiences. Methods This was a cross-sectional study with a sample of 387 women of reproductive age (15-49 years) randomly selected from the October 2005 earthquake affected areas of Pakistan. Data on reproductive health was collected using the Centers for Disease Control reproductive health assessment toolkit. Depression and anxiety were measured using the Hopkins Symptom Checklist-25, while earthquake experiences were captured using the Harvard Trauma Questionnaire. The association of either depression or anxiety with socio-demographic variables, earthquake experiences, reproductive health and access to health facilities was estimated using multivariate logistic regression. Results Post-earthquake reproductive health events together with economic deprivation, lower family support and poorer access to health care facilities explained a significant proportion of differences in the experiencing of clinical levels of depression and anxiety. For instance, women losing resources for subsistence, separation from family and experiencing reproductive health events such as having a stillbirth, having had an abortion, having had abnormal vaginal discharge or having had genital ulcers, were at significant risk of depression and anxiety. 
Conclusion The relationship between women's post-earthquake

  18. Reproductive health and access to healthcare facilities: risk factors for depression and anxiety in women with an earthquake experience.

    PubMed

    Anwar, Jasim; Mpofu, Elias; Matthews, Lynda R; Shadoul, Ahmed Farah; Brock, Kaye E

    2011-06-30

    The reproductive and mental health of women contributes significantly to their overall well-being. Three of the eight Millennium Development Goals are directly related to reproductive and sexual health while mental disorders make up three of the ten leading causes of disease burden in low and middle-income countries. Among mental disorders, depression and anxiety are two of the most prevalent. In the context of slower progress in achieving Millennium Development Goals in developing countries and the ever-increasing man-made and natural disasters in these areas, it is important to understand the association between reproductive health and mental health among women with post-disaster experiences. This was a cross-sectional study with a sample of 387 women of reproductive age (15-49 years) randomly selected from the October 2005 earthquake affected areas of Pakistan. Data on reproductive health was collected using the Centers for Disease Control reproductive health assessment toolkit. Depression and anxiety were measured using the Hopkins Symptom Checklist-25, while earthquake experiences were captured using the Harvard Trauma Questionnaire. The association of either depression or anxiety with socio-demographic variables, earthquake experiences, reproductive health and access to health facilities was estimated using multivariate logistic regression. Post-earthquake reproductive health events together with economic deprivation, lower family support and poorer access to health care facilities explained a significant proportion of differences in the experiencing of clinical levels of depression and anxiety. For instance, women losing resources for subsistence, separation from family and experiencing reproductive health events such as having a stillbirth, having had an abortion, having had abnormal vaginal discharge or having had genital ulcers, were at significant risk of depression and anxiety. The relationship between women's post-earthquake mental health and

  19. Teacher Guidelines for Helping Students after an Earthquake

    ERIC Educational Resources Information Center

    National Child Traumatic Stress Network, 2013

    2013-01-01

    Being in an earthquake is very frightening, and the days, weeks, and months following are very stressful. Most families recover over time, especially with the support of relatives, friends, and their community. But different families may have different experiences during and after the earthquake, including the experience of aftershocks which may…

  20. An interdisciplinary approach to study Pre-Earthquake processes

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Pulinets, S. A.; Hattori, K.; Taylor, P. T.

    2017-12-01

    We will summarize a multi-year research effort on wide-ranging observations of pre-earthquake processes. Based on space and ground data, we present some new results relevant to the existence of pre-earthquake signals. Over the past 15-20 years there has been a major revival of interest in pre-earthquake studies in Japan, Russia, China, the EU, Taiwan and elsewhere. Recent large-magnitude earthquakes in Asia and Europe have shown the importance of these various studies in the search for earthquake precursors, whether for forecasting or prediction. Some new results were obtained from modeling of the atmosphere-ionosphere connection and analyses of seismic records (foreshocks/aftershocks), geochemical, electromagnetic, and thermodynamic processes related to stress changes in the lithosphere, along with their statistical and physical validation. This cross-disciplinary approach could advance our understanding of the physics of earthquakes and the phenomena that precede their energy release. We also present the potential impact of these interdisciplinary studies on earthquake predictability. A detailed summary of our approach and that of several international researchers will be part of this session and will subsequently be published in a new AGU/Wiley volume. The book is part of the Geophysical Monograph series and is intended to show the variety of parameters (seismic, atmospheric, geochemical, and historical) involved in this important field of research, bringing this knowledge and awareness to a broader geosciences community.

  1. Development of the Japanese National Disaster Medical System and Experiences during the Great East Japan Earthquake

    PubMed Central

    Homma, Masato

    2015-01-01

    After the Great Hanshin-Awaji Earthquake in 1995, the Japanese national disaster medical system (NDMS) was developed. It mainly consists of four components, namely, a disaster base hospital, an emergency medical information system, a disaster medical assistance team (DMAT), and national aeromedical evacuation (AE). The NDMS was tested for the first time in a real disaster situation during the Great East Japan Earthquake in 2011. Two airports and one base were appointed as DMAT gathering places, and approximately 393 DMAT members divided into 78 teams were transported by Japan Air Self-Defense Force (JASDF) aircraft to two AE staging bases the following day. Staging care units were installed at Hanamaki Airport, Fukushima Airport, and the Japan Ground Self-Defense Force Camp Kasuminome, and 69, 14 and 24 DMAT teams were placed at those locations, respectively. In total, 19 patients were evacuated using JASDF fixed-wing aircraft. Important issues requiring attention became clear through the experiences of the Great East Japan Earthquake and will be discussed in this paper. PMID:26306054

  2. Real-time neural network earthquake profile predictor

    DOEpatents

    Leach, R.R.; Dowla, F.U.

    1996-02-06

    A neural network has been developed that uses first-arrival energy to predict the characteristics of impending earthquake seismograph signals. The propagation of ground motion energy through the earth is a highly nonlinear function. This is due to different forms of ground motion as well as to changes in the elastic properties of the media throughout the propagation path. The neural network is trained using seismogram data from earthquakes. Presented with a previously unseen earthquake, the neural network produces a profile of the complete earthquake signal using data from the first seconds of the signal. This offers a significant advance in the real-time monitoring, warning, and subsequent hazard minimization of catastrophic ground motion. 17 figs.
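
    The patent's network details are not given in this abstract. Purely as an illustration of the idea (learn a map from features of the first seconds of a signal to a coarse profile of the full signal), here is a minimal one-hidden-layer network trained by gradient descent on synthetic stand-in data, not on real seismograms:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: inputs play the role of first-arrival features,
    # targets the role of a coarse 4-point profile of the complete signal.
    X = rng.normal(size=(200, 8))
    true_W = rng.normal(size=(8, 4))
    Y = np.tanh(X @ true_W)

    # One-hidden-layer MLP trained with plain full-batch gradient descent.
    W1 = rng.normal(scale=0.1, size=(8, 32)); b1 = np.zeros(32)
    W2 = rng.normal(scale=0.1, size=(32, 4)); b2 = np.zeros(4)
    lr = 0.05
    losses = []
    for _ in range(300):
        H = np.tanh(X @ W1 + b1)          # hidden layer
        P = H @ W2 + b2                   # predicted profile
        err = P - Y
        losses.append(float((err ** 2).mean()))
        # Backpropagation of the mean-squared-error gradient.
        dP = 2.0 * err / len(X)
        dW2 = H.T @ dP; db2 = dP.sum(0)
        dH = dP @ W2.T * (1.0 - H ** 2)   # tanh derivative
        dW1 = X.T @ dH; db1 = dH.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    ```

    The training loss should fall steadily, showing that even this toy setup can learn a nonlinear feature-to-profile mapping.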

  3. Real-time neural network earthquake profile predictor

    DOEpatents

    Leach, Richard R.; Dowla, Farid U.

    1996-01-01

    A neural network has been developed that uses first-arrival energy to predict the characteristics of impending earthquake seismograph signals. The propagation of ground motion energy through the earth is a highly nonlinear function. This is due to different forms of ground motion as well as to changes in the elastic properties of the media throughout the propagation path. The neural network is trained using seismogram data from earthquakes. Presented with a previously unseen earthquake, the neural network produces a profile of the complete earthquake signal using data from the first seconds of the signal. This offers a significant advance in the real-time monitoring, warning, and subsequent hazard minimization of catastrophic ground motion.

  4. Implications for earthquake risk reduction in the United States from the Kocaeli, Turkey, earthquake of August 17, 1999

    USGS Publications Warehouse

    ,

    2000-01-01

    This report documents implications for earthquake risk reduction in the U.S. The magnitude 7.4 earthquake caused 17,127 deaths, 43,953 injuries, and displaced more than 250,000 people from their homes. The report warns that similar disasters are possible in the United States where earthquakes of comparable size strike the heart of American urban areas. Another concern described in the report is the delayed emergency response that was caused by the inadequate seismic monitoring system in Turkey, a problem that contrasts sharply with rapid assessment and response to the September Chi-Chi earthquake in Taiwan. Additionally, the experience in Turkey suggests that techniques for forecasting earthquakes may be improving.

  5. An atlas of ShakeMaps for selected global earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul S.; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. 
The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  6. Testimonies to the L'Aquila earthquake (2009) and to the L'Aquila process

    NASA Astrophysics Data System (ADS)

    Kalenda, Pavel; Nemec, Vaclav

    2014-05-01

Confusion, misinformation, false solidarity, attempts to misuse geoethics, and other unethical activities in favour of the top Italian seismologists responsible for the poor and superficial evaluation of the situation six days before the earthquake: this characterizes the whole five-year period separating us from the horrible morning of April 6, 2009, in L'Aquila, which claimed 309 human victims. The first author of this presentation, a seismologist, had an unusual opportunity to visit the unfortunate city in April 2009. He received first-hand information that a genuine, scientifically based prediction had already existed for some shocks in the area on March 29 and 30, 2009. The author of that prediction, Gianpaolo Giuliani, was ordered to stop disseminating public information via the internet. A new prediction became known to him on March 31, the very day the "Commission of Great Risks" publicly assured the population that an imminent earthquake could be practically excluded. In reality, the members of the commission completely ignored the prediction, dismissing it as a false alarm by "somebody" (without even naming Giuliani). Giuliani's observations were of high scientific quality, and he predicted the L'Aquila earthquake in a professional way, for the first time in many years of observations. The anomalies that preceded the L'Aquila earthquake were detected at many places in Europe at the same time; the open question is which locality would have been identified as the potential focal area had Giuliani known of those other European observations. Deformation (and other) anomalies are observable before almost all global M8 earthquakes: earthquakes are preceded by deformation and are predictable. The testimony of the second author is based on many unfortunate personal experiences with representatives of the INGV Rome and their supporters from India and even Australia.
In July 2010, prosecutor Fabio Picuti charged the Commission

  7. Dynamic stress changes during earthquake rupture

    USGS Publications Warehouse

    Day, S.M.; Yu, G.; Wald, D.J.

    1998-01-01

    We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.

  8. U.S.-Japan Quake Prediction Research

    NASA Astrophysics Data System (ADS)

    Kisslinger, Carl; Mikumo, Takeshi; Kanamori, Hiroo

For the seventh time since 1964, a seminar on earthquake prediction has been convened under the U.S.-Japan Cooperation in Science Program. The purpose of the seminar was to provide an opportunity for researchers from the two countries to share recent progress and future plans in the continuing effort to develop the scientific basis for predicting earthquakes and practical means for implementing prediction technology as it emerges. Thirty-six contributors, 15 from Japan and 21 from the U.S., met in Morro Bay, Calif., September 12-14. The following day they traveled to nearby sections of the San Andreas fault, including the site of the Parkfield prediction experiment. The conveners of the seminar were Hiroo Kanamori, Seismological Laboratory, California Institute of Technology (Caltech), for the U.S., and Takeshi Mikumo, Disaster Prevention Research Institute, Kyoto University, for Japan. Funding for the participants came from the U.S. National Science Foundation and the Japan Society for the Promotion of Science, supplemented by other agencies in both countries.

  9. A landslide susceptibility prediction on a sample slope in Kathmandu Nepal associated with the 2015's Gorkha Earthquake

    NASA Astrophysics Data System (ADS)

    Kubota, Tetsuya; Prasad Paudel, Prem

    2016-04-01

In 2013, landslides induced by heavy rainfall occurred in the southern suburbs of Kathmandu, the capital of Nepal. These landslide slopes were shaken by the strong Gorkha Earthquake in April 2015 and appeared to have been destabilized again. To clarify their landslide susceptibility under earthquake loading, the stability of one of these slopes was analyzed with CSSDP (Critical Slip Surface analysis by Dynamic Programming, a limit-equilibrium approach based on the Janbu method) for various seismic accelerations observed around Kathmandu during the Gorkha Earthquake. The CSSDP automatically detects the slip surface with the minimum factor of safety (Fs) using dynamic programming theory. The geology in this area consists mainly of fragile schist and is prone to landslides. A field survey was conducted to obtain topographic data such as the ground-surface and slip-surface cross sections, and soil parameters were obtained by geotechnical tests on field samples. The slope showed the following distinctive stability characteristics: (1) under heavy rainfall it collapsed, with a factor of safety Fs < 1.0 (0.654 or more); (2) with the seismic acceleration of 0.15 G (147 gal) observed around Kathmandu, Fs = 1.34; (3) with a possible local seismic acceleration of 0.35 G (343 gal) estimated at Kathmandu, Fs = 0.989; if the landslide were very shallow and the slope covered with cedars, Fs would rise to 1.055 owing to the root-reinforcement contribution to soil strength; (4) with no seismic acceleration and no rainfall, Fs = 1.75. These results are consistent with the actual landslide occurrence in this area, given the maximum seismic acceleration of about 0.15 G estimated in the vicinity of Kathmandu during the Gorkha Earthquake, and they indicate the landslide susceptibility of slopes in this area under strong earthquakes. In this situation, it is possible to predict
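The role of the seismic coefficient in a limit-equilibrium stability analysis can be illustrated with a simple pseudo-static infinite-slope sketch. This is a much cruder model than the CSSDP/Janbu analysis used in the study, and all parameter values below are illustrative assumptions, not the paper's measured data:

```python
import math

def pseudo_static_fs(c, phi_deg, gamma, z, beta_deg, kh=0.0):
    """Factor of safety of an infinite slope under a horizontal
    pseudo-static seismic coefficient kh.
    c: cohesion (kPa); phi_deg: friction angle (deg);
    gamma: unit weight (kN/m^3); z: depth to slip surface (m);
    beta_deg: slope angle (deg)."""
    b, phi = math.radians(beta_deg), math.radians(phi_deg)
    # Normal and shear stress on the slip plane per unit area,
    # including the horizontal inertial force kh * W.
    sigma = gamma * z * (math.cos(b) ** 2 - kh * math.sin(b) * math.cos(b))
    tau = gamma * z * (math.sin(b) * math.cos(b) + kh * math.cos(b) ** 2)
    return (c + sigma * math.tan(phi)) / tau

# Illustrative parameters for a weathered-schist slope (assumed values).
for kh in (0.0, 0.15, 0.35):
    print(kh, round(pseudo_static_fs(c=10.0, phi_deg=30.0, gamma=20.0,
                                     z=3.0, beta_deg=35.0, kh=kh), 3))
```

Fs decreases monotonically as kh grows, mirroring the qualitative pattern reported in the abstract: stable without shaking, marginal at the stronger estimated acceleration.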

  10. Media exposure related to the 2008 Sichuan Earthquake predicted probable PTSD among Chinese adolescents in Kunming, China: A longitudinal study.

    PubMed

    Yeung, Nelson C Y; Lau, Joseph T F; Yu, Nancy Xiaonan; Zhang, Jianping; Xu, Zhening; Choi, Kai Chow; Zhang, Qi; Mak, Winnie W S; Lui, Wacy W S

    2018-03-01

This study examined the prevalence and the psychosocial predictors of probable PTSD among Chinese adolescents in Kunming (approximately 444 miles from the epicenter), China, who were indirectly exposed to the Sichuan Earthquake in 2008. Using a longitudinal study design, primary and secondary school students (N = 3577) in Kunming completed questionnaires at baseline (June 2008) and 6 months afterward (December 2008) in classroom settings. Participants' exposure to earthquake-related imagery and content, perceptions and emotional reactions related to the earthquake, and posttraumatic stress symptoms were measured. Univariate and forward stepwise multivariable logistic regression models were fit to identify significant predictors of probable PTSD at the 6-month follow-up. Prevalence of probable PTSD (Children's Revised Impact of Event Scale score ≥30) among the participants at baseline and 6-month follow-up was 16.9% and 11.1%, respectively. In the multivariable analysis, those who were frequently exposed to distressful imagery, had experienced at least two types of negative life events, perceived that teachers were distressed due to the earthquake, believed that the earthquake resulted from damage to the ecosystem, and felt apprehensive and emotionally disturbed due to the earthquake reported a higher risk of probable PTSD at the 6-month follow-up (all ps < .05). Exposure to distressful media images, emotional responses, and disaster-related perceptions at baseline were found to be predictive of probable PTSD several months after indirect exposure to the event. Parents, teachers, and the mass media should be aware of the negative impacts of disaster-related media exposure on adolescents' psychological health. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Medical experience of a university hospital in Turkey after the 1999 Marmara earthquake

    PubMed Central

    Bulut, M; Fedakar, R; Akkose, S; Akgoz, S; Ozguc, H; Tokyay, R

    2005-01-01

Objectives: This study aimed to provide an overview of morbidity and mortality among patients admitted to the Hospital of the Medicine Faculty of Uludag University, Bursa, Turkey, after the 1999 Marmara earthquake. Methods: Retrospective analysis of the medical records of 645 earthquake victims. Patients' demographic data, diagnosis, dispositions, and prognosis were reviewed. Results: A total of 330 patients with earthquake-related injuries and illness admitted to our hospital were included and divided into three main groups: crush syndrome (n = 110), vital organ injuries (n = 57), and non-traumatic but earthquake-related illness (n = 55). Seventy-seven per cent of patients were hospitalised during the first three days after the earthquake. The rate of mortality associated with the crush syndrome, vital organ injury, and non-traumatic medical problems was 21% (23/110), 17.5% (10/57), and 9% (5/55), respectively. The overall mortality rate was 8% (50/645). Conclusions: In the first 24–48 hours after a major earthquake, hospital emergency departments are flooded with large numbers of patients. Among this patient load, those patients with crush syndrome or vital organ injuries are particularly at risk. Proper triage and prompt treatment of these seriously injured earthquake victims may decrease morbidity and mortality. It is hoped that this review of the challenges met after the Marmara earthquake and the lessons learned will be of use to emergency department physicians as well as hospital emergency planners in preparing for future natural disasters. PMID:15983085

  12. A seismoacoustic study of the 2011 January 3 Circleville earthquake

    NASA Astrophysics Data System (ADS)

    Arrowsmith, Stephen J.; Burlacu, Relu; Pankow, Kristine; Stump, Brian; Stead, Richard; Whitaker, Rod; Hayward, Chris

    2012-05-01

We report on a unique set of infrasound observations from a single earthquake, the 2011 January 3 Circleville earthquake (Mw 4.7, depth of 8 km), which was recorded by nine infrasound arrays in Utah. Based on an analysis of the signal arrival times and backazimuths at each array, we find that the infrasound arrivals at six arrays can be associated with the same source and that the source location is consistent with the earthquake epicentre. Results of propagation modelling indicate that the lack of associated arrivals at the remaining three arrays is due to path effects. Based on these findings we form the working hypothesis that the infrasound is generated by body waves causing the epicentral region to pump the atmosphere, akin to a baffled piston. To test this hypothesis, we have developed a numerical seismoacoustic model to simulate the generation of epicentral infrasound from earthquakes. We model the generation of seismic waves using a 3-D finite difference algorithm that accounts for the earthquake moment tensor, source time function, depth and local geology. The resultant acceleration-time histories on a 2-D grid at the surface then provide the initial conditions for modelling the near-field infrasonic pressure wave using the Rayleigh integral. Finally, we propagate the near-field source pressure through the Ground-to-Space atmospheric model using a time-domain Parabolic Equation technique. By comparing the resultant predictions with the six epicentral infrasound observations from the 2011 January 3 Circleville earthquake, we show that the observations agree well with our predictions. The predicted and observed amplitudes are within a factor of 2 (on average, the synthetic amplitudes are a factor of 1.6 larger than the observed amplitudes). In addition, arrivals are predicted at all six arrays where signals are observed, and importantly not predicted at the remaining three arrays. Durations are typically predicted to within a factor of 2, and in some cases
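The near-field step of a modeling chain like this, a Rayleigh integral over a baffled-piston-like vibrating surface, can be sketched as a discrete sum of retarded surface-acceleration contributions. This is a toy discretization with made-up grid values, not the authors' implementation:

```python
import math

def rayleigh_pressure(cells, accels, dt, receiver, rho=1.2, c=340.0):
    """Acoustic pressure above a vibrating surface via a discretized
    Rayleigh integral: p(t) = rho/(2*pi) * sum_i a_i(t - R_i/c) * dS_i / R_i.
    cells:    list of (x, y, dS) surface elements
    accels:   matching list of surface-acceleration time series (m/s^2)
    receiver: (x, y, z) observation point; rho, c: air density, sound speed."""
    nt = len(accels[0])
    p = [0.0] * nt
    rx, ry, rz = receiver
    for (x, y, dS), a in zip(cells, accels):
        R = math.sqrt((x - rx) ** 2 + (y - ry) ** 2 + rz ** 2)
        lag = int(round(R / (c * dt)))      # retarded-time sample shift
        for i in range(lag, nt):
            p[i] += rho / (2 * math.pi) * a[i - lag] * dS / R
    return p

# Single surface element pulsing once, receiver 340 m overhead:
# the pulse arrives after R/c = 1 s (one sample at dt = 1 s).
p = rayleigh_pressure([(0.0, 0.0, 1.0)], [[1.0, 0.0, 0.0, 0.0]],
                      dt=1.0, receiver=(0.0, 0.0, 340.0))
```

In a realistic run the cells and acceleration histories would come from the surface grid of the finite-difference seismic simulation rather than from hand-written values.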

  13. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    NASA Astrophysics Data System (ADS)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    getting a hit is N%" or "the probability of an earthquake is N%" involves specifying the assumptions made. Different plausible assumptions yield a wide range of estimates. In both seismology and sports, how to better predict future performance remains an important question.

  14. Predictability of catastrophic events: Material rupture, earthquakes, turbulence, financial crashes, and human birth

    PubMed Central

    Sornette, Didier

    2002-01-01

    We propose that catastrophic events are “outliers” with statistically different properties than the rest of the population and result from mechanisms involving amplifying critical cascades. We describe a unifying approach for modeling and predicting these catastrophic events or “ruptures,” that is, sudden transitions from a quiescent state to a crisis. Such ruptures involve interactions between structures at many different scales. Applications and the potential for prediction are discussed in relation to the rupture of composite materials, great earthquakes, turbulence, and abrupt changes of weather regimes, financial crashes, and human parturition (birth). Future improvements will involve combining ideas and tools from statistical physics and artificial/computational intelligence, to identify and classify possible universal structures that occur at different scales, and to develop application-specific methodologies to use these structures for prediction of the “crises” known to arise in each application of interest. We live on a planet and in a society with intermittent dynamics rather than a state of equilibrium, and so there is a growing and urgent need to sensitize students and citizens to the importance and impacts of ruptures in their multiple forms. PMID:11875205

  15. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  16. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), visualization should in theory make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  17. Real-time forecasting and predictability of catastrophic failure events: from rock failure to volcanoes and earthquakes

    NASA Astrophysics Data System (ADS)

    Main, I. G.; Bell, A. F.; Naylor, M.; Atkinson, M.; Filguera, R.; Meredith, P. G.; Brantut, N.

    2012-12-01

Accurate prediction of catastrophic brittle failure in rocks and in the Earth presents a significant challenge on theoretical and practical grounds. The governing equations are not known precisely, but are known to produce highly non-linear behaviour similar to that of near-critical dynamical systems, with a large and irreducible stochastic component due to material heterogeneity. In a laboratory setting, mechanical, hydraulic and rock-physical properties are known to change in systematic ways prior to catastrophic failure, often with significant non-Gaussian fluctuations about the mean signal at a given time, for example in the rate of remotely sensed acoustic emissions. The effectiveness of such signals in real-time forecasting has never before been tested in a controlled laboratory setting; previous work has often been qualitative in nature and subject to retrospective selection bias, though it has often been invoked as a basis for forecasting natural hazard events such as volcanoes and earthquakes. Here we describe a collaborative experiment in real-time data assimilation to explore the limits of predictability of rock failure in a best-case scenario. Data are streamed from a remote rock deformation laboratory to a user-friendly portal, where several proposed physical/stochastic models can be analysed in parallel in real time, using a variety of statistical fitting techniques, including least-squares regression, maximum-likelihood fitting, Markov-chain Monte Carlo and Bayesian analysis. The results are posted and regularly updated on the web site prior to catastrophic failure, to ensure a true and verifiable prospective test of forecasting power. Preliminary tests on synthetic data with known non-Gaussian statistics show how forecasting power is likely to evolve in the live experiments. 
In general the predicted failure time does converge on the real failure time, illustrating the bias associated with the 'benefit of hindsight' in retrospective analyses.
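A minimal version of such a prospective forecast is the inverse-rate extrapolation often applied to accelerating precursory signals: if the event rate follows r(t) = A/(tf - t) (a Voight-style materials-failure law with exponent 2, assumed here), its inverse decays linearly and crosses zero at the failure time tf. The sketch below uses synthetic, noise-free data, not the experiment's stream:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic accelerating event rate r(t) = A / (tf - t): the inverse
# rate is linear in t and reaches zero at the failure time tf.
tf, A = 100.0, 50.0
ts = list(range(10, 90, 5))
inv_rate = [(tf - t) / A for t in ts]

slope, intercept = fit_line(ts, inv_rate)
t_forecast = -intercept / slope      # zero crossing of the inverse rate
```

With real, noisy emission rates the forecast would be refitted as each new datum arrives, which is exactly where the maximum-likelihood and Bayesian machinery mentioned in the abstract earns its keep.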

  18. A Trial for Earthquake Prediction by Precise Monitoring of Deep Ground Water Temperature

    NASA Astrophysics Data System (ADS)

    Nasuhara, Y.; Otsuki, K.; Yamauchi, T.

    2006-12-01

A large earthquake is expected to occur off Miyagi prefecture, northeast Japan, within 20 years, with a probability of about 80%. In order to predict this earthquake, we have observed groundwater temperature in a borehole at Sendai city, 100 km west of the asperity. The borehole penetrates the fault zone of the NE-trending active reverse fault, the Nagamachi-Rifu fault zone, at 820 m depth. Our concept for the groundwater observation is that fault zones are natural amplifiers of crustal strain; accordingly, at 820 m depth we installed a very precise quartz temperature sensor with a resolution of 0.0002 deg. C. We confirmed that the observation system works normally by pumping tests and by the systematic temperature changes at different depths. Since observations started on June 20, 2004, we have found puzzling intermittent temperature fluctuations of two types: one with a period of 5-10 days and an amplitude of ca. 0.1 deg. C, and the other with a period of 11-21 days and an amplitude of ca. 0.2 deg. C. An examination using the product of the Grashof and Prandtl numbers shows that natural convection of water can occur in the borehole. However, since these temperature fluctuations are observed only at depths around 820 m, they likely represent hydrological behavior specific to the Nagamachi-Rifu fault zone. It is noteworthy that small temperature changes correlatable with the earth tide are superposed on the long-term, large-amplitude fluctuations. The amplitude on the days of the full moon and new moon is ca. 0.001 deg. C. The troughs of these temperature fluctuations always lag the peaks of the earth tide by about 6 hours. We interpret this as water in the borehole being drawn into the fault zone, on which tensional normal stress acts on the days of the full moon and new moon. The amplitude of the tidal crustal strain measured near our observation site is ca. 2×10^-8 strain. High frequency temperature noise of

  19. Critical behavior in earthquake energy dissipation

    NASA Astrophysics Data System (ADS)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

We explore bursty multiscale energy dissipation from earthquakes located between latitudes 29° S and 35.5° S and longitudes 69.501° W and 73.944° W (the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, the time development of earthquake radiated energy follows an algebraic form consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales in the probability distributions of sizes and lifetimes of the activity bursts within the scaling region. The power-law exponents describing the probability distributions suggest that most of the energy dissipation takes place in the largest bursts of activity, such as major earthquakes, as opposed to smaller activations, which contribute less significantly despite their greater relative frequency. The results provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity, and some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.

  20. Investigation of the relationship between ionospheric foF2 and earthquakes

    NASA Astrophysics Data System (ADS)

    Karaboga, Tuba; Canyilmaz, Murat; Ozcan, Osman

    2018-04-01

Variations of the ionospheric F2-region critical frequency (foF2) before earthquakes in the Japan area have been investigated statistically for the 1980-2008 period. Ionosonde data were taken from the Kokubunji station, which lies within the earthquake preparation zone of all the earthquakes considered. Standard-deviation and inter-quartile-range methods were applied to the foF2 data. Anomalous variations in foF2 were observed before earthquakes. These variations can be regarded as ionospheric precursors and may be useful for earthquake prediction.
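Inter-quartile-range screening of this kind is commonly implemented as median ± k·IQR bounds on the foF2 series. The sketch below is generic: the abstract does not give the windowing or threshold used in the study, so the value k = 1.5 and the sample values are assumptions:

```python
from statistics import median, quantiles

def iqr_anomalies(values, k=1.5):
    """Flag samples outside median(x) +/- k * IQR(x).
    Returns (index, value) pairs of candidate anomalies."""
    q1, _, q3 = quantiles(values, n=4)   # quartiles (default exclusive method)
    iqr = q3 - q1
    m = median(values)
    lo, hi = m - k * iqr, m + k * iqr
    return [(i, v) for i, v in enumerate(values) if not lo <= v <= hi]

# Quiet foF2-like values (MHz) with one large spike at the end.
foF2 = [4.8, 4.9, 5.0, 5.1, 5.2, 4.8, 4.9, 5.0, 5.1, 5.2, 9.0]
print(iqr_anomalies(foF2))
```

In precursor studies the bounds are usually computed over a sliding window of the preceding days so that the test tracks seasonal and solar-cycle variation rather than a single global baseline.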

  1. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    NASA Astrophysics Data System (ADS)

    Ohnaka, Mitiyasu

    2004-08-01

A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  2. Decision making biases in the communication of earthquake risk

    NASA Astrophysics Data System (ADS)

    Welsh, M. B.; Steacy, S.; Begg, S. H.; Navarro, D. J.

    2015-12-01

The L'Aquila verdict, with six scientists convicted of manslaughter, shocked the scientific community, leading to an urgent re-appraisal of communication methods for low-probability, high-impact events. Before the trial, a commission investigating the earthquake recommended that risk assessment be formalised via operational earthquake forecasts and that social scientists be enlisted to help develop communication strategies. Psychological research has identified numerous decision biases relevant to this, including hindsight bias, where people (after the fact) overestimate an event's predictability. This affects experts as well as naïve participants, as it relates to the ability to construct a plausible causal story rather than to the likelihood of the event. Another problem is availability, which causes overestimation of the likelihood of observed rare events due to their greater noteworthiness. This, however, is complicated by the 'description-experience' gap, whereby people underestimate probabilities for events they have not experienced. That is, people who have experienced strong earthquakes judge them more likely while those who have not judge them less likely, relative to actual probabilities. Finally, format changes alter people's decisions: people treat '1 in 10,000' as different from 0.01% despite their mathematical equivalence. Such effects fall under the broad term framing, which describes how different framings of the same event alter decisions. In particular, people's attitude to risk depends significantly on how scenarios are described. We examine the effect of these biases on the communication of change in risk. South Australian participants gave responses to scenarios describing familiar (bushfire) or unfamiliar (earthquake) risks. While bushfires are rare at any specific location, significant fire events occur each year and are extensively covered; by comparison, our study location (Adelaide) last had an M5 quake in 1954. 
Preliminary results suggest the description-experience

  3. A prototype of the procedure of strong ground motion prediction for intraslab earthquake based on characterized source model

    NASA Astrophysics Data System (ADS)

    Iwata, T.; Asano, K.; Sekiguchi, H.

    2011-12-01

We propose a prototype procedure for constructing source models for strong motion prediction during intraslab earthquakes based on the characterized source model (Irikura and Miyake, 2011). The key is the characterized source model, which is based on empirical scaling relationships for intraslab earthquakes and on the correspondence between the SMGA (strong motion generation area; Miyake et al., 2003) and the asperity (large-slip area). Iwata and Asano (2011) obtained empirical relationships of the rupture area (S) and the total asperity area (Sa) to the seismic moment (Mo), assuming a 2/3-power dependence of S and Sa on Mo:

S (km^2) = 6.57 × 10^-11 × Mo^(2/3) (Mo in N m) (1)
Sa (km^2) = 1.04 × 10^-11 × Mo^(2/3) (Mo in N m) (2)

Iwata and Asano (2011) also pointed out that the position and size of the SMGA approximately correspond to the asperity area for several intraslab events. Based on these empirical relationships, our procedure for constructing source models of intraslab earthquakes for strong motion prediction is as follows. [1] Give the seismic moment, Mo. [2] Obtain the total rupture area and the total asperity area from the empirical scaling relationships between S, Sa, and Mo given by Iwata and Asano (2011). [3] Assume square rupture and asperity areas. [4] Assume the source mechanism to be the same as that of small events in the source region. [5] Prepare plural scenarios covering a variety of asperity counts and rupture starting points. We apply this procedure by simulating strong ground motions for several observed events to confirm the methodology.
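Equations (1) and (2) are easy to exercise numerically. The sketch below converts a moment magnitude to Mo with the standard Hanks-Kanamori relation (log10 Mo = 1.5 Mw + 9.1, a conventional conversion not stated in the abstract) and evaluates both areas:

```python
def intraslab_areas(Mo):
    """Empirical intraslab scaling of Iwata and Asano (2011):
    rupture area S and total asperity area Sa (km^2) from Mo (N*m)."""
    S = 6.57e-11 * Mo ** (2 / 3)     # eq. (1)
    Sa = 1.04e-11 * Mo ** (2 / 3)    # eq. (2)
    return S, Sa

# Seismic moment of an Mw 7.0 event via the Hanks-Kanamori convention.
Mo = 10 ** (1.5 * 7.0 + 9.1)
S, Sa = intraslab_areas(Mo)
print(f"S = {S:.0f} km^2, Sa = {Sa:.0f} km^2, Sa/S = {Sa / S:.2f}")
```

Because both relations share the same Mo^(2/3) dependence, the ratio Sa/S is constant (about 0.16): the asperities occupy roughly 16% of the rupture area regardless of moment.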

  4. Continuous borehole strain and pore pressure in the near field of the 28 September 2004 M 6.0 parkfield, California, earthquake: Implications for nucleation, fault response, earthquake prediction and tremor

    USGS Publications Warehouse

    Johnston, M.J.S.; Borcherdt, R.D.; Linde, A.T.; Gladwin, M.T.

    2006-01-01

    Near-field observations of high-precision borehole strain and pore pressure show no indication of coherent accelerating strain or pore pressure during the weeks to seconds before the 28 September 2004 M 6.0 Parkfield earthquake. Minor changes in strain rate did occur at a few sites during the last 24 hr before the earthquake, but these changes are neither significant nor have the form expected for strain during slip coalescence initiating fault failure. Seconds before the event, strain is stable at the 10^-11 level. Final prerupture nucleation slip in the hypocentral region is constrained to have a moment less than 2 × 10^12 N m (M 2.2) and a source size less than 30 m. Ground displacement data indicate similar constraints. Localized rupture nucleation and runaway precludes useful prediction of damaging earthquakes. Coseismic dynamic strains of about 10 microstrain peak-to-peak were superimposed on volumetric strain offsets of about 0.5 microstrain to the northwest of the epicenter and about 0.2 microstrain to the southeast of the epicenter, consistent with right lateral slip. Observed strain and Global Positioning System (GPS) offsets can be simply fit with 20 cm of slip between 4 and 10 km on a 20-km segment of the fault north of Gold Hill (M0 = 7 × 10^17 N m). Variable slip inversion models using GPS data and seismic data indicate similar moments. Observed postseismic strain is 60% to 300% of the coseismic strain, indicating incomplete release of accumulated strain. No measurable change in fault zone compliance preceding or following the earthquake is indicated by stable earth tidal response. No indications of strain change accompany nonvolcanic tremor events reported prior to and following the earthquake.

  5. National Earthquake Hazards Reduction Program; time to expand

    USGS Publications Warehouse

    Steinbrugge, K.V.

    1990-01-01

    All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role? 

  6. Earthquakes

    MedlinePlus

    An earthquake is the sudden, rapid shaking of the earth, ... by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

  7. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. On the basis of the estimated cost of damage, one is most suitable for domestic events; the other, on the basis of estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lend to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. 
Useful alerts should
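The dual-criteria alert logic described above can be sketched as follows. The thresholds come from the abstract; the function name, and the choice to report the more severe of the two alert levels when both criteria are supplied, are our illustrative assumptions.

```python
def eis_alert(fatalities: int = 0, losses_usd: float = 0.0) -> str:
    """Earthquake Impact Scale sketch: map estimated fatalities and/or
    economic losses to a green/yellow/orange/red alert level."""
    levels = ["green", "yellow", "orange", "red"]

    def level_index(value, thresholds):
        # Count how many thresholds (yellow, orange, red) are reached.
        return sum(value >= t for t in thresholds)

    by_fatalities = level_index(fatalities, (1, 100, 1000))
    by_losses = level_index(losses_usd, (1e6, 1e8, 1e9))
    return levels[max(by_fatalities, by_losses)]

print(eis_alert(fatalities=0, losses_usd=5e5))   # green
print(eis_alert(fatalities=3))                   # yellow
print(eis_alert(fatalities=50, losses_usd=2e8))  # orange (loss criterion dominates)
print(eis_alert(fatalities=2500))                # red
```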

  8. Ground Motions Due to Earthquakes on Creeping Faults

    NASA Astrophysics Data System (ADS)

    Harris, R.; Abrahamson, N. A.

    2014-12-01

    We investigate the peak ground motions from the largest well-recorded earthquakes on creeping strike-slip faults in active-tectonic continental regions. Our goal is to evaluate whether the strong ground motions from earthquakes on creeping faults are smaller than those from earthquakes on locked faults. Smaller ground motions might be expected from earthquakes on creeping faults if the fault sections that strongly radiate energy are surrounded by patches of fault that predominantly absorb energy. For our study we used the ground motion data available in the PEER NGA-West2 database, and the ground motion prediction equations that were developed from the PEER NGA-West2 dataset. We analyzed data for the eleven largest well-recorded creeping-fault earthquakes, which ranged in magnitude from M 5.0 to 6.5. Our finding is that these earthquakes produced peak ground motions that are statistically indistinguishable from the peak ground motions produced by similar-magnitude earthquakes on locked faults. These findings may be implemented in earthquake hazard estimates for moderate-size earthquakes in creeping-fault regions. Further investigation is necessary to determine whether this result will also apply to larger earthquakes on creeping faults. Please also see: Harris, R.A., and N.A. Abrahamson (2014), Strong ground motions generated by earthquakes on creeping faults, Geophysical Research Letters, vol. 41, doi:10.1002/2014GL060228.

  9. A new algorithm to detect earthquakes outside the seismic network: preliminary results

    NASA Astrophysics Data System (ADS)

    Giudicepietro, Flora; Esposito, Antonietta Maria; Ricciolino, Patrizia

    2017-04-01

    We present a new technique for detecting earthquakes outside the seismic network, which are often a cause of failure in automatic analysis systems. Our goal is to develop a robust method that provides the discrimination result as quickly as possible. We discriminate local earthquakes from regional earthquakes, both recorded at the SGG station, equipped with short-period sensors and operated by Osservatorio Vesuviano (INGV) in the Southern Apennines (Italy). The technique uses a Multi-Layer Perceptron (MLP) neural network with an architecture composed of an input layer, a hidden layer, and a single-node output layer. We pre-processed the data using the Linear Predictive Coding (LPC) technique to extract the spectral features of the signals in a compact form. We performed several experiments, shortening the signal window length. In particular, we used windows of 4, 2 and 1 seconds containing the onset of the local and regional earthquakes. We used a dataset of 103 local earthquakes and 79 regional earthquakes, most of which occurred in Greece, Albania and Crete. We split the dataset into a training set, for the network training, and a testing set, to evaluate the network's discrimination capacity. In order to assess the network's stability, we repeated this procedure six times, randomly changing the composition of the training and testing sets and the initial weights of the net. We estimated the performance of this method by calculating the average of the correct-detection percentages obtained for each of the six permutations. The average performances are 99.02%, 98.04% and 98.53% for the experiments carried out on 4, 2 and 1 second signal windows, respectively. The results show that our method is able to recognize earthquakes outside the seismic network using only the first second of the seismic records, with a suitable percentage of correct detection. 
Therefore, this algorithm can be profitably used to make earthquake automatic
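The LPC pre-processing step can be illustrated with a generic Levinson-Durbin implementation. This is a sketch under our assumptions, not the authors' code; the synthetic AR(2) test signal and the model order are ours. The resulting coefficient vector for each signal window is the kind of compact spectral feature that would be fed to the MLP.

```python
import numpy as np

def lpc_coefficients(signal: np.ndarray, order: int) -> np.ndarray:
    """LPC coefficients a[0..order] (with a[0] = 1) via Levinson-Durbin."""
    n = len(signal)
    # Autocorrelation up to lag `order`
    r = np.array([signal[: n - k] @ signal[k:] for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + a[1:i] @ r[1:i][::-1]   # r[i] + sum_j a[j] * r[i-j]
        k = -acc / err                        # reflection coefficient
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        err *= 1.0 - k * k
    return a

# Demo on a synthetic AR(2) "seismogram": x[t] = 0.5 x[t-1] - 0.3 x[t-2] + noise,
# for which the recursion should recover a ≈ [1, -0.5, 0.3].
rng = np.random.default_rng(0)
noise = rng.standard_normal(20000)
x = np.zeros(20000)
for t in range(2, 20000):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + noise[t]
coeffs = lpc_coefficients(x, order=2)
print(np.round(coeffs, 2))
```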

  10. Prospective Validation of Pre-earthquake Atmospheric Signals and Their Potential for Short–term Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Ouzounov, Dimitar; Pulinets, Sergey; Hattori, Katsumi; Lee, Lou; Liu, Tiger; Kafatos, Menas

    2015-04-01

    We present the latest developments in multi-sensor observations of short-term pre-earthquake phenomena preceding major earthquakes. Our challenge question is: are such pre-earthquake atmospheric/ionospheric signals significant, and could they be useful for early warning of large earthquakes? To check the predictive potential of atmospheric pre-earthquake signals, we have started to validate anomalous ionospheric/atmospheric signals in retrospective and prospective modes. The integrated satellite and terrestrial framework (ISTF) is our method for validation and is based on a joint analysis of several physical and environmental parameters (satellite thermal infrared radiation (STIR), electron concentration in the ionosphere (GPS/TEC), radon/ion activities, air temperature, and seismicity patterns) that were found to be associated with earthquakes. The scientific rationale for the multidisciplinary analysis is based on the concept of Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) [Pulinets and Ouzounov, 2011], which explains the synergy of different geospace processes and anomalous variations, usually named short-term pre-earthquake anomalies. Our validation process consists of two steps: (1) a continuous retrospective analysis performed over two regions with high seismicity, Taiwan and Japan, for 2003-2009; (2) prospective testing of STIR anomalies with potential for M5.5+ events. The retrospective tests (100+ major earthquakes, M>5.9, Taiwan and Japan) show anomalous STIR behavior before all of these events, with false negatives close to zero. The false alarm ratio for false positives is less than 25%. The initial prospective testing for STIR shows a systematic appearance of anomalies in advance (1-30 days) of the M5.5+ events for Taiwan, Kamchatka-Sakhalin (Russia) and Japan. Our initial prospective results suggest that our approach shows a systematic appearance of atmospheric anomalies one to several days prior to the largest earthquakes. That feature could be
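The skill measures quoted above (false negatives close to zero, false alarm ratio below 25%) can be made concrete with a small sketch. The function and the counts below are our hypothetical illustration, not the authors' data.

```python
def forecast_scores(hits: int, misses: int, false_alarms: int) -> dict:
    """Basic skill scores for a binary alarm-based earthquake forecast."""
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    miss_rate = misses / (hits + misses)         # fraction of missed target events
    hit_rate = hits / (hits + misses)
    return {"FAR": far, "miss_rate": miss_rate, "hit_rate": hit_rate}

# Hypothetical counts consistent with the numbers quoted above:
# ~100 target events, near-zero false negatives, FAR under 25%.
print(forecast_scores(hits=100, misses=0, false_alarms=30))
```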

  11. Dynamic strains for earthquake source characterization

    USGS Publications Warehouse

    Barbour, Andrew J.; Crowell, Brendan W

    2017-01-01

    Strainmeters measure elastodynamic deformation associated with earthquakes over a broad frequency band, with detection characteristics that complement traditional instrumentation, but they are commonly used to study slow transient deformation along active faults and at subduction zones, for example. Here, we analyze dynamic strains at Plate Boundary Observatory (PBO) borehole strainmeters (BSM) associated with 146 local and regional earthquakes from 2004–2014, with magnitudes from M 4.5 to 7.2. We find that peak values in seismic strain can be predicted from a general regression against distance and magnitude, with improvements in accuracy gained by accounting for biases associated with site–station effects and source–path effects, the latter exhibiting the strongest influence on the regression coefficients. To account for the influence of these biases in a general way, we include crustal‐type classifications from the CRUST1.0 global velocity model, which demonstrates that high‐frequency strain data from the PBO BSM network carry information on crustal structure and fault mechanics: earthquakes nucleating offshore on the Blanco fracture zone, for example, generate consistently lower dynamic strains than earthquakes around the Sierra Nevada microplate and in the Salton trough. Finally, we test our dynamic strain prediction equations on the 2011 M 9 Tohoku‐Oki earthquake, specifically continuous strain records derived from triangulation of 137 high‐rate Global Navigation Satellite System Earth Observation Network stations in Japan. Moment magnitudes inferred from these data and the strain model are in agreement when Global Positioning System subnetworks are unaffected by spatial aliasing.
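A general regression of peak strain against distance and magnitude, of the kind described above, can be sketched as an ordinary least-squares fit. The functional form, the coefficients, and the data below are synthetic illustrations, not the published model (which also accounts for site-station and source-path biases).

```python
import numpy as np

# Synthetic illustration of:  log10(peak strain) = c0 + c1*M + c2*log10(R)
rng = np.random.default_rng(1)
n = 200
mag = rng.uniform(4.5, 7.2, n)                  # magnitude range as in the study
dist_km = rng.uniform(20.0, 500.0, n)           # hypothetical distances
true_c = np.array([-9.0, 1.0, -1.5])            # hypothetical coefficients
log_strain = (true_c[0] + true_c[1] * mag + true_c[2] * np.log10(dist_km)
              + 0.2 * rng.standard_normal(n))   # observational scatter

# Ordinary least squares for the three regression coefficients
X = np.column_stack([np.ones(n), mag, np.log10(dist_km)])
c_hat, *_ = np.linalg.lstsq(X, log_strain, rcond=None)
print(np.round(c_hat, 2))
```

With enough well-distributed records the fit recovers the generating coefficients; in practice the residuals are then examined for the site and path biases discussed above.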

  12. Brady's Geothermal Field DAS Earthquake Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurt Feigl

    The submitted data correspond to the vibration caused by a 3.4 M earthquake and captured by the DAS horizontal and vertical arrays during the PoroTomo Experiment. Earthquake information: M 4.3 - 23 km ESE of Hawthorne, Nevada. Time: 2016-03-21 07:37:10 (UTC). Location: 38.479 N 118.366 W. Depth: 9.9 km

  13. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

    PubMed

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

    A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and of the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher rates of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a decrease in the expected number of casualties.

  14. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    PubMed Central

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467

  15. Probing failure susceptibilities of earthquake faults using small-quake tidal correlations.

    PubMed

    Brinkman, Braden A W; LeBlanc, Michael; Ben-Zion, Yehuda; Uhl, Jonathan T; Dahmen, Karin A

    2015-01-27

    Mitigating the devastating economic and humanitarian impact of large earthquakes requires signals for forecasting seismic events. Daily tide stresses were previously thought to be insufficient for use as such a signal. Recently, however, they have been found to correlate significantly with small earthquakes, just before large earthquakes occur. Here we present a simple earthquake model to investigate whether correlations between daily tidal stresses and small earthquakes provide information about the likelihood of impending large earthquakes. The model predicts that intervals of significant correlations between small earthquakes and ongoing low-amplitude periodic stresses indicate increased fault susceptibility to large earthquake generation. The results agree with the recent observations of large earthquakes preceded by time periods of significant correlations between smaller events and daily tide stresses. We anticipate that incorporating experimentally determined parameters and fault-specific details into the model may provide new tools for extracting improved probabilities of impending large earthquakes.
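One standard way to quantify a correlation between small events and a periodic stress, in the spirit of the observations discussed above, is a Schuster test on event phases. This is our illustrative choice, not the paper's model; the event catalogs below are synthetic.

```python
import numpy as np

def schuster_p_value(event_times: np.ndarray, period: float) -> float:
    """Schuster test: probability that the observed phase alignment of events
    with a periodic stress would arise from uniformly random timing.
    A small p-value suggests modulation of the events by the periodic stress."""
    phases = 2.0 * np.pi * (event_times % period) / period
    d2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
    return float(np.exp(-d2 / len(event_times)))

period = 12.42  # hours: principal lunar semidiurnal tide (M2)
rng = np.random.default_rng(2)
uniform_times = rng.uniform(0.0, 1000.0, 500)  # null case: no tidal modulation
locked_times = period * (np.arange(500) + 0.05 * rng.standard_normal(500))

print(schuster_p_value(uniform_times, period))
print(schuster_p_value(locked_times, period))  # tiny: events phase-locked to the tide
```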

  16. Lessons of L'Aquila for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. 
Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  17. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. 
(c) All operational models should be evaluated

  18. Management of limb fractures in a teaching hospital: comparison between Wenchuan and Yushu earthquakes.

    PubMed

    Min, Li; Tu, Chong-qi; Liu, Lei; Zhang, Wen-li; Yi, Min; Song, Yue-ming; Huang, Fu-guo; Yang, Tian-fu; Pei, Fu-xing

    2013-01-01

    To comparatively analyze the medical records of patients with limb fractures, as well as rescue strategies, in the Wenchuan and Yushu earthquakes, so as to provide references for post-earthquake rescue. We retrospectively investigated 944 patients sustaining limb fractures, including 891 in the Wenchuan earthquake and 53 in the Yushu earthquake, who were admitted to West China Hospital (WCH) of Sichuan University. In the Wenchuan earthquake, WCH met three peaks of limb-fracture patient influx, on post-earthquake days (PED) 2, 8 and 14, respectively. Between PED 3-14, 585 patients were transferred from WCH to other hospitals outside Sichuan Province. In the Yushu earthquake, the maximum influx of limb-fracture patients happened on PED 3, and no one was transferred to other hospitals. In both the Wenchuan and Yushu earthquakes, most limb fractures were caused by blunt strike and crush/burying. In the Wenchuan earthquake, there were 396 (396/942, 42.0%) open limb fractures, including 28 Gustilo I, 201 Gustilo II and 167 Gustilo III injuries. In the Yushu earthquake, the incidence of open limb fracture was much lower (6/61, 9.8%). The percentage of patients with acute complications in the Wenchuan earthquake (167/891, 18.7%) was much higher than that in the Yushu earthquake (5/53, 3.8%). In the Wenchuan earthquake rescue, 1,018 surgeries were done, comprising debridement in 376, internal fixation in 283, external fixation in 119, and vacuum sealing drainage in 117, etc. Among the 64 surgeries in the Yushu earthquake rescue, internal fixation for limb fracture was mostly adopted. All patients received proper treatment and survived, except one who died of multiple organ failure in the Wenchuan earthquake. Provision of suitable and sufficient medical care in a catastrophe can only be achieved by construction of a sophisticated national disaster medical system, prediction of injury types and numbers, and confirmation of participating hospitals' exact roles. 
Based on the valuable rescue experiences

  19. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  20. Fault lubrication during earthquakes.

    PubMed

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s^-1) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s^-1. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  1. The 2015 Illapel earthquake, central Chile: A type case for a characteristic earthquake?

    NASA Astrophysics Data System (ADS)

    Tilmann, F.; Zhang, Y.; Moreno, M.; Saul, J.; Eckelmann, F.; Palo, M.; Deng, Z.; Babeyko, A.; Chen, K.; Baez, J. C.; Schurr, B.; Wang, R.; Dahm, T.

    2016-01-01

    On 16 September 2015, the MW = 8.2 Illapel megathrust earthquake ruptured the Central Chilean margin. Combining inversions of displacement measurements and seismic waveforms with high frequency (HF) teleseismic backprojection, we derive a comprehensive description of the rupture, which also predicts deep ocean tsunami wave heights. We further determine moment tensors and obtain accurate depth estimates for the aftershock sequence. The earthquake nucleated near the coast but then propagated to the north and updip, attaining a peak slip of 5-6 m. In contrast, HF seismic radiation is mostly emitted downdip of the region of intense slip and arrests earlier than the long period rupture, indicating smooth slip along the shallow plate interface in the final phase. A superficially similar earthquake in 1943 with a similar aftershock zone had a much shorter source time function, which matches the duration of HF seismic radiation in the recent event, indicating that the 1943 event lacked the shallow slip.

  2. Modeling fast and slow earthquakes at various scales

    PubMed Central

    IDE, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes. PMID:25311138

  3. Modeling fast and slow earthquakes at various scales.

    PubMed

    Ide, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes.

  4. Update of the Graizer-Kalkan ground-motion prediction equations for shallow crustal continental earthquakes

    USGS Publications Warehouse

    Graizer, Vladimir; Kalkan, Erol

    2015-01-01

    A ground-motion prediction equation (GMPE) for computing medians and standard deviations of peak ground acceleration and 5-percent damped pseudo spectral acceleration response ordinates of maximum horizontal component of randomly oriented ground motions was developed by Graizer and Kalkan (2007, 2009) to be used for seismic hazard analyses and engineering applications. This GMPE was derived from the greatly expanded Next Generation of Attenuation (NGA)-West1 database. In this study, Graizer and Kalkan’s GMPE is revised to include (1) an anelastic attenuation term as a function of quality factor (Q0) in order to capture regional differences in large-distance attenuation and (2) a new frequency-dependent sedimentary-basin scaling term as a function of depth to the 1.5-km/s shear-wave velocity isosurface to improve ground-motion predictions for sites on deep sedimentary basins. The new model (GK15), developed to be simple, is applicable to the western United States and other regions with shallow continental crust in active tectonic environments and may be used for earthquakes with moment magnitudes 5.0–8.0, distances 0–250 km, average shear-wave velocities 200–1,300 m/s, and spectral periods 0.01–5 s. Directivity effects are not explicitly modeled but are included through the variability of the data. Our aleatory variability model captures inter-event variability, which decreases with magnitude and increases with distance. The mixed-effects residuals analysis shows that the GK15 reveals no trend with respect to the independent parameters. The GK15 is a significant improvement over Graizer and Kalkan (2007, 2009), and provides a demonstrable, reliable description of ground-motion amplitudes recorded from shallow crustal earthquakes in active tectonic regions over a wide range of magnitudes, distances, and site conditions.
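Before evaluating any GMPE it is good practice to enforce its stated range of applicability. The guard below simply encodes the GK15 limits quoted above; the function itself is our sketch, not part of the published model.

```python
def gk15_applicable(mw: float, dist_km: float, vs30_ms: float, period_s: float) -> bool:
    """True if the inputs fall within GK15's stated range of applicability:
    Mw 5.0-8.0, distance 0-250 km, Vs30 200-1,300 m/s, period 0.01-5 s."""
    return (5.0 <= mw <= 8.0
            and 0.0 <= dist_km <= 250.0
            and 200.0 <= vs30_ms <= 1300.0
            and 0.01 <= period_s <= 5.0)

print(gk15_applicable(6.5, 30.0, 760.0, 1.0))   # True
print(gk15_applicable(8.8, 30.0, 760.0, 1.0))   # False: Mw out of range
```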

  5. A grounded theory study of 'turning into a strong nurse': Earthquake experiences and perspectives on disaster nursing education.

    PubMed

    Li, Yan; Turale, Sue; Stone, Teresa E; Petrini, Marcia

    2015-09-01

    While Asia has the dubious distinction of being the world's most natural-disaster-prone area, disaster nursing education and training are sparse in many Asian countries, especially China where this study took place. To explore the earthquake disaster experiences of Chinese nurses and develop a substantive theory of earthquake disaster nursing that will help inform future development of disaster nursing education. A qualitative study employing grounded theory, informed by symbolic interactionism. Fifteen Chinese registered nurses from five hospitals in Jiangxi Province who undertook relief efforts after the 2008 Wenchuan Earthquake. Data were collected in 2012-2013 in digitally-recorded, semi-structured, in-depth interviews and reflective field notes, and analyzed using Glaser's grounded theory method. Participants were unprepared educationally and psychologically for their disaster work. Supporting the emergent theory of "working in that terrible environment" was the core category of "turning into a strong nurse", a process of three stages: "going to the disaster"; "immersing in the disaster"; and "trying to let disaster experiences fade away". The participants found themselves thrust into "terrible" scenes of destruction, experienced personal dangers and ethical dilemmas, and tried the best they could to help survivors, communities and themselves, with limited resources and demanding professional work. Our rich findings confirm those of other studies in China and elsewhere: attention must be paid to disaster education and training for nurses, as well as to the mental health of nurses who work in disaster areas. The emergent theory helps to inform nurse educators, researchers, leaders and policy makers in China and elsewhere in developing strategies to better prepare nurses for future disasters, and to assist communities to prepare for and recover after earthquake disasters. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Earthquake Rate Model 2.2 of the 2007 Working Group for California Earthquake Probabilities, Appendix D: Magnitude-Area Relationships

    USGS Publications Warehouse

    Stein, Ross S.

    2007-01-01

    Summary: To estimate the down-dip coseismic fault dimension, W, the Executive Committee has chosen the Nazareth and Hauksson (2004) method, which uses the 99% depth of background seismicity to assign W. For the predicted earthquake magnitude-fault area scaling used to estimate the maximum magnitude of an earthquake rupture from a fault's length, L, and W, the Committee has assigned equal weight to the Ellsworth-B (Working Group on California Earthquake Probabilities, 2003) and Hanks and Bakun (2002) (as updated in 2007) equations. The former uses a single relation; the latter uses a bilinear relation whose slope changes at M = 6.65 (A = 537 km²).
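The two magnitude-area relations the Committee weights can be written compactly. The coefficients below are the commonly cited published forms (Ellsworth-B: M = 4.2 + log10 A; Hanks and Bakun 2002: M = 3.98 + log10 A for A ≤ 537 km², else M = 3.07 + (4/3) log10 A); the 2007 update cited in the abstract may adjust them slightly, so treat these values as a sketch:

```python
import math

def ellsworth_b(area_km2):
    """Ellsworth-B relation: M = 4.2 + log10(A), A in km^2."""
    return 4.2 + math.log10(area_km2)

def hanks_bakun_2002(area_km2):
    """Hanks & Bakun (2002) bilinear relation; the slope changes at
    A = 537 km^2. Coefficients are the 2002 values, not the 2007 update."""
    if area_km2 <= 537.0:
        return 3.98 + math.log10(area_km2)
    return 3.07 + (4.0 / 3.0) * math.log10(area_km2)
```

With these rounded coefficients the two branches meet at A = 537 km² to within about 0.0001 magnitude units (M ≈ 6.71), so the relation is effectively continuous across the slope break.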

  7. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Usable and realistic ground-motion maps for urban areas are generated either from the assumption of a "reference earthquake" or directly, showing values of macroseismic intensity generated by a damaging, real earthquake. In this study, applying a deterministic approach, an earthquake scenario in macroseismic intensity (a "model" earthquake scenario) is generated for the city of Sofia. The deterministic "model" intensity scenario, based on the assumption of a "reference earthquake", is compared with a scenario based on observed macroseismic effects caused by the damaging 2012 earthquake (MW5.6). The difference between observed (Io) and predicted (Ip) intensity values is analyzed.

  8. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    NASA Astrophysics Data System (ADS)

    Lee, Kyungbook; Song, Seok Goo

    2017-09-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  9. How personal earthquake experience impacts on the Stroop interference effect: an event-related potential study.

    PubMed

    Qiu, Jiang; Su, Yanhua; Li, Hong; Wei, Dongtao; Tu, Shen; Zhang, Qinglin

    2010-11-01

    Event-related brain potentials (ERPs) were measured while 24 Chinese subjects performed the classical Stroop task. All of the subjects had experienced the great Sichuan earthquake (5/12), with 12 people in each of the Far (Chengdu city) and Close (Deyang city) earthquake-experience groups. The behavioral data showed that the Stroop task yielded a robust Stroop interference effect, as indexed by longer RTs for incongruent than congruent color words, in both the Chengdu and Deyang groups. Scalp ERP data showed that incongruent stimuli elicited a more negative ERP deflection (N400-600; Stroop interference effect) than did congruent stimuli between 400 and 600 ms in the Chengdu group, while the Stroop interference ERP effect was not found in the Deyang group. Dipole source analysis localized the generator of the N400-600 to the right prefrontal cortex (PFC), possibly related to conflict monitoring and cognitive control. Copyright © 2010 Society for Psychophysiological Research.

  10. Distant, delayed and ancient earthquake-induced landslides

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Torgoev, Almaz; Braun, Anika; Schlögel, Romy; Micu, Mihai

    2016-04-01

    On the basis of a new classification of seismically induced landslides we outline particular effects related to the delayed and distant triggering of landslides, which cannot be predicted by state-of-the-art methods. First, for about a dozen events the 'predicted' extent of the affected area is clearly underestimated. The most problematic cases are those for which far-distant triggering of landslides has been reported, such as for the 1988 Saguenay earthquake. In Central Asia reports of such cases are known for areas marked by a thick cover of loess. One possible contributing effect could be a low-frequency resonance of the thick soils induced by distant earthquakes, especially those in the Pamir - Hindu Kush seismic region. Such deep-focus and high-magnitude (>>7) earthquakes are also found in Europe, first of all in the Vrancea region (Romania). For this area and others in Central Asia we computed landslide event sizes related to scenario earthquakes with M>7.5. The second particular and challenging type of triggering is the one delayed with respect to the main earthquake event: case histories have been reported for the Racha earthquake in 1991, when several larger landslides only started moving 2 or 3 days after the main shock. Similar observations were also made after other earthquake events in the U.S., such as after the 1906 San Francisco, the 1949 Tacoma, the 1959 Hebgen Lake and the 1983 Borah Peak earthquakes. Here, we will present a series of detailed examples of (partly monitored) mass movements in Central Asia that mainly developed after earthquakes, some even several weeks after the main shock: e.g. the Tektonik and Kainama landslides triggered in 1992 and 2004, respectively. We believe that the development of the massive failures is a consequence of the opening of tension cracks during the seismic shaking and their filling with water during precipitation that followed the earthquakes. 
The third particular aspect analysed here is the use of large

  11. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes

    PubMed Central

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

    Background: A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities’ preparedness and response capabilities and to mitigate future consequences. Methods: An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model’s algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. Results: The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher rates of at-risk populations were found to be more vulnerable in this regard. Conclusion: The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to an occurrence of an earthquake could lead to a possible decrease in the expected number of casualties. PMID:26959647
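The mechanics of folding an epidemiological effect measure into a baseline casualty rate can be sketched as an odds-scale adjustment, as in a logistic regression where risk factors enter multiplicatively. The odds ratios and baseline rate below are placeholders, not the values estimated in the study:

```python
def adjust_probability(p_baseline, odds_ratios):
    """Scale a baseline injury probability by a set of odds ratios.

    Works on the odds scale, mirroring a logistic regression in which
    each risk factor contributes a multiplicative effect. Values passed
    in here are illustrative placeholders, not the study's estimates.
    """
    odds = p_baseline / (1.0 - p_baseline)
    for or_k in odds_ratios:
        odds *= or_k
    return odds / (1.0 + odds)

# Hypothetical example: a 2% baseline severe-injury rate raised for a
# population with two elevated risk factors (OR 1.8 and OR 1.4).
p_adjusted = adjust_probability(0.02, [1.8, 1.4])
```

Because the adjustment is applied on the odds scale, combined factors can raise the expected count of severe outcomes while barely moving low-probability categories, which is consistent with the pattern of shifts the abstract reports.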

  12. Electromagnetic earthquake triggering phenomena: State-of-the-art research and future developments

    NASA Astrophysics Data System (ADS)

    Zeigarnik, Vladimir; Novikov, Victor

    2014-05-01

    Unique pulsed power systems based on solid-propellant magneto-hydrodynamic (MHD) generators, developed in Russia in the 1970s with an output of 10-500 MW and an operation duration of 10 to 15 s, were applied to active electromagnetic monitoring of the Earth's crust to explore its deep structure, to oil and gas electrical prospecting, and to geophysical studies for earthquake prediction, owing to their high specific power, portability, and capability of operation under harsh climatic conditions. The most interesting and promising results were obtained during geophysical experiments at test sites located in the Pamir and Northern Tien Shan mountains, where, after injection of a 1.5-2.5 kA electric current into the Earth's crust through a 4-km-long emitting dipole, regional seismicity variations were observed (an increase in the number of weak earthquakes within a week). Laboratory experiments performed by different teams of the Institute of Physics of the Earth, the Joint Institute for High Temperatures, and the Research Station of the Russian Academy of Sciences, observing the acoustic emission behavior of stressed rock samples during their processing by electric pulses, demonstrated similar patterns: a burst of acoustic emission (formation of cracks) after application of a current pulse to the sample. Based on the field and laboratory studies, it was supposed that a new kind of earthquake triggering had been observed, the electromagnetic initiation of weak seismic events, which may be used for safe man-made electromagnetic release of accumulated tectonic stresses and, consequently, for earthquake hazard mitigation. For verification of this hypothesis some additional field experiments were carried out at the Bishkek geodynamic proving ground using the pulsed ERGU-600 facility, which drives a 600 A electric current in the emitting dipole. 
An analysis of spatio-temporal redistribution of weak regional seismicity after ERGU-600 pulses, as well as a response

  13. Emergency surgical care delivery in post-earthquake Haiti: Partners in Health and Zanmi Lasante experience.

    PubMed

    McIntyre, Thomas; Hughes, Christopher D; Pauyo, Thierry; Sullivan, Stephen R; Rogers, Selwyn O; Raymonville, Maxi; Meara, John G

    2011-04-01

    The earthquake that struck Haiti on 12 January 2010 caused significant devastation to the country and to the existing healthcare infrastructure in both urban and rural areas. Most hospital and health care facilities in Port-au-Prince and the surrounding areas were significantly damaged or destroyed. Consequently, large groups of Haitians fled Port-au-Prince for rural areas to seek emergency medical and surgical care. In partnership with the Haitian Ministry of Health, Partners in Health (PIH) and Zanmi Lasante (ZL) have developed and maintained a network of regional and district hospitals in rural Haiti for over twenty-five years. This PIH/ZL system was ideally situated to accommodate the increased need for emergent surgical care in the immediate quake aftermath. The goal of the present study was to provide a cross-sectional assessment of surgical need and care delivery across PIH/ZL facilities after the earthquake in Haiti. We conducted a retrospective review of hospital case logs and operative records over the course of the three weeks immediately following the earthquake. Roughly 3,000 patients were seen at PIH/ZL sites by a combination of Haitian and international surgical teams. During that period 513 emergency surgical cases were logged. Other than wound debridement, the most commonly performed procedure was fixation of long bone fractures, which constituted approximately one third of all surgical procedures. There was a significant demand for emergent surgical care after the earthquake in Haiti. The PIH/ZL hospital system played a critical role in addressing this acutely increased burden of surgical disease, and it allowed large numbers of Haitians to receive needed surgical services. Our experience reinforces that access to essential surgical care is a core pillar of public health.

  14. Haiti and the Earthquake: Examining the Experience of Psychological Stress and Trauma

    ERIC Educational Resources Information Center

    Risler, Ed; Kintzle, Sara; Nackerud, Larry

    2015-01-01

    For approximately 35 seconds on January 12, 2010, an earthquake measuring 7.0 on the Richter scale struck the small Caribbean nation of Haiti. This research used a pre-experimental one-shot posttest design to examine the incidence of posttraumatic stress disorder (PTSD) and associated trauma symptomatology from the earthquake experienced by a sample of…

  15. Classification of Earthquake-triggered Landslide Events - Review of Classical and Particular Cases

    NASA Astrophysics Data System (ADS)

    Braun, A.; Havenith, H. B.; Schlögel, R.

    2016-12-01

    Seismically induced landslides often contribute to a significant degree to the losses related to earthquakes. The identification of the possible extents of landslide-affected areas can help to target emergency measures when an earthquake occurs, or to improve the resilience of inhabited areas and critical infrastructure in zones of high seismic hazard. Moreover, landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes in paleoseismic studies, allowing us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. Inspired by classical reviews of earthquake-induced landslides, e.g. by Keefer or Jibson, we present here a review of factors contributing to earthquake-triggered slope failures based on an 'event-by-event' classification approach. The objective of this analysis is to enable the short-term prediction of earthquake-triggered landslide event sizes, in terms of the number of landslides and the size of the affected area, right after an earthquake event occurs. Five main factors, 'Intensity', 'Fault', 'Topographic energy', 'Climatic conditions' and 'Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970), the combination and relative weights of the factors were calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be cross-checked. 
We present cases where our prediction model performs well and discuss particular cases

  16. The 26 January 2001 M 7.6 Bhuj, India, earthquake: Observed and predicted ground motions

    USGS Publications Warehouse

    Hough, S.E.; Martin, S.; Bilham, R.; Atkinson, G.M.

    2002-01-01

    Although local and regional instrumental recordings of the devastating 26 January 2001 Bhuj earthquake are sparse, the distribution of macroseismic effects can provide important constraints on the mainshock ground motions. We compiled available news accounts describing damage and other effects and interpreted them to obtain modified Mercalli intensities (MMIs) at >200 locations throughout the Indian subcontinent. These values were then used to map the intensity distribution throughout the subcontinent using a simple mathematical interpolation method. Although preliminary, the maps reveal several interesting features. Within the Kachchh region, the most heavily damaged villages are concentrated toward the western edge of the inferred fault, consistent with western directivity. Significant sediment-induced amplification is also suggested at a number of locations around the Gulf of Kachchh to the south of the epicenter. Away from the Kachchh region, intensities were clearly amplified significantly in areas that are along rivers, within deltas, or on coastal alluvium, such as mudflats and salt pans. In addition, we use fault-rupture parameters inferred from teleseismic data to predict shaking intensity at distances of 0-1000 km. We then convert the predicted hard-rock ground-motion parameters to MMI by using a relationship (derived from Internet-based intensity surveys) that assigns MMI based on the average effects in a region. The predicted MMIs are typically lower by 1-3 units than those estimated from news accounts, although they do predict near-field ground motions of approximately 80% g and potentially damaging ground motions on hard-rock sites to distances of approximately 300 km. 
For the most part, this discrepancy is consistent with the expected effect of sediment response, but it could also reflect other factors, such as unusually high building vulnerability in the Bhuj region and a tendency for media accounts to focus on the most dramatic damage, rather than
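A "simple mathematical interpolation method" of the kind described can be illustrated with inverse-distance weighting over the point intensity assignments. This is a generic sketch of one common interpolator, not necessarily the scheme the authors used:

```python
def idw_intensity(points, x, y, power=2.0):
    """Inverse-distance-weighted interpolation of intensity observations.

    points: list of (x, y, mmi) tuples; returns the weighted-average MMI
    at (x, y), with weights 1/d**power. Illustrative only; the study's
    actual interpolation scheme is not specified in the abstract.
    """
    num = den = 0.0
    for px, py, mmi in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return mmi  # exactly at an observation point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * mmi
        den += w
    return num / den

# Two hypothetical observations: the midpoint gets their simple average.
obs = [(0.0, 0.0, 8.0), (1.0, 0.0, 6.0)]
mid = idw_intensity(obs, 0.5, 0.0)
```

IDW honors the observed values exactly and decays smoothly between them, which is why it is a common first choice for mapping sparse macroseismic data.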

  17. Earthquake-triggered liquefaction in Southern Siberia and surroundings: a base for predictive models and seismic hazard estimation

    NASA Astrophysics Data System (ADS)

    Lunina, Oksana

    2016-04-01

    The forms and location patterns of soil liquefaction induced by earthquakes in southern Siberia, Mongolia, and northern Kazakhstan from 1950 through 2014 have been investigated, using field methods and a database of coseismic effects created as a GIS MapInfo application with a handy input box for large data arrays. Statistical analysis of the data has revealed regional relationships between the magnitude (Ms) of an earthquake and the maximum distance of its environmental effect from the epicenter and from the causative fault (Lunina et al., 2014). For the largest event (Ms = 8.1), the estimated limiting distance to the fault is 130 km, about 3.5 times shorter than the limiting distance to the epicenter, which is 450 km. Moreover, the farther a site lies from the fault, the fewer liquefaction cases occur: 93% of them are within 40 km of the causative fault. Analysis of liquefaction locations relative to the nearest faults in southern East Siberia shows the distances to be within 8 km, with 69% of all cases within 1 km. As a result, predictive models have been created for locations of seismic liquefaction, assuming a fault pattern for some parts of the Baikal rift zone. Based on our field and worldwide data, equations have been suggested to relate the maximum sizes of liquefaction-induced clastic dikes (maximum width, visible maximum height and intensity index of clastic dikes) to Ms and to local shaking intensity on the MSK-64 macroseismic intensity scale (Lunina and Gladkov, 2015). The obtained results provide a basis for modeling the distribution of the geohazard for the purposes of prediction and for estimating earthquake parameters from liquefaction-induced clastic dikes. The author would like to express their gratitude to the Institute of the Earth's Crust, Siberian Branch of the Russian Academy of Sciences, for providing laboratory facilities for this research, and to the Russian Science Foundation for financial support (Grant 14-17-00007).
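Regional relationships between magnitude and the maximum distance of an environmental effect are typically fit as log-linear regressions of the form log10(R_max) = a·Ms + b. A minimal least-squares sketch on synthetic data (the magnitudes, distances, and recovered coefficients below are illustrative, not the study's dataset or results):

```python
import math

def fit_log_linear(ms_values, rmax_km):
    """Ordinary least-squares fit of log10(R_max) = a * Ms + b."""
    ys = [math.log10(r) for r in rmax_km]
    n = len(ms_values)
    mx = sum(ms_values) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in ms_values)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ms_values, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Synthetic illustration: distances generated from a known log-linear law,
# log10(R) = 0.5 * Ms - 1.0, which the fit should recover exactly.
ms = [5.0, 6.0, 7.0, 8.0]
rmax = [10 ** (0.5 * m - 1.0) for m in ms]
a, b = fit_log_linear(ms, rmax)
```

Fitting in log-distance space is what makes the limiting-distance curves straight lines on the usual magnitude-distance diagrams for coseismic effects.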

  18. Complex earthquake rupture and local tsunamis

    USGS Publications Warehouse

    Geist, E.L.

    2002-01-01

    In contrast to far-field tsunami amplitudes that are fairly well predicted by the seismic moment of subduction zone earthquakes, there exists significant variation in the scaling of local tsunami amplitude with respect to seismic moment. From a global catalog of tsunami runup observations, this variability is greatest for the most frequently occurring tsunamigenic subduction zone earthquakes in the magnitude range of 7 < Mw < 8.5. Variability in local tsunami runup scaling can be ascribed to tsunami source parameters that are independent of seismic moment: variations in the water depth in the source region, the combination of higher slip and lower shear modulus at shallow depth, and rupture complexity in the form of heterogeneous slip distribution patterns. The focus of this study is on the effect that rupture complexity has on the local tsunami wave field. A wide range of slip distribution patterns are generated using a stochastic, self-affine source model that is consistent with the falloff of far-field seismic displacement spectra at high frequencies. The synthetic slip distributions generated by the stochastic source model are discretized and the vertical displacement fields from point source elastic dislocation expressions are superimposed to compute the coseismic vertical displacement field. For shallow subduction zone earthquakes it is demonstrated that self-affine irregularities of the slip distribution result in significant variations in local tsunami amplitude. The effects of rupture complexity are less pronounced for earthquakes at greater depth or along faults with steep dip angles. For a test region along the Pacific coast of central Mexico, peak nearshore tsunami amplitude is calculated for a large number (N = 100) of synthetic slip distribution patterns, all with identical seismic moment (Mw = 8.1). Analysis of the results indicates that for earthquakes of a fixed location, geometry, and seismic moment, peak nearshore tsunami amplitude can vary by a
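A stochastic, self-affine slip distribution can be sketched in one dimension by spectral synthesis: summing harmonics with random phases and a power-law amplitude spectrum (here amplitude ∝ k⁻², in the spirit of "k-squared" slip models). This is an illustrative toy, not the authors' source model, and real rupture generators work in two dimensions with calibrated spectra:

```python
import math
import random

def self_affine_slip(n=128, decay=2.0, seed=0):
    """1-D self-affine slip profile via spectral synthesis.

    Sums cosines whose amplitudes fall off as k**-decay, with random
    phases, then shifts the profile so the minimum slip is zero.
    Toy sketch only: the decay exponent and grid size are assumptions.
    """
    rng = random.Random(seed)
    n_modes = n // 2 - 1
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_modes)]
    slip = []
    for i in range(n):
        x = i / n
        s = sum(k ** -decay * math.cos(2.0 * math.pi * k * x + phases[k - 1])
                for k in range(1, n_modes + 1))
        slip.append(s)
    lo = min(slip)
    return [s - lo for s in slip]  # non-negative slip everywhere

profile = self_affine_slip()
```

Regenerating with different seeds yields an ensemble of slip patterns that share the same spectral falloff but differ in where the slip concentrates, which is the kind of variability the N = 100 tsunami experiment exploits.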

  19. Surface Rupture Effects on Earthquake Moment-Area Scaling Relations

    NASA Astrophysics Data System (ADS)

    Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro

    2017-09-01

    Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment (M0) and rupture area (A) currently in use for ground motion prediction in Japan features a transition regime of the form M0 ∝ A², between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M0-A scaling relations for strike-slip earthquakes.
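In the self-similar small-earthquake regime (constant stress drop), moment scales as M0 ∝ A^(3/2); for a circular crack the classical form is M0 = (16/7)·Δσ·a³ with a = sqrt(A/π). A quick sketch of that branch only (the transition and W-model branches need additional, model-specific parameters; the 3 MPa stress drop is an assumed typical value):

```python
import math

def moment_circular_crack(area_km2, stress_drop_mpa=3.0):
    """Seismic moment (N*m) for a circular crack of given area.

    Uses M0 = (16/7) * stress_drop * a**3 with a = sqrt(A / pi),
    i.e. the constant-stress-drop (self-similar) scaling M0 ~ A^(3/2).
    """
    a_m = math.sqrt(area_km2 * 1.0e6 / math.pi)  # radius in metres
    return (16.0 / 7.0) * stress_drop_mpa * 1.0e6 * a_m ** 3

def moment_magnitude(m0_nm):
    """Moment magnitude from moment in N*m: Mw = (2/3)(log10 M0 - 9.05)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.05)

# A 100 km^2 rupture at ~3 MPa stress drop lands near Mw 6.
mw = moment_magnitude(moment_circular_crack(100.0))
```

Against this A^(3/2) baseline, the abstract's transition regime (M0 ∝ A²) grows moment faster with area, which is exactly the counter-intuitive feature the paper traces back to surface-rupture effects.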

  20. Investigation of intraplate seismicity near the sites of the 2012 major strike-slip earthquakes in the eastern Indian Ocean through a passive-source OBS experiment

    NASA Astrophysics Data System (ADS)

    Guo, L.; Lin, J.; Yang, H.

    2017-12-01

    The 11 April 2012 Mw8.6 earthquake off the coast of Sumatra in the eastern Indian Ocean was the largest strike-slip earthquake ever recorded. The 2012 mainshock and its aftershock sequences were associated with complex slip partitioning and earthquake interactions in an oblique convergent system, in a new plate boundary zone between the Indian and Australian plates. The detailed processes of the earthquake interactions and their correlation with seafloor geological structure, however, are still poorly known. During March-April 2017, an array of broadband OBSs (ocean-bottom seismometers) was deployed, for the first time, near the epicenter region of the 2012 earthquake sequence. During post-expedition data processing, we identified 70 global earthquakes from the National Earthquake Information Center (NEIC) catalog that occurred during our OBS deployment period. We then picked P and S waves in the seismic records and analyzed their arrival times. We further identified and analyzed multiple local earthquakes and examined their relationship to the observed seafloor structure (fracture zones, seafloor faults, etc.) and the state of stress in this region of the eastern Indian Ocean. The ongoing analyses of the data obtained from this unique seismic experiment are expected to provide important constraints on the large-scale intraplate deformation in this part of the eastern Indian Ocean.

  1. [Comment on Earthquake precursors: Banished forever?] Comment: Unpredictability of earthquakes-Truth or fiction?

    NASA Astrophysics Data System (ADS)

    Lomnitz, Cinna

    I was delighted to read Alexander Gusev's opinions on what he calls the “unpredictability paradigm” of earthquakes (Eos, February 10, 1998, p. 71). I always enjoy hearing from a good friend in the pages of Eos. I immediately looked up “paradigm” in my Oxford Dictionary and found this: paradigm n 1) set of all the different forms of a word: verb paradigms. 2) Type of something; pattern; model: a paradigm for others to copy. I wonder whether Sasha Gusev actually believes that branding earthquake prediction a “proven nonscience” [Geller, 1997] is a paradigm for others to copy. As for me, I choose to refrain from climbing on board this particular bandwagon for the following reasons.

  2. Do weak global stresses synchronize earthquakes?

    NASA Astrophysics Data System (ADS)

    Bendick, R.; Bilham, R.

    2017-08-01

    Insofar as slip in an earthquake is related to the strain accumulated near a fault since a previous earthquake, and this process repeats many times, the earthquake cycle approximates an autonomous oscillator. Its asymmetric slow accumulation of strain and rapid release is quite unlike the harmonic motion of a pendulum and need not be time predictable, but still resembles a class of repeating systems known as integrate-and-fire oscillators, whose behavior has been shown to demonstrate a remarkable ability to synchronize to either external or self-organized forcing. Given sufficient time and even very weak physical coupling, the phases of sets of such oscillators, with similar though not necessarily identical period, approach each other. Topological and time series analyses presented here demonstrate that earthquakes worldwide show evidence of such synchronization. Though numerous studies demonstrate that the composite temporal distribution of major earthquakes in the instrumental record is indistinguishable from random, the additional consideration of event renewal interval serves to identify earthquake groupings suggestive of synchronization that are absent in synthetic catalogs. We envisage the weak forces responsible for clustering originate from lithospheric strain induced by seismicity itself, by finite strains over teleseismic distances, or by other sources of lithospheric loading such as Earth's variable rotation. For example, quasi-periodic maxima in rotational deceleration are accompanied by increased global seismicity at multidecadal intervals.
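The integrate-and-fire picture can be made concrete with a toy pulse-coupled model: each oscillator's phase grows linearly toward a threshold; the leader fires and resets, nudging the others by ε, and any oscillator pushed over the threshold fires with the group ("absorption"). This is a deliberately simplified sketch (identical periods, additive coupling, no extra kicks from absorbed oscillators), not the topological analysis used in the paper:

```python
def fire_once(phases, eps=0.05):
    """Advance phase oscillators to the next firing event.

    Phases grow linearly toward a threshold of 1. The leading oscillator
    fires and resets to 0; every other oscillator gains eps, and any
    pushed to the threshold is absorbed (resets with the leader).
    Toy model: identical periods, absorbed oscillators emit no kicks.
    """
    dt = 1.0 - max(phases)              # time until the leader fires
    advanced = [p + dt for p in phases]
    out = []
    for p in advanced:
        if p >= 1.0 - 1e-12:
            out.append(0.0)             # the leader fires
        elif p + eps >= 1.0:
            out.append(0.0)             # absorbed: fires with the leader
        else:
            out.append(p + eps)         # nudged toward the threshold
    return out

# Two oscillators whose phases differ by less than eps lock after one event.
locked = fire_once([0.96, 0.99])
```

Absorption is the mechanism by which even weak coupling eventually produces the phase clustering the abstract describes: once two oscillators fire together in this toy model, they remain synchronized indefinitely.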

  3. A hypothesis for delayed dynamic earthquake triggering

    USGS Publications Warehouse

    Parsons, T.

    2005-01-01

    It's uncertain whether more near-field earthquakes are triggered by static or dynamic stress changes. This ratio matters because static earthquake interactions are increasingly incorporated into probabilistic forecasts. Recent studies were unable to demonstrate all predictions from the static-stress-change hypothesis, particularly seismicity rate reductions. However, current dynamic stress change hypotheses do not explain delayed earthquake triggering and Omori's law. Here I show numerically that if seismic waves can alter some frictional contacts in neighboring fault zones, then dynamic triggering might cause delayed triggering and an Omori-law response. The hypothesis depends on faults following a rate/state friction law, and on seismic waves changing the mean critical slip distance (Dc) at nucleation zones.
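The ingredients named here, a rate/state friction law with a critical slip distance Dc, can be written down directly: μ = μ0 + a·ln(V/V0) + b·ln(V0·θ/Dc), with the aging-law state evolution dθ/dt = 1 − Vθ/Dc. A small forward-Euler sketch showing the state variable relaxing toward its steady state θss = Dc/V (all parameter values below are illustrative laboratory-scale numbers, not values from the paper):

```python
import math

def evolve_state(v, dc, theta0, t_end, dt=0.01):
    """Integrate the aging law d(theta)/dt = 1 - v*theta/dc (forward Euler)."""
    theta = theta0
    t = 0.0
    while t < t_end:
        theta += dt * (1.0 - v * theta / dc)
        t += dt
    return theta

def friction(v, theta, mu0=0.6, a=0.01, b=0.015, v0=1e-6, dc=1e-5):
    """Dieterich-Ruina rate/state friction coefficient (assumed parameters)."""
    return mu0 + a * math.log(v / v0) + b * math.log(v0 * theta / dc)

# With v = 1e-6 m/s and dc = 1e-5 m, the steady state is theta_ss = dc/v
# = 10 s; the state relaxes there over a few characteristic times dc/v.
theta = evolve_state(v=1e-6, dc=1e-5, theta0=1.0, t_end=100.0)
```

In this framework the hypothesis in the abstract amounts to a transient perturbation of Dc by the passing waves: a step change in Dc shifts θss and the nucleation condition, and the system's gradual relaxation back toward steady state is what can produce delayed, Omori-like triggering rather than an instantaneous response.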

  4. Co-Seismic Gravity Gradient Changes of the 2006-2007 Great Earthquakes in the Central Kuril Islands from GRACE Observations

    NASA Astrophysics Data System (ADS)

    Rahimi, A.; Shahrisvand, M.

    2017-09-01

    GRACE satellites (the Gravity Recovery And Climate Experiment) are very useful sensors for extracting gravity anomalies after earthquakes. In this study, we reveal co-seismic signals of two combined earthquakes, the 2006 Mw8.3 thrust and 2007 Mw8.1 normal-fault earthquakes of the central Kuril Islands, from GRACE observations. We compute the monthly full gravitational gradient tensor in the local north-east-down frame for the Kuril Islands earthquakes without spatial averaging or de-striping filters. Some of the gravitational gradient components (e.g. ΔVxx, ΔVxz) enhance high-frequency components of the Earth's gravity field and reveal more detail in the spatial and temporal domains; therefore, co-seismic activity can be better illustrated. For the first time, we show that the positive-negative-positive co-seismic ΔVxx due to the Kuril Islands earthquakes ranges from −0.13 to +0.11 milli-Eötvös, and ΔVxz shows a positive-negative-positive pattern ranging from −0.16 to +0.13 milli-Eötvös, in good agreement with seismic model predictions.

  5. Protecting Your Family From Earthquakes-The Seven Steps to Earthquake Safety (in Spanish and English)

    USGS Publications Warehouse

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here to share an important message on emergency preparedness. Historically, we have suffered earthquakes here in the San Francisco Bay Area that have caused severe hardship for residents and incredible damage to our cities. It is likely we will experience a severe earthquake within the next 30 years. Many of us come from other countries where we have experienced earthquakes, so we believe that we understand them. However, the way we prepare for earthquakes in our home country may be different from the way it is necessary to prepare for earthquakes here. Very few people die from collapsing buildings in the Bay Area because most structures are built to stand up to the shaking. But it is quite possible that your family will be without medical care or grocery stores and separated from one another for several days to weeks. It will ultimately be up to you to keep your family safe until help arrives, so we are asking you to join us in learning to take care of your family before, during, and after an earthquake. The first step is to read this book. Everyone in your family, children and adults, can learn how to prepare for an earthquake. Then take advantage of the American Red Cross Earthquake Preparedness training courses offered in your community. These preparedness courses are free, offered in Spanish, and available to everyone in the community regardless of family history, legal status, gender, or age. We encourage you to take one of these free training workshops. Look on the back cover for more information. Remember that an earthquake can occur without warning, and the only way that we can reduce the harm caused by earthquakes is to be prepared. Get Prepared!

  6. Remote monitoring of the earthquake cycle using satellite radar interferometry.

    PubMed

    Wright, Tim J

    2002-12-15

    The earthquake cycle is poorly understood. Earthquakes continue to occur on previously unrecognized faults. Earthquake prediction seems impossible. These remain the facts despite nearly 100 years of intensive study since the earthquake cycle was first conceptualized. Using data acquired from satellites in orbit 800 km above the Earth, a new technique, radar interferometry (InSAR), has the potential to solve these problems. For the first time, detailed maps of the warping of the Earth's surface during the earthquake cycle can be obtained with a spatial resolution of a few tens of metres and a precision of a few millimetres. InSAR does not need equipment on the ground or expensive field campaigns, so it can gather crucial data on earthquakes and the seismic cycle from some of the remotest areas of the planet. In this article, I review some of the remarkable observations of the earthquake cycle already made using radar interferometry and speculate on breakthroughs that are tantalizingly close.

  7. Finite Source Inversion for Laboratory Earthquakes

    NASA Astrophysics Data System (ADS)

    Parker, J. M.; Glaser, S. D.

    2017-12-01

    We produce finite source inversion results for laboratory earthquakes (LEQ) in PMMA confirmed by video recording of the fault contact. The LEQs are generated under highly controlled laboratory conditions and recorded by an array of absolutely calibrated acoustic emissions (AE) sensors. Following the method of Hartzell and Heaton (1983), we develop a solution using only the single-component AE sensors common in laboratory experiments. A set of calibration tests using glass capillary sources of varying size resolves the material characteristics and synthetic Green's Functions such that uncertainty in source location is reduced to 3σ<1mm; typical source radii are 1mm. Well-isolated events with corner frequencies on the order of 0.1 MHz (Mw -6) are recorded at 20 MHz and initially band-pass filtered from 0.1 to 1.0 MHz; in comparison, large earthquakes with corner frequencies around 0.1 Hz are commonly filtered from 0.1 to 1.0 Hz. We compare results of the inversion and video recording to slip distribution predicted by the Cattaneo partial slip asperity and numerical modeling. Not all asperities are large enough to resolve individually so some results must be interpreted as the smoothed effects of clusters of tiny contacts. For large asperities, partial slip is observed originating at the asperity edges and moving inward as predicted by the theory. Furthermore, expanding shear rupture fronts are observed as they reach resistive patches of asperities and halt or continue, depending on the relative energies of rupture and resistance.

  8. Insights into earthquake hazard map performance from shaking history simulations

    NASA Astrophysics Data System (ADS)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher
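The simulation strategy described, generating many synthetic shaking histories and comparing each against the mapped value, can be sketched with a toy Monte Carlo model. The Gutenberg-Richter parameters, event rate, and crude attenuation scatter below are all invented for illustration; a real PSHA simulation would use calibrated source and ground-motion models:

```python
import random, math

random.seed(0)

def gr_magnitude(b=1.0, m_min=4.0, m_max=8.0):
    """Draw a magnitude from a truncated Gutenberg-Richter distribution."""
    u = random.random()
    beta = b * math.log(10)
    return m_min - math.log(1 - u * (1 - math.exp(-beta * (m_max - m_min)))) / beta

def peak_shaking(years=50, rate=0.5):
    """Maximum 'shaking' at a site over one simulated history.
    Shaking proxy = magnitude minus a random attenuation term;
    all parameters are illustrative assumptions."""
    t, peak = 0.0, 0.0
    while True:
        t += random.expovariate(rate)  # Poissonian inter-event times
        if t > years:
            break
        m = gr_magnitude()
        atten = random.uniform(1.0, 3.0)  # crude distance/attenuation scatter
        peak = max(peak, m - atten)
    return peak

histories = sorted(peak_shaking() for _ in range(1000))
mapped = histories[int(0.9 * len(histories))]  # a ~10%-in-50-yr "map" value
exceed = sum(h > mapped for h in histories) / len(histories)
print(mapped, exceed)  # roughly 10% of histories exceed the mapped level
```

By construction the mean behaviour matches the "map" (verification), yet individual sorted histories span a wide range, which is the scatter the abstract highlights for validation.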

  9. Application and analysis of debris-flow early warning system in Wenchuan earthquake-affected area

    NASA Astrophysics Data System (ADS)

    Liu, D. L.; Zhang, S. J.; Yang, H. J.; Zhao, L. Q.; Jiang, Y. H.; Tang, D.; Leng, X. P.

    2016-02-01

    The activity of debris flows (DFs) in the Wenchuan earthquake-affected area increased significantly after the earthquake of 12 May 2008, threatening the lives and property of local people. A physics-based early warning system (EWS) for DF forecasting was developed and applied in this earthquake area. This paper introduces an application of the system in the Wenchuan earthquake-affected area and analyzes the prediction results by comparison with the DF events triggered by strong rainfall events reported by the local government. The prediction accuracy and efficiency were first compared with those of a contribution-factor-based system currently used by the weather bureau of Sichuan province, taking the storm of 17 August 2012 as a case study. The comparison shows that the false negative rate and false positive rate of the new system are, respectively, 19 % and 21 % lower than those of the contribution-factor-based system; the prediction accuracy is thus clearly higher, with greater operational efficiency. At the invitation of the weather bureau of Sichuan province, the authors upgraded its DF prediction system with the new system before the 2013 monsoon season in the Wenchuan earthquake-affected area. Two prediction cases, on 9 July 2013 and 10 July 2014, were chosen to further demonstrate that the new EWS has high stability, efficiency, and prediction accuracy.
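The false negative and false positive rates used to compare the two systems follow standard confusion-matrix bookkeeping, sketched here with an invented toy record of warning windows (the data are hypothetical, not from the paper):

```python
def warning_rates(issued, occurred):
    """False-negative and false-positive rates for a warning system.
    `issued` and `occurred` are parallel booleans, one per watch window:
    FNR = missed events / all events; FPR = false alarms / all alarms."""
    fn = sum(o and not i for i, o in zip(issued, occurred))
    fp = sum(i and not o for i, o in zip(issued, occurred))
    events = sum(occurred)
    alarms = sum(issued)
    fnr = fn / events if events else 0.0
    fpr = fp / alarms if alarms else 0.0
    return fnr, fpr

# Hypothetical toy record: 10 windows, warnings vs observed debris flows
issued   = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
occurred = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
print(warning_rates(issued, occurred))  # (0.2, 0.2)
```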

  10. Did you feel it? : citizens contribute to earthquake science

    USGS Publications Warehouse

    Wald, David J.; Dewey, James W.

    2005-01-01

    Since the early 1990s, the magnitude and location of an earthquake have been available within minutes on the Internet. Now, as a result of work by the U.S. Geological Survey and with the cooperation of various regional seismic networks, people who experience an earthquake can go online and share information about its effects to help create a map of shaking intensities and damage. Such “Community Internet Intensity Maps” (CIIMs) contribute greatly toward the quick assessment of the scope of an earthquake emergency and provide valuable data for earthquake research.

  11. On the Prediction of Ground Motion

    NASA Astrophysics Data System (ADS)

    Lavallee, D.; Schmedes, J.; Archuleta, R. J.

    2012-12-01

    Using a slip-weakening dynamic model of rupture, we generated earthquake scenarios that provided the spatio-temporal evolution of the slip on the fault and the radiated field at the free surface. We observed scenarios where the rupture propagates at a supershear speed on some parts of the fault while remaining subshear on other parts. For some scenarios with nearly identical initial conditions, the rupture speed was always subshear. For both types of scenarios (a mixture of supershear and subshear speeds, and subshear only), we compute the peak ground accelerations (PGA) regularly distributed over the Earth's surface. We then calculate the probability density functions (PDF) of the PGA. For both types of scenarios, the PDF curves are asymmetrically shaped and asymptotically attenuated according to a power law. This behavior of the PDF is similar to that observed for the PDF curves of PGA recorded during earthquakes. The main difference between scenarios with a supershear rupture speed and scenarios with only subshear rupture speed is the range of PGA values. Based on these results, we investigate three issues fundamental to the prediction of ground motion. It is important to recognize that ground motions recorded during an earthquake sample a small fraction of the radiation field. It is not obvious that such sampling will capture the largest ground motion generated during an earthquake, nor that the number of stations is large enough to properly infer the statistical properties associated with the radiation field. To quantify the effect of under- (or low-) sampling of the radiation field, we design three experiments. For a scenario where the rupture speed is only subshear, we construct multiple sets of observations. Each set comprises 100 PGA values randomly selected from all of the PGAs calculated at the Earth's surface. In the first experiment, we evaluate how the distributions of PGA in the sets compare with the distribution of all the PGA. For
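The under-sampling experiment described, drawing sets of 100 PGA values from the full radiation field, can be mimicked with a synthetic power-law-tailed field. The Pareto exponent and field size below are assumptions chosen only to give the asymmetric, power-law-attenuated PDF shape the abstract describes:

```python
import random

random.seed(1)

# Stand-in radiation field: PGA values with a Pareto (power-law) tail.
# The exponent alpha and field size N are illustrative assumptions.
N = 100_000
alpha = 3.0
field = [(1.0 - random.random()) ** (-1.0 / alpha) for _ in range(N)]
true_max = max(field)

# "Station sets": 200 sets of 100 PGA values sampled at random from the field
set_maxima = [max(random.sample(field, 100)) for _ in range(200)]
captured = sum(m >= 0.5 * true_max for m in set_maxima) / len(set_maxima)
print(true_max, captured)
```

With a heavy-tailed field, almost no 100-value set comes near the true field maximum, illustrating why a sparse station network can badly under-report the largest ground motion.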

  12. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  13. Numerical Modeling and Forecasting of Strong Sumatra Earthquakes

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Yin, C.

    2007-12-01

    ESyS-Crustal, a finite-element-based computational model and software package, has been developed and applied to simulate complex nonlinear interacting fault systems, with the goal of accurately predicting earthquakes and tsunami generation. With the available tectonic setting and GPS data around the Sumatra region, the simulation results clearly indicate that the shallow part of the subduction zone in the Sumatra region between latitudes 6S and 2N has been locked for a long time, and remained locked even after the northern part of the zone underwent a major slip event resulting in the infamous Boxing Day tsunami. Two strong earthquakes that occurred in this region (between 6S and 1S) in 1797 (M8.2) and 1833 (M9.0) indicate a high potential for very large destructive earthquakes here, with relatively long periods of quiescence in between. The results were presented at the 5th ACES International Workshop in 2006, before the 2007 Sumatra earthquakes occurred, which fell exactly within the predicted zone (see the ACES2006 web site and the detailed presentation file via the workshop agenda). The preliminary simulation results obtained so far show a few obvious events around the previously locked zone before it ruptures completely, but no indication of a giant earthquake similar to the 2004 M9 event in the near future, which several earthquake scientists believe will occur. Further detailed simulations will be carried out and presented at the meeting.

  14. Dynamic Assessment of Seismic Risk (DASR) by Multi-parametric Observations: Preliminary Results of PRIME experiment within the PRE-EARTHQUAKES EU-FP7 Project

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Inan, S.; Jakowski, N.; Pulinets, S. A.; Romanov, A.; Filizzola, C.; Shagimuratov, I.; Pergola, N.; Ouzounov, D. P.; Papadopoulos, G. A.; Parrot, M.; Genzano, N.; Lisi, M.; Alparlsan, E.; Wilken, V.; Tsybukia, K.; Romanov, A.; Paciello, R.; Zakharenkova, I.; Romano, G.

    2012-12-01

    The integration of different observations, together with the refinement of data analysis methods, is generally expected to improve our present knowledge of the preparatory phases of earthquakes and of their possible precursors. This is the main goal of PRE-EARTHQUAKES (Processing Russian and European EARTH observations for earthQUAKE precursors Studies), the FP7 project which, to this end, has brought together different international expertise and observational capabilities over the last two years. In the learning phase of the project, different parameters (e.g. thermal anomalies, total electron content, radon concentration), measured from ground and satellite systems and analyzed using different data analysis approaches, were studied for selected geographic areas and specific past seismic events. In July 2012, PRIME (the PRE-EARTHQUAKES Real-time Integration and Monitoring Experiment) began attempting to perform, on the basis of independent observations collected and integrated in real time through the PEG (PRE-EARTHQUAKES Geo-portal), a Dynamic Assessment of Seismic Risk (DASR) for selected geographic areas of Europe (Italy, Greece, Turkey) and Asia (Kamchatka, Sakhalin, Japan). In this paper, the results achieved so far, as well as the potential and opportunities they open for a worldwide Earthquake Observation System (EQuOS) as a dedicated component of GEOSS (the Global Earth Observation System of Systems), will be presented.

  15. Development of a Low Cost Earthquake Early Warning System in Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Y. M.

    2017-12-01

    The National Taiwan University (NTU) has been developing an earthquake early warning (EEW) system for research purposes using low-cost accelerometers (P-Alert) since 2010. As of 2017, a total of 650 stations had been deployed and configured. The NTU system can provide earthquake information within 15 s of an earthquake's occurrence, and thus may provide early warning to cities located more than 50 km from the epicenter. Additionally, the NTU system has an onsite alert function that triggers a warning for incoming P waves greater than a certain magnitude threshold, providing a 2-3 s lead time before peak ground acceleration (PGA) for regions close to the epicenter. Detailed shaking maps are produced by the NTU system within one or two minutes after an earthquake. Recently, a new module named ShakingAlarm has been developed. Equipped with real-time acceleration signals and a time-dependent anisotropic attenuation relationship for the PGA, ShakingAlarm can provide an accurate PGA estimate immediately before the arrival of the observed PGA. This unique advantage yields sufficient lead time for hazard assessment and emergency response, which is unavailable with traditional shakemaps based only on the PGA observed in real time. The performance of ShakingAlarm was tested with six M > 5.5 inland earthquakes from 2013 to 2016. Taking the 2016 M6.4 Meinong earthquake simulation as an example, the predicted PGA converges to a stable value, producing a predicted shakemap and an isocontour map of the predicted PGA within 16 s of earthquake occurrence. Compared with traditional regional EEW systems, ShakingAlarm can effectively identify regions where damage is likely and provide valuable early warning information (magnitude and PGA) for risk mitigation.
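The quoted lead times follow from simple travel-time arithmetic: warning is useful wherever the damaging S wave arrives after the system has issued its report. A back-of-envelope sketch; the S-wave speed is an assumed typical crustal value, and the 15 s reporting time is taken from the abstract:

```python
def lead_time(epicentral_km, vs=3.5, reporting_s=15.0):
    """Approximate warning lead time at a site: S-wave travel time minus
    the time the system needs to detect and report the event. The S-wave
    speed (3.5 km/s) is an assumed typical crustal value; 15 s is the
    reporting time quoted in the abstract."""
    return epicentral_km / vs - reporting_s

# Sites closer than ~52 km get no regional warning; farther sites do.
for d in (30, 50, 100):
    print(d, round(lead_time(d), 1))
```

This is why the abstract pairs the regional system with an onsite P-wave alert: the near field, where shaking is strongest, is exactly where the regional lead time goes negative.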

  16. Magnitude Dependent Seismic Quiescence of 2008 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Suyehiro, K.; Sacks, S. I.; Takanami, T.; Smith, D. E.; Rydelek, P. A.

    2014-12-01

    The change in seismicity leading up to the 2008 Wenchuan Earthquake (Mw 7.9) has been studied by various authors using statistics and/or pattern recognition (Huang, 2008; Yan et al., 2009; Chen and Wang, 2010; Yi et al., 2011). We show, in particular, that magnitude-dependent seismic quiescence is observed for the Wenchuan earthquake, adding to other similar observations. Previous studies of seismic quiescence prior to major earthquakes include the 1982 Urakawa-Oki earthquake (M 7.1) (Taylor et al., 1992), the 1994 Hokkaido-Toho-Oki earthquake (Mw 8.2) (Takanami et al., 1996), and the 2011 Tohoku earthquake (Mw 9.0) (Katsumata, 2011). Smith and Sacks (2013) proposed magnitude-dependent quiescence based on a physical earthquake model (Rydelek and Sacks, 1995) and demonstrated that the quiescence can be reproduced by introducing "asperities" (dilatancy-hardened zones). Actual observations indicate the change occurs over a broader area than the eventual earthquake fault zone. To accept this explanation, we need to verify the model, as it predicts somewhat controversial features of earthquakes, such as magnitude-dependent stress drop at the lower magnitude range and dynamically appearing asperities with repeating slip in some parts of the rupture zone. We show supportive observations. We will also need to verify that dilatancy diffusion is taking place; so far, we have only indirect evidence, which needs to be more quantitatively substantiated.
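Seismic quiescence of the kind discussed is conventionally quantified with a rate-change Z-test comparing a background window against a candidate quiescence window. A minimal sketch; the event counts below are hypothetical, not Wenchuan data:

```python
import math

def z_value(n1, t1, n2, t2):
    """Standard seismicity-rate Z-test: compare a background window
    (n1 events in t1 years) with a candidate quiescence window
    (n2 events in t2 years). Large positive Z indicates quiescence."""
    r1, r2 = n1 / t1, n2 / t2
    var = n1 / t1**2 + n2 / t2**2  # Poisson variance of each rate estimate
    return (r1 - r2) / math.sqrt(var)

# Hypothetical counts: 120 events in a 10 yr background window versus
# 2 events in the final 1 yr candidate quiescence window.
print(round(z_value(120, 10.0, 2, 1.0), 2))
```

Magnitude-dependent quiescence, as in the abstract, amounts to running such a test separately for each magnitude band and finding that only some bands go quiet.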

  17. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach, in general and in geophysics in particular, have grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) in Romania is the leading national institution for earthquake monitoring and research, with a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not very proactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and caused millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, during a partnership started in 2014 with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to take educational activities focused on seismology, seismic hazard and Earth science on the road. The exhibition is mainly aimed at school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, exciting hands-on experiments and 3D-printed models. This project is singular in Romania and aims to convey properly reviewed, up-to-date information regarding the definition of earthquakes, the ways natural hazards can affect people, buildings and the environment, and the measures to be taken to mitigate an earthquake's aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining the processes that shape the dynamic Earth. It also involves

  18. Predictability of extremes in non-linear hierarchically organized systems

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Soloviev, A.

    2011-12-01

    Understanding the complexity of the non-linear dynamics of hierarchically organized systems is leading to new approaches for assessing the hazard and risk of extreme catastrophic events. In particular, a series of interrelated step-by-step studies of the seismic process, along with its non-stationary though self-organized behavior, has already led to a reproducible intermediate-term, middle-range earthquake forecast/prediction technique that has passed control in forward real-time applications during the last two decades. The seismic dynamics observed before and after many mega, great, major, and strong earthquakes demonstrate common features of predictability and diverse behavior in the course of durable phase transitions in the complex hierarchical non-linear blocks-and-faults system of the Earth's lithosphere. The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimates of seismic hazard (from term-less to short-term) are based on erroneous assumptions of easily tractable analytical models, which leads to a widespread practice of their deceptive application. The consequences of underestimating seismic hazard propagate non-linearly into underestimation of risk and, eventually, into unexpected societal losses due to earthquakes and associated phenomena (collapse of buildings, landslides, tsunamis, liquefaction, etc.). The studies aimed at forecast/prediction of extreme events (interpreted as critical transitions) in geophysical and socio-economic systems include: (i) large earthquakes in the geophysical blocks-and-faults system of the lithosphere, (ii) starts and ends of economic recessions, (iii) episodes of sharp increase in the unemployment rate, and (iv) surges of homicides in socio-economic systems. These studies are based on a heuristic search for phenomena preceding critical transitions and on pattern-recognition methodologies for infrequent events. Any study of rare

  19. A Statistical Study of Total Electron Content Changes in the Ionosphere Prior to Earthquake Occurrences

    NASA Astrophysics Data System (ADS)

    Thomas, J. N.; Huard, J.; Masci, F.

    2015-12-01

    There are many published reports of anomalous changes in the ionosphere prior to large earthquakes. However, whether these ionospheric changes are reliable precursors that could be useful for earthquake prediction is controversial within the scientific community. To test a possible statistical relationship between the ionosphere and earthquakes, we compare changes in the total electron content (TEC) of the ionosphere with occurrences of M≥6.0 earthquakes globally over a multiyear period. We use TEC data from a global ionosphere map (GIM) and an earthquake list declustered for aftershocks. For each earthquake, we look for anomalous changes in TEC within ±30 days of the earthquake time and within 2.5° latitude and 5.0° longitude of the earthquake location (the spatial resolution of GIM). Our preliminary analysis, using global TEC and earthquake data for 2002-2010, has not found any statistically significant changes in TEC prior to earthquakes. Thus, we have found no evidence that would suggest that TEC changes are useful for earthquake prediction. Our results are discussed in the context of prior statistical and case studies: they agree with Dautermann et al. (2007), who found no relationship between TEC changes and earthquakes in the San Andreas fault region, but disagree with Le et al. (2011), who found an increased rate of TEC anomalies within a few days before global earthquakes of M≥6.0.
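A common way to define the "anomalous changes in TEC" that such studies count is a deviation beyond k standard deviations of a trailing window. The abstract does not specify its criterion, so the definition, window length, and toy data below are all stand-in assumptions:

```python
import statistics

def tec_anomalies(series, window=15, k=2.0):
    """Flag TEC values more than k standard deviations from the trailing
    `window`-day mean. A common anomaly definition, used here as a
    stand-in for the (unspecified) criterion in the abstract."""
    flags = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu = statistics.fmean(past)
        sd = statistics.pstdev(past)
        flags.append(abs(series[i] - mu) > k * sd)
    return flags

# Toy daily TEC series (TECU): quiet background, then one injected excursion
quiet = [10.0 + 0.1 * (i % 3) for i in range(40)]
spiked = quiet[:]
spiked[30] += 5.0  # inject one large excursion
print(sum(tec_anomalies(quiet)), sum(tec_anomalies(spiked)))
```

A statistical test like the one in the abstract then asks whether such flags cluster in the ±30 day windows around earthquakes more often than chance predicts.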

  20. Tokyo Metropolitan Earthquake Preparedness Project - A Progress Report

    NASA Astrophysics Data System (ADS)

    Hayashi, H.

    2010-12-01

    Munich Re once ranked the Tokyo metropolitan region, the capital of Japan, as the most vulnerable area in the world for earthquake disasters, followed by the San Francisco Bay Area, US, and Osaka, Japan. Seismologists also predict that the Tokyo metropolitan region may have at least one near-field earthquake with a probability of 70% in the next 30 years. Given this prediction, the Japanese government undertook damage estimations and found that, in the worst-case scenario, a magnitude 7.3 earthquake under heavy winds (as shown in fig. 1) would kill a total of 11,000 people, and total direct and indirect losses would amount to 112 trillion yen (about US$1.3 trillion at 85 yen to the dollar). In addition to mortality and financial losses, a total of 25 million people across four prefectures would be severely impacted by this earthquake. If this earthquake occurs, 300,000 elevators would stop suddenly, and 12,500 people would be confined in them for a long time. Seven million people would come to use the over 20,000 public shelters spread over the impacted area. Over one million temporary housing units would have to be built to accommodate the 4.6 million people who lost their dwellings, and 2.5 million people would relocate outside the damaged area. In short, an earthquake disaster of unprecedented scale is expected, and we must prepare for it. Even though disaster mitigation is undoubtedly the best solution, it is more realistic to assume that the expected earthquake will hit before that work is complete. In other words, we must pursue another solution: making the people and assets in this region more resilient to the Tokyo metropolitan earthquake. This is the question we have been tackling for the last four years. To increase societal resilience to the Tokyo metropolitan earthquake, we adopted a holistic approach integrating both emergency response and long-term recovery. There are three goals for long-term recovery, which consist of Physical recovery, Economic

  1. Retrospective stress-forecasting of earthquakes

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (In the mantle, the microcracks are intergranular films of hydrolysed melt.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland lies on an extension of the Mid-Atlantic Ridge where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes and are the only place worldwide where SWS can be routinely monitored. Elsewhere, SWS must be monitored above occasional, temporally active swarms of small earthquakes, or in infrequent SKS and other teleseismic phases from the mantle. Observed changes in SWS time-delays are attributed to stress-induced changes in crack aspect ratios, allowing stress accumulation and stress relaxation to be identified. Monitoring SWS in SW Iceland in 1988, stress accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10 November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast; we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  2. Regional and Local Glacial-Earthquake Patterns in Greenland

    NASA Astrophysics Data System (ADS)

    Olsen, K.; Nettles, M.

    2016-12-01

    Icebergs calved from marine-terminating glaciers currently account for up to half of the ~400 Gt of ice lost annually from the Greenland ice sheet (Enderlin et al., 2014). When large capsizing icebergs (~1 Gt of ice) calve, they produce elastic waves that propagate through the solid earth and are observed as teleseismically detectable glacial earthquakes of MSW ≈ 5 (e.g., Ekström et al., 2003; Nettles & Ekström, 2010; Tsai & Ekström, 2007; Veitch & Nettles, 2012). The annual number of these events has increased dramatically over the past two decades. We analyze glacial earthquakes from 2011-2013, expanding the glacial-earthquake catalog by 50%. The number of glacial-earthquake solutions now available allows us to investigate regional patterns across Greenland and to link earthquake characteristics to changes in ice dynamics at individual glaciers. During the years of our study, Greenland's west coast dominated glacial-earthquake production: Kong Oscar Glacier, Upernavik Isstrøm, and Jakobshavn Isbræ all produced more glacial earthquakes during this time than in preceding years. We link patterns in glacial-earthquake production and cessation to the presence or absence of floating ice tongues at glaciers on both coasts of Greenland. The calving model predicts glacial-earthquake force azimuths oriented perpendicular to the calving front, and comparisons between seismic data and satellite imagery confirm this in most instances. At two glaciers we document force azimuths that have recently changed orientation and confirm that similar changes have occurred in the calving-front geometry. We also document glacial earthquakes at one previously quiescent glacier. Consistent with previous work, we model the glacial-earthquake force-time function as a boxcar with horizontal and vertical force components that vary synchronously. We investigate limitations of this approach and explore improvements that could lead to a more accurate representation of the glacial-earthquake source.
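The boxcar force-time parameterization mentioned at the end is simple enough to sketch directly: a constant force over the capsize duration, zero elsewhere, with horizontal and vertical components switching on and off together. The start time, duration, and amplitudes below are illustrative assumptions, not values from the catalog:

```python
def boxcar_force(t, t_start, duration, amplitude):
    """Boxcar source-time function: constant `amplitude` (N) for
    t_start <= t < t_start + duration, zero otherwise. Parameters here
    are illustrative, not fitted glacial-earthquake values."""
    return amplitude if t_start <= t < t_start + duration else 0.0

# Horizontal and vertical components varying synchronously, sampled at 5 s
times = [i * 5.0 for i in range(20)]  # 0-95 s
fh = [boxcar_force(t, 20.0, 50.0, 1.0e11) for t in times]
fv = [boxcar_force(t, 20.0, 50.0, 4.0e10) for t in times]
print(sum(1 for f in fh if f > 0))  # 10 samples fall inside the 50 s boxcar
```

The "synchronously varying" constraint in the abstract corresponds to both component lists sharing the same on/off times, differing only in amplitude.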

  3. Transient triggering of near and distant earthquakes

    USGS Publications Warehouse

    Gomberg, J.; Blanpied, M.L.; Beeler, N.M.

    1997-01-01

    We demonstrate qualitatively that frictional instability theory provides a context for understanding how earthquakes may be triggered by transient loads associated with seismic waves from near and distant earthquakes. We assume that earthquake triggering is a stick-slip process and test two hypotheses about the effect of transients on the timing of instabilities using a simple spring-slider model and a rate- and state-dependent friction constitutive law. A critical triggering threshold is implicit in such a model formulation. Our first hypothesis is that transient loads lead to clock advances; i.e., transients hasten the time of earthquakes that would have happened eventually due to constant background loading alone. Modeling results demonstrate that transient loads do lead to clock advances and that the triggered instabilities may occur after the transient has ceased (i.e., triggering may be delayed). These simple "clock-advance" models predict complex relationships between the triggering delay, the clock advance, and the transient characteristics. The triggering delay and the degree of clock advance both depend nonlinearly on when in the earthquake cycle the transient load is applied. This implies that the stress required to bring about failure does not depend linearly on loading time, even when the fault is loaded at a constant rate. The timing of instability also depends nonlinearly on the transient loading rate, with faster rates hastening instability more rapidly. This implies that higher-frequency and/or longer-duration seismic waves should increase the amount of clock advance. These modeling results and simple calculations suggest that near (tens of kilometers) small/moderate earthquakes and remote (thousands of kilometers) earthquakes with magnitudes 2 to 3 units larger may be equally effective at triggering seismicity. Our second hypothesis is that some triggered seismicity represents earthquakes that would not have happened without the transient load (i
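The clock-advance idea can be illustrated with an even simpler model than the paper's rate- and state-dependent spring-slider: a bare stress threshold under constant loading, where a transient step simply brings failure forward. This deliberately omits the rate/state dynamics responsible for the nonlinear delays the abstract describes; it is a sketch of the bookkeeping only, with invented units:

```python
def failure_time(load_rate=1.0, threshold=100.0, transient=None):
    """Time of stick-slip failure under constant loading, optionally with
    a transient stress step `transient=(t_p, ds)` applied at time t_p.
    A bare threshold model: a crude stand-in for the rate- and
    state-dependent spring-slider in the paper (units are arbitrary)."""
    t_fail = threshold / load_rate
    if transient is not None:
        t_p, ds = transient
        if t_p < t_fail:  # transient arrives before background failure
            # Failure when accumulated stress + step reaches the threshold,
            # but no earlier than the transient itself.
            t_fail = max(t_p, (threshold - ds) / load_rate)
    return t_fail

t0 = failure_time()
t1 = failure_time(transient=(50.0, 10.0))
print(t0, t1, t0 - t1)  # clock advance equals ds / load_rate = 10 time units
```

In this linear model the clock advance is independent of when the transient arrives; the paper's point is precisely that rate/state friction breaks that linearity, making both the advance and the triggering delay depend on the transient's timing and loading rate.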

  4. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
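
    The bin-based likelihood testing described here can be sketched as a joint Poisson log-likelihood plus a simulation-based consistency quantile, in the spirit of the RELM L-test. This is a minimal illustration, not the official RELM/CSEP implementation; the bin rates and counts are hypothetical:

    ```python
    import math
    import random

    def log_likelihood(rates, counts):
        """Joint Poisson log-likelihood of observed bin counts under a forecast."""
        return sum(-r + n * math.log(r) - math.lgamma(n + 1)
                   for r, n in zip(rates, counts))

    def l_test(rates, counts, n_sim=5000, seed=0):
        """Consistency quantile: fraction of catalogs simulated from the forecast
        itself whose log-likelihood is <= the observed one. Very small values
        mean the observation is implausibly unlikely under the forecast."""
        rng = random.Random(seed)

        def poisson(lam):
            # Knuth's algorithm; adequate for the small per-bin rates used here
            limit, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= limit:
                    return k
                k += 1

        obs = log_likelihood(rates, counts)
        hits = 0
        for _ in range(n_sim):
            sim_counts = [poisson(r) for r in rates]
            if log_likelihood(rates, sim_counts) <= obs:
                hits += 1
        return hits / n_sim
    ```

    A forecast is typically judged inconsistent with the data when this quantile falls below a chosen significance level (e.g., 0.05).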

  5. Postseismic deformation and stress changes following the 1819 Rann of Kachchh, India earthquake: Was the 2001 Bhuj earthquake a triggered event?

    USGS Publications Warehouse

    To, A.; Burgmann, R.; Pollitz, F.

    2004-01-01

    The 2001 Mw 7.6 Bhuj earthquake occurred in an intraplate region with rather unusual active seismicity, including an earlier major earthquake, the 1819 Rann of Kachchh earthquake (M7.7). We examine whether static coseismic and transient postseismic deformation following the 1819 earthquake contributed to the enhanced seismicity in the region and the occurrence of the 2001 Bhuj earthquake, ~100 km away and almost two centuries later. Based on the Indian shield setting, the great rupture depth of the 2001 event and the lack of significant early postseismic deformation measured following the 2001 event, we infer that little viscous relaxation occurs in the lower crust and choose an upper mantle effective viscosity of 10^19 Pa s. The predicted Coulomb failure stress change (ΔCFS) on the rupture plane of the 2001 event increased by more than 0.1 bar at 20 km depth, which is a small but possibly significant amount. Stress change from the 1819 event may have also affected the occurrence of other historic earthquakes in this region. We also evaluate the postseismic deformation and ΔCFS in this region due to the 2001 event. Positive ΔCFS from the 2001 event occurs to the NW and SE of the Bhuj earthquake rupture. Copyright 2004 by the American Geophysical Union.
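
    For reference, the Coulomb failure stress change used in studies like this combines the shear and normal stress changes resolved on the receiver fault. The sketch below is generic, not taken from this paper; the effective friction coefficient is an assumed, commonly used value:

    ```python
    # Unit context: 0.1 bar = 0.01 MPa = 1.0e4 Pa
    def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
        """ΔCFS = Δτ + μ' Δσ_n, with Δσ_n tension-positive, so that
        unclamping (reduced compression) promotes failure. Inputs in Pa."""
        return d_shear + mu_eff * d_normal
    ```

    Positive ΔCFS on a receiver fault is interpreted as bringing it closer to failure; values of order 0.1 bar (0.01 MPa), as quoted above, are small but commonly considered potentially significant.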

  6. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes any kind of reliable probabilistic statement about narrowly localized seismic hazard premature. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and it complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can mislead to scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses the gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention.
    The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a

  7. Premonitory slip and tidal triggering of earthquakes

    USGS Publications Warehouse

    Lockner, D.A.; Beeler, N.M.

    1999-01-01

    We have conducted a series of laboratory simulations of earthquakes using granite cylinders containing precut bare fault surfaces at 50 MPa confining pressure. Axial shortening rates between 10^-4 and 10^-6 mm/s were imposed to simulate tectonic loading. Average loading rate was then modulated by the addition of a small-amplitude sine wave to simulate periodic loading due to Earth tides or other sources. The period of the modulating signal ranged from 10 to 10,000 s. For each combination of amplitude and period of the modulating signal, multiple stick-slip events were recorded to determine the degree of correlation between the timing of simulated earthquakes and the imposed periodic loading function. Over the range of parameters studied, the degree of correlation of earthquakes was most sensitive to the amplitude of the periodic loading, with weaker dependence on the period of oscillations and the average loading rate. Accelerating premonitory slip was observed in these experiments and is a controlling factor in determining the conditions under which correlated events occur. In fact, some form of delayed failure is necessary to produce the observed correlations between simulated earthquake timing and characteristics of the periodic loading function. The transition from strongly correlated to weakly correlated model earthquake populations occurred when the amplitude of the periodic loading was approximately 0.05 to 0.1 MPa shear stress (0.03 to 0.06 MPa Coulomb failure function). Lower-amplitude oscillations produced progressively lower correlation levels. Correlations between static stress increases and earthquake aftershocks are found to degrade at similar stress levels. Typical stress variations due to Earth tides are only 0.001 to 0.004 MPa, so that the lack of correlation between Earth tides and earthquakes is also consistent with our findings. 
A simple extrapolation of our results suggests that approximately 1% of midcrustal earthquakes should be correlated with

  8. Experimental results on rock resistivity and its applications in monitoring and predicting natural disasters

    NASA Astrophysics Data System (ADS)

    Zhou, Jianguo; Zhu, Tao; Tang, Baolin

    2017-04-01

    There have been many earthquakes in the Chinese mainland. These earthquakes, especially large ones, often cause immeasurable loss. For instance, the 2008 Wenchuan Ms8.0 earthquake killed about 70,000 people and left about 17,000 missing. It is well known that this earthquake was not predicted. Why? Were there no precursors? After analyzing the geo-electrical resistivity recording at Chengdu station, which is only about 36 km from the epicenter, we find that before the earthquake the resistivity changed very significantly along the NE direction, but no outstanding anomalous changes were observed along the NW direction. Perhaps these inconsistent changes explain why this earthquake was not predicted. From another standpoint, however, can an additional observation method be found to supplement the current geo-electrical resistivity observations in the Chinese mainland, in order to improve the probability of catching a precursor? This motivated us to conduct experiments in the lab and in the field. Apparent resistivity data are acquired along three common-midpoint measuring lines during fixed-rate uniaxial compression of two sets of dry man-made samples and a magnetite sample. We construct relative resistivity change images (RRCIs). Our results indicate that all RRCIs show a trending change with stress: with increasing stress, the resistivity-decreased region (RDR) in the RRCIs shrinks/expands, while the resistivity-increased region (RIR) expands/shrinks gradually, in agreement with field experimental results of earthquake monitoring (Feng et al., 2001). These results lead us to conclude that trending changes in RRCIs with stress could become a useful indicator for monitoring and predicting earthquakes, volcanic eruptions and large-scale geologic movements. This work is supported by the National Natural Science Foundation of China (NSFC, Grant 41574083).

  9. Earthquake Forecasting in Northeast India using Energy Blocked Model

    NASA Astrophysics Data System (ADS)

    Mohapatra, A. K.; Mohanty, D. K.

    2009-12-01

    The proposed process provides a more consistent model of gradual accumulation of strain and non-uniform release through large earthquakes, and can be applied in the evaluation of seismic risk. The cumulative seismic energy released by major earthquakes in all the zones over the 110 years from 1897 to 2007 is calculated and plotted. The plot gives a characteristic curve for each zone. Each curve is irregular, reflecting occasional high activity. The maximum earthquake energy available at a particular time in a given area is given by S. The difference between the theoretical upper limit given by S and the cumulative energy released up to that time is calculated to find the maximum magnitude of an earthquake that can occur in the future. The energy blocked in the three source regions is 1.35×10^17 J, 4.25×10^17 J and 0.12×10^17 J, respectively, for source zones 1, 2 and 3, as a supply for potential earthquakes in due course of time. The predicted maximum magnitudes (mmax) obtained by this model for the source zones AYZ, HZ and SPZ are 8.2, 8.6 and 8.4, respectively. This study is also consistent with results previously predicted by other workers.
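
    The abstract does not state the energy-magnitude relation used, but the standard Gutenberg-Richter relation log10 E = 1.5 M + 4.8 (E in joules) reproduces the quoted magnitudes for zones 1 and 2 from the quoted blocked energies:

    ```python
    import math

    def blocked_energy_magnitude(energy_joules):
        """Invert the Gutenberg-Richter energy-magnitude relation
        log10 E = 1.5 M + 4.8 (E in joules) for the magnitude M."""
        return (math.log10(energy_joules) - 4.8) / 1.5
    ```

    For example, 4.25×10^17 J maps to about M 8.6 and 1.35×10^17 J to about M 8.2, matching the values quoted for zones 2 and 1.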

  10. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of a strong ground-motion sensor network, an earthquake data center and seismic wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability of potential earthquake hazards. This application also demonstrates the e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and to support seismology research through e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of a seismogram at any sensor point for a specific epicenter. To ease access to all the services based on user workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services and user communities will be implemented based on typical use cases. In the future, extension of the

  11. Monitoring of ULF (ultra-low-frequency) Geomagnetic Variations Associated with Earthquakes

    PubMed Central

    Hayakawa, Masashi; Hattori, Katsumi; Ohta, Kenji

    2007-01-01

    ULF (ultra-low-frequency) electromagnetic emission has recently been recognized as one of the most promising candidates for short-term earthquake prediction. This paper reviews previous convincing evidence for the presence of ULF emissions before a few large earthquakes. We then present our ULF monitoring network in the Tokyo area, describing our ULF magnetic sensors, and finally present a few recent results on seismogenic electromagnetic emissions for recent large earthquakes obtained with the use of sophisticated signal processing.

  12. Awareness and understanding of earthquake hazards at school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of the education activities we performed during recent years are presented here. We describe our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, but we also illustrate some teaching interventions for high school students. In past years we lectured classes, led laboratory and field activities, and organized summer stages for selected students. In the current year we are leading a project aimed at training high school students in seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts in safety procedures. To combine the objective of disseminating earthquake culture, also through knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab with low-cost tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use these instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  13. An Earthquake Rupture Forecast model for central Italy submitted to CSEP project

    NASA Astrophysics Data System (ADS)

    Pace, B.; Peruzza, L.

    2009-04-01

    We defined a seismogenic source model for central Italy and computed the corresponding forecast scenario, in order to submit the results to the CSEP (Collaboratory for the Study of Earthquake Predictability, www.cseptesting.org) project. The goal of the CSEP project is to develop a virtual, distributed laboratory that supports a wide range of scientific prediction experiments in multiple regional or global natural laboratories, and Italy is the first region in Europe for which fully prospective testing is planned. The model we propose is essentially the Layered Seismogenic Source for Central Italy (LaSS-CI) we published in 2006 (Pace et al., 2006). It is based on three different layers of sources: the first collects the individual faults liable to generate major earthquakes (M >5.5); the second is given by instrumental seismicity analysis of the past two decades, which allows us to evaluate the background seismicity (M ~<5.0); the third utilizes all the instrumental earthquakes and the historical events not correlated to known structures (M >4.5), with recurrence on the individual sources modeled by a Brownian passage time distribution. Besides the original model, updated earthquake rupture forecasts for individual sources only are released too, in the light of recent analyses (Peruzza et al., 2008; Zoeller et al., 2008). We computed forecasts based on the LaSS-CI model for two time windows: 5 and 10 years. Each model to be tested defines a forecasted earthquake rate in magnitude bins of 0.1 unit steps in the range M5-9, for the periods 1 April 2009 to 1 April 2014 and 1 April 2009 to 1 April 2019. B. Pace, L. Peruzza, G. Lavecchia, and P. Boncio (2006) Layered Seismogenic Source

  14. Stress Drop and Depth Controls on Ground Motion From Induced Earthquakes

    NASA Astrophysics Data System (ADS)

    Baltay, A.; Rubinstein, J. L.; Terra, F. M.; Hanks, T. C.; Herrmann, R. B.

    2015-12-01

    Induced earthquakes in the central United States pose a risk to local populations, but there is not yet agreement on how to portray their hazard. A large source of uncertainty in the hazard arises from ground motion prediction, which depends on the magnitude and distance of the causative earthquake. However, ground motion models for induced earthquakes may be very different from models previously developed for either the eastern or western United States. A key question is whether ground motions from induced earthquakes are similar to those from natural earthquakes, yet there is little history of natural events in the same region with which to compare the induced ground motions. To address these problems, we explore how earthquake source properties, such as stress drop or depth, affect the recorded ground motion of induced earthquakes. Typically, because stress drop increases with depth, ground motion prediction equations model shallower events as having smaller ground motions, when considering the same absolute hypocentral distance to the station. Induced earthquakes tend to occur at shallower depths, with respect to natural eastern US earthquakes, and may also exhibit lower stress drops, which raises the question of how these two parameters interact to control ground motion. Can the ground motions of induced earthquakes simply be understood by scaling our known source-ground motion relations to account for the shallow depth or potentially smaller stress drops of these induced earthquakes, or is there an inherently different mechanism in play for these induced earthquakes? We study peak ground-motion velocity (PGV) and acceleration (PGA) from induced earthquakes in Oklahoma and Kansas, recorded by USGS networks at source-station distances of less than 20 km, in order to model the source effects. 
We compare these records to those in both the NGA-West2 database (primarily from California) as well as NGA-East, which covers the central and eastern United States and Canada

  15. Return to work for severely injured survivors of the Christchurch earthquake: influences in the first 2 years.

    PubMed

    Nunnerley, Joanne; Dunn, Jennifer; McPherson, Kathryn; Hooper, Gary; Woodfield, Tim

    2016-01-01

    This study examined the influences on return to work (RTW) in the first 2 years for people severely injured in the 22 February 2011 Christchurch earthquake. We used a constructivist grounded theory approach, with semi-structured interviews to collect data from 14 people injured in the earthquake. Analysis elicited three themes that appeared to influence the process of RTW following the Christchurch earthquake: living the earthquake experience (how individuals' experiences of the earthquake and their injury framed their expectations); rebuilding normality (the participants' desire to return to life as it was); and dealing with the secondary effects of the earthquake (earthquake-specific effects that were both barriers and facilitators to returning to work). The consequences of the earthquake impacted the experience, process and outcome of RTW for those injured in the Christchurch earthquake. Work and RTW appeared to be key tools for enhancing recovery after serious injury following the earthquake. The altered physical, social and economic environment must be considered when working on the RTW of individuals with earthquake injuries. Providing tangible emotional and social support so that injured earthquake survivors feel safe in their workplace may facilitate RTW. Engaging early with employers may assist the RTW of injured earthquake survivors.

  16. Strong Ground Motion Analysis and Afterslip Modeling of Earthquakes near Mendocino Triple Junction

    NASA Astrophysics Data System (ADS)

    Gong, J.; McGuire, J. J.

    2017-12-01

    The Mendocino Triple Junction (MTJ) is one of the most seismically active regions in North America, in response to the ongoing motions between the North America, Pacific and Gorda plates. Earthquakes near the MTJ come from multiple types of faults due to the interacting boundaries between the three plates and the strong internal deformation within them. Understanding the stress levels that drive earthquake rupture on the various types of faults and estimating the locking state of the subduction interface are especially important for earthquake hazard assessment. However, due to the lack of direct offshore seismic and geodetic records, only a few earthquakes' rupture processes have been well studied, and the locking state of the subducted slab is not well constrained. In this study we first use the second-moment inversion method to study the rupture process of the January 28, 2015 Mw 5.7 strike-slip earthquake on the Mendocino transform fault, using strong ground motion records from the Cascadia Initiative community experiment as well as onshore seismic networks. We estimate the rupture dimensions to be 6 km by 3 km and the stress drop to be 7 MPa on the transform fault. Next we investigate the frictional locking state of the subduction interface through afterslip simulation, based on coseismic rupture models of this 2015 earthquake and of a Mw 6.5 intraplate earthquake inside the Gorda plate whose slip distribution was inverted using the onshore geodetic network in a previous study. Different depths at which velocity-strengthening frictional properties begin downdip of the locked zone are used to simulate afterslip scenarios and predict the corresponding surface deformation (GPS) movements onshore. Our simulations indicate that the locking depth on the slab surface is at least 14 km, which confirms that the next M8 earthquake rupture will likely reach the coastline and that strong shaking should be expected near the coast.

  17. Insights from the Source Physics Experiments on P/S Amplitude Ratio Methods of Identifying Explosions in a Background of Earthquakes

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Xu, H.; Pasyanos, M. E.; Pyle, M. L.; Matzel, E.; Mellors, R. J.; Hauk, T. F.

    2012-12-01

    It is well established empirically that regional distance (200-1600 km) amplitude ratios of seismic P-to-S waves at sufficiently high frequencies (~>2 Hz) can identify explosions among a background of natural earthquakes. However, the physical basis for the generation of explosion S-waves, and therefore the predictability of this P/S technique as a function of event properties such as size, depth, geology and path, remains incompletely understood. A goal of the Source Physics Experiments (SPE) at the Nevada National Security Site (NNSS, formerly the Nevada Test Site (NTS)) is to improve our physical understanding of the mechanisms of explosion S-wave generation and advance our ability to numerically model and predict them. Current models of explosion P/S values suggest they are frequency dependent, with poor performance below the source corner frequencies and good performance above. This leads to expectations that small-magnitude explosions might require much higher frequencies (>10 Hz) to identify them. Interestingly, the 1-ton chemical source physics explosions SPE2 and SPE3 appear to discriminate well from background earthquakes in the frequency band 6-8 Hz, where P and S signals are visible at the NVAR array located near Mina, NV, about 200 km away. NVAR is a primary seismic station in the International Monitoring System (IMS), part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The NVAR broadband element NV31 is co-located with the LLNL station MNV, which recorded many NTS nuclear tests, allowing the comparison. We find that the small SPE explosions in granite have similar Pn/Lg values at 6-8 Hz to the past nuclear tests, conducted mainly in softer rocks. We are currently examining a number of other stations in addition to NVAR, including the dedicated SPE stations that recorded the SPE explosions at much closer distances with very high sample rates, in order to better understand the observed frequency dependence as compared with the model predictions. We plan to use these

  18. Ionospheric precursors to large earthquakes: A case study of the 2011 Japanese Tohoku Earthquake

    NASA Astrophysics Data System (ADS)

    Carter, B. A.; Kellerman, A. C.; Kane, T. A.; Dyson, P. L.; Norman, R.; Zhang, K.

    2013-09-01

    Researchers have reported ionospheric electron distribution abnormalities, such as electron density enhancements and/or depletions, that they claimed were related to forthcoming earthquakes. In this study, the Tohoku earthquake is examined using ionosonde data to establish whether any otherwise unexplained ionospheric anomalies were detected in the days and hours prior to the event. As the choices for the ionospheric baseline generally differ between previous works, three separate baselines for the peak plasma frequency of the F2 layer, foF2, are employed here: the running 30-day median (commonly used in other works), the International Reference Ionosphere (IRI) model and the Thermosphere Ionosphere Electrodynamic General Circulation Model (TIE-GCM). It is demonstrated that the classification of an ionospheric perturbation is heavily reliant on the baseline used, with the 30-day median, the IRI and the TIE-GCM generally underestimating, approximately describing and overestimating the measured foF2, respectively, in the 1-month period leading up to the earthquake. A detailed analysis of the ionospheric variability in the 3 days before the earthquake is then undertaken, where a simultaneous increase in foF2 and the Es layer peak plasma frequency, foEs, relative to the 30-day median was observed within 1 h before the earthquake. A statistical search for similar simultaneous foF2 and foEs increases in 6 years of data revealed that this feature has been observed on many other occasions without related seismic activity. Therefore, it is concluded that one cannot confidently use this type of ionospheric perturbation to predict an impending earthquake. It is suggested that in order to achieve significant progress in our understanding of seismo-ionospheric coupling, better account must be taken of other known sources of ionospheric variability in addition to solar and geomagnetic activity, such as thermospheric coupling.
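
    A running 30-day median baseline of the kind used here (and in many precursor studies) can be sketched as follows; the 25% anomaly threshold is an arbitrary illustrative choice, not one taken from this paper:

    ```python
    import statistics

    def running_median(values, window=30):
        """Trailing-window running median, the common foF2 baseline choice."""
        out = []
        for i in range(len(values)):
            lo = max(0, i - window + 1)
            out.append(statistics.median(values[lo:i + 1]))
        return out

    def anomalies(values, baseline, threshold=0.25):
        """Indices where |value - baseline| / baseline exceeds the threshold."""
        return [i for i, (v, b) in enumerate(zip(values, baseline))
                if b and abs(v - b) / b > threshold]
    ```

    As the abstract stresses, flagging a deviation this way depends heavily on the baseline chosen, and such anomalies also occur without any subsequent seismicity.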

  19. [Earthquakes in El Salvador].

    PubMed

    de Ville de Goyet, C

    2001-02-01

    The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualty management, communicable diseases, water supply, management of donations and international assistance, damage to health-facility infrastructure, mental health, and PAHO's role in disasters.

  20. Prediction monitoring and evaluation program; a progress report

    USGS Publications Warehouse

    Hunter, R.N.; Derr, J.S.

    1978-01-01

    As part of an attempt to separate useful predictions from inaccurate guesses, we have kept score on earthquake predictions from all sources brought to our attention over the past year and a half. The program was outlined in "Earthquake Prediction: Fact and Fallacy" by Roger N. Hunter (Earthquake Information Bulletin, vol. 8, no. 5, September-October 1976, p. 24-25). The program attracted a great deal of public attention and, as a result, our files now contain over 2,500 predictions from more than 230 different people. 

  1. MCEER, from Earthquake Engineering to Extreme Events | Home Page

    Science.gov Websites


  2. Why earthquakes correlate weakly with the solid Earth tides: Effects of periodic stress on the rate and probability of earthquake occurrence

    USGS Publications Warehouse

    Beeler, N.M.; Lockner, D.A.

    2003-01-01

    We provide an explanation why earthquake occurrence does not correlate well with the daily solid Earth tides. The explanation is derived from analysis of laboratory experiments in which faults are loaded to quasiperiodic failure by the combined action of a constant stressing rate, intended to simulate tectonic loading, and a small sinusoidal stress, analogous to the Earth tides. Event populations whose failure times correlate with the oscillating stress show two modes of response; the response mode depends on the stressing frequency. Correlation that is consistent with stress threshold failure models, e.g., Coulomb failure, results when the period of stress oscillation exceeds a characteristic time tn; the degree of correlation between failure time and the phase of the driving stress depends on the amplitude and frequency of the stress oscillation and on the stressing rate. When the period of the oscillating stress is less than tn, the correlation is not consistent with threshold failure models, and much higher stress amplitudes are required to induce detectable correlation with the oscillating stress. The physical interpretation of tn is the duration of failure nucleation. Behavior at the higher frequencies is consistent with a second-order dependence of the fault strength on sliding rate which determines the duration of nucleation and damps the response to stress change at frequencies greater than 1/tn. Simple extrapolation of these results to the Earth suggests a very weak correlation of earthquakes with the daily Earth tides, one that would require >13,000 earthquakes to detect. On the basis of our experiments and analysis, the absence of definitive daily triggering of earthquakes by the Earth tides requires that for earthquakes, tn exceeds the daily tidal period. The experiments suggest that the minimum typical duration of earthquake nucleation on the San Andreas fault system is ~1 year.
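
    The long-period, threshold-failure regime described here can be caricatured with a toy Coulomb threshold model: a constant stressing rate plus a sinusoidal oscillation, with failure at a fixed stress level. The resultant length of the failure-phase distribution then grows with oscillation amplitude, echoing the amplitude sensitivity reported in these experiments. All values below are illustrative, not the paper's laboratory parameters:

    ```python
    import math
    import random

    def failure_phase(deficit, rate, amp, period):
        """Step a Coulomb threshold model loaded by a constant rate plus a
        sinusoid; return the oscillation phase (0..2*pi) at first failure."""
        w = 2.0 * math.pi / period
        t, dt = 0.0, period / 500.0
        while rate * t + amp * math.sin(w * t) < deficit:
            t += dt
        return (w * t) % (2.0 * math.pi)

    def phase_concentration(amp, rate=1.0, period=1.0, n=200, seed=1):
        """Resultant length R of the failure-phase distribution
        (R -> 0 for uniform phases, R -> 1 for perfect phase locking)."""
        rng = random.Random(seed)
        c = s = 0.0
        for _ in range(n):
            ph = failure_phase(rng.uniform(2.0, 8.0), rate, amp, period)
            c += math.cos(ph)
            s += math.sin(ph)
        return math.hypot(c, s) / n
    ```

    A large oscillation amplitude relative to the stress accumulated per cycle phase-locks the failures, whereas a tidal-sized amplitude leaves the phases essentially uniform; capturing the frequency dependence (the nucleation time tn) would require a delayed-failure law such as rate-state friction, which this threshold toy deliberately omits.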

  3. Earthquake Early Warning and Public Policy: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Goltz, J. D.; Bourque, L.; Tierney, K.; Riopelle, D.; Shoaf, K.; Seligson, H.; Flores, P.

    2003-12-01

    warning systems and equity issues associated with possible differential access to warnings. Finally, we will review the status of legal authorities and liabilities faced by organizations that assume various warning system roles and possible approaches to setting up a pilot project to introduce early warning. Our presentation will suggest that introducing an early warning system requires multi-disciplinary and multi-agency cooperation and thoughtful discussion among organizations likely to be providers and participants in an early warning system. Recalling our experience with earthquake prediction, we will look at early warning as a promising but unproven technology and recommend moving forward with caution and patience.

  4. Predictors of psychological resilience amongst medical students following major earthquakes.

    PubMed

Carter, Frances; Bell, Caroline; Ali, Anthony; McKenzie, Janice; Boden, Joseph M; Wilkinson, Timothy

    2016-05-06

To identify predictors of self-reported psychological resilience amongst medical students following major earthquakes in Canterbury in 2010 and 2011. Two hundred and fifty-three medical students from the Christchurch campus, University of Otago, were invited to participate in an electronic survey seven months following the most severe earthquake. Students completed the Connor-Davidson Resilience Scale, the Depression, Anxiety and Stress Scale, the Post-traumatic Stress Disorder Checklist, the Work and Adjustment Scale, and the Eysenck Personality Questionnaire. Likert scales and other questions were also used to assess a range of variables including demographic and historical variables (eg, self-rated resilience prior to the earthquakes), plus the impacts of the earthquakes. The response rate was 78%. Univariate analyses identified multiple variables that were significantly associated with higher resilience. Multiple linear regression analyses produced a fitted model that was able to explain 35% of the variance in resilience scores. The best predictors of higher resilience were: retrospectively-rated personality prior to the earthquakes (higher extroversion and lower neuroticism); higher self-rated resilience prior to the earthquakes; not being exposed to the most severe earthquake; and less psychological distress following the earthquakes. Psychological resilience amongst medical students following major earthquakes could be predicted to a moderate extent.

  5. Toward a Better Nutritional Aiding in Disasters: Relying on Lessons Learned during the Bam Earthquake.

    PubMed

    Nekouie Moghadam, Mahmoud; Amiresmaieli, Mohammadreza; Hassibi, Mohammad; Doostan, Farideh; Khosravi, Sajad

    2017-08-01

Examining various problems in the aftermath of disasters is very important to the disaster victims. Managing and coordinating the food supply and its distribution among the victims is one of the most important problems after an earthquake. Therefore, the purpose of this study was to recognize problems and experiences in the field of nutritional aiding during an earthquake. This qualitative study was of phenomenological type. Using the purposive sampling method, 10 people who had experienced nutritional aiding during the Bam Earthquake (Iran; 2003) were interviewed. Colaizzi's method of analysis was used to analyze interview data. The findings of this study identified four main categories and 19 sub-categories concerning challenges in nutritional aiding during the Bam Earthquake. The main topics included managerial, aiding, infrastructural, and administrative problems. The major problems in nutritional aiding include the lack of prediction and development of a specific program for a suitable nutritional pattern and nutritional assessment of the victims in critical conditions. Forming specialized teams, educating team members about nutrition, and making use of experts' knowledge are the most important steps to resolve these problems in critical conditions; these measures are the duties of the relevant authorities. Nekouie Moghadam M, Amiresmaieli M, Hassibi M, Doostan F, Khosravi S. Toward a better nutritional aiding in disasters: relying on lessons learned during the Bam Earthquake. Prehosp Disaster Med. 2017;32(4):382-386.

  6. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    NASA Astrophysics Data System (ADS)

    Yue, Z.

    2013-12-01

A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds and over distances of ten to hundreds of kilometers in the Earth's crust and on the ground surface. It also generates seismic waves that can be received globally and co-seismic ground damage such as co-seismic ruptures and landslides. However, such vast, dramatic and devastating kinetic actions in the Earth's crustal rocks and on the ground soils cannot be known or predicted a few weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot predict earthquakes at present. Therefore, damaging earthquakes have caused and will continue to cause huge disasters, fatalities and injuries. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes in addition to the conventional cause of elastic rebound on active faults. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean and caused many casualties and devastating disasters to environments. The author will give a brief review of the impacts of the mega-earthquakes of recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrences of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, the 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, and the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch Earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena. He will use these phenomena and their common cause to show that the earthquakes were caused by the rapid migration and expansion of highly compressed and

  7. Earthquake early warning using P-waves that appear after initial S-waves

    NASA Astrophysics Data System (ADS)

    Kodera, Y.

    2017-12-01

To address underprediction for large earthquakes with finite faults and overprediction for multiple simultaneous earthquakes, Hoshiba (2013), Hoshiba and Aoki (2015), and Kodera et al. (2016) proposed earthquake early warning (EEW) methods that directly predict ground motion by computing the wave propagation of observed ground motion. These methods are expected to predict ground motion with high accuracy even for complicated scenarios because they do not need source parameter estimation. On the other hand, there is room for improvement in their rapidity because they predict strong motion mainly based on the observation of S-waves and do not explicitly use P-wave information available before the S-waves. In this research, we propose a real-time P-wave detector to incorporate P-wave information into these wavefield-estimation approaches. P-waves within a few seconds of the P-onsets are commonly used in many existing EEW methods. In addition, we focus on P-waves that may appear in the later part of seismic waves. Kurahashi and Irikura (2013) mentioned that P-waves radiated from strong motion generation areas (SMGAs) were recognizable after the S-waves of the initial rupture point in the 2011 off the Pacific coast of Tohoku earthquake (Mw 9.0) (the Tohoku-oki earthquake). Detecting these P-waves would enhance the rapidity of prediction for the peak ground motion generated by SMGAs. We constructed a real-time P-wave detector that uses a polarity analysis. Using acceleration records in boreholes of KiK-net (band-pass filtered around 0.5-10 Hz with site amplification correction), the P-wave detector performed principal component analysis with a sliding window of 4 s and calculated P-filter values (e.g. Ross and Ben-Zion, 2014). The application to the Tohoku-oki earthquake (Mw 9.0) showed that (1) peaks of the P-filter that corresponded to SMGAs appeared at several stations located near SMGAs and (2) real-time seismic intensities (Kunugi et al
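The sliding-window principal-component step described in this abstract can be sketched as follows. This is a hedged illustration, not the authors' implementation: the 50% window overlap and the rectilinearity-times-verticality indicator are assumptions loosely in the spirit of the P-filter of Ross and Ben-Zion (2014), and no band-pass filtering or site correction is included.

```python
import numpy as np

def p_filter(z, n, e, win):
    """Crude P-wave indicator per sliding window (50% overlap).

    z, n, e : equal-length 1-D arrays (vertical, north, east components)
    win     : window length in samples (e.g. 4 s times the sampling rate)
    """
    out = []
    for i in range(0, len(z) - win + 1, win // 2):
        seg = np.vstack([z[i:i + win], n[i:i + win], e[i:i + win]])
        w, v = np.linalg.eigh(np.cov(seg))      # eigenvalues in ascending order
        lam3, lam2, lam1 = w                    # lam1 is the largest
        rect = 1.0 - (lam2 + lam3) / (2.0 * lam1 + 1e-20)  # rectilinearity
        vert = abs(v[0, -1])                    # vertical share of principal axis
        out.append(rect * vert)                 # large for steep, linear (P-like) motion
    return np.array(out)
```

On a mostly vertical, linearly polarized signal (P-like) this indicator is near 1; on a mostly horizontal one (S-like) it drops toward 0, which is the property a detector of late-arriving P-waves buried in S-wave coda would exploit.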

  8. Shaking intensity from injection-induced versus tectonic earthquakes in the central-eastern United States

    USGS Publications Warehouse

    Hough, Susan E.

    2015-01-01

Although instrumental recordings of earthquakes in the central and eastern United States (CEUS) remain sparse, the U.S. Geological Survey's “Did you feel it?” (DYFI) system now provides excellent characterization of shaking intensities caused by induced and tectonic earthquakes. Seventeen CEUS events are considered between 2013 and 2015. It is shown that for 15 events, observed intensities at epicentral distances greater than ≈ 10 km are lower than expected given a published intensity-prediction equation for the region. Using simple published relations among intensity, magnitude, and stress drop, the results suggest that 15 of the 17 events have low stress drop. For those 15 events, intensities within ≈ 10-km epicentral distance are closer to predicted values, which can be explained as a consequence of relatively shallow source depths. The results suggest that those 15 events, most of which occurred in areas where induced earthquakes have occurred previously, were likely induced. Although moderate injection-induced earthquakes in the central and eastern United States will be felt widely because of low regional attenuation, the damage from shallow earthquakes induced by injection will be more localized to event epicenters than shaking from tectonic earthquakes, which tend to be somewhat deeper. Within approximately 10 km of the epicenter, intensities are generally commensurate with predicted levels expected for the event magnitude.

  9. Prospective testing of Coulomb short-term earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

Earthquake induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short term models daily, and allows daily updates of the models. However, lots can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be made by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  10. Gravitational body forces focus North American intraplate earthquakes

    USGS Publications Warehouse

    Levandowski, William Brower; Zellman, Mark; Briggs, Richard

    2017-01-01

    Earthquakes far from tectonic plate boundaries generally exploit ancient faults, but not all intraplate faults are equally active. The North American Great Plains exemplify such intraplate earthquake localization, with both natural and induced seismicity generally clustered in discrete zones. Here we use seismic velocity, gravity and topography to generate a 3D lithospheric density model of the region; subsequent finite-element modelling shows that seismicity focuses in regions of high-gravity-derived deviatoric stress. Furthermore, predicted principal stress directions generally align with those observed independently in earthquake moment tensors and borehole breakouts. Body forces therefore appear to control the state of stress and thus the location and style of intraplate earthquakes in the central United States with no influence from mantle convection or crustal weakness necessary. These results show that mapping where gravitational body forces encourage seismicity is crucial to understanding and appraising intraplate seismic hazard.

  11. Gravitational body forces focus North American intraplate earthquakes

    PubMed Central

    Levandowski, Will; Zellman, Mark; Briggs, Rich

    2017-01-01

    Earthquakes far from tectonic plate boundaries generally exploit ancient faults, but not all intraplate faults are equally active. The North American Great Plains exemplify such intraplate earthquake localization, with both natural and induced seismicity generally clustered in discrete zones. Here we use seismic velocity, gravity and topography to generate a 3D lithospheric density model of the region; subsequent finite-element modelling shows that seismicity focuses in regions of high-gravity-derived deviatoric stress. Furthermore, predicted principal stress directions generally align with those observed independently in earthquake moment tensors and borehole breakouts. Body forces therefore appear to control the state of stress and thus the location and style of intraplate earthquakes in the central United States with no influence from mantle convection or crustal weakness necessary. These results show that mapping where gravitational body forces encourage seismicity is crucial to understanding and appraising intraplate seismic hazard. PMID:28211459

  12. Earthquake precursory events around epicenters and local active faults

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, H.; Mansor, S. B.; Haydari Azad, F.

    2013-05-01

The chain of underground events which are triggered by seismic activities and physical/chemical interactions prior to a shake in the earth's crust may produce surface and above-surface phenomena. During the past decades many researchers have sought the possibility of short-term earthquake prediction using remote sensing data. Currently, there are several theories about the preparation stages of earthquakes, most of which stress rises in heat and seismic waves as the main signs of an impending earthquake. Their differences lie only in the secondary phenomena which are triggered by these events. In any case, with the recent advances in remote sensing sensors and techniques we are now able to provide wider, more accurate monitoring of land, ocean and atmosphere. Among all theoretical factors, changes in Surface Latent Heat Flux (SLHF), Sea and Land Surface Temperature (SST and LST) and surface chlorophyll-a are easier to record from earth-observing satellites. SLHF is the amount of energy exchanged in the form of water vapor between the earth's surface and atmosphere. Abnormal variations in this factor have been frequently reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In the case of oceanic earthquakes, higher temperature at the ocean bed may lead to higher amounts of Chl-a on the sea surface. On the other hand, it has also been said that the leak of radon gas, which occurs as rocks break during earthquake preparation, causes the formation of airborne ions and higher Air Temperature (AT). We have chosen to perform a statistical, long-term, and short-term approach by considering the reoccurrence intervals of past

  13. Strong ground motion of the 2016 Kumamoto earthquake

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Kunugi, T.; Suzuki, W.; Kubo, H.; Morikawa, N.; Fujiwara, H.

    2016-12-01

The 2016 Kumamoto earthquake sequence consists of an Mw 6.1 event that occurred in the Kumamoto region at 21:26 on April 14 and an Mw 7.1 event 28 hours later, at 1:25 on April 16, 2016 (JST). These earthquakes are considered to have ruptured mainly the Hinagu fault zone for the Mw 6.1 event and the Futagawa fault zone for the Mw 7.1 event, for which the Headquarters for Earthquake Research Promotion had performed long-term evaluation as well as seismic hazard assessment prior to the 2016 Kumamoto earthquake. Strong shaking of seismic intensity 7 on the JMA scale was observed four times in total: at Mashiki town for the Mw 6.1 and Mw 7.1 events, at Nishihara village for the Mw 7.1 event, and at NIED/KiK-net Mashiki (KMMH16) for the Mw 7.1 event. KiK-net Mashiki (KMMH16) recorded peak ground acceleration of more than 1000 cm/s/s, and Nishihara village recorded peak ground velocity of more than 250 cm/s. Ground motions were observed over a wider area for the Mw 7.1 event than for the Mw 6.1 event. Peak ground accelerations and peak ground velocities at K-NET/KiK-net stations are consistent with the ground motion prediction equations of Si and Midorikawa (1999). Peak ground velocities at distances greater than 200 km attenuate slowly, which can be attributed to the large Love wave with a dominant period around 10 seconds. The 5%-damped pseudo-spectral velocity at Mashiki town shows a peak at periods of 1-2 s that exceeds the ground motion responses of JR Takatori in the 1995 Kobe earthquake and Kawaguchi town in the 2004 Chuetsu earthquake. The 5%-damped pseudo-spectral velocity at Nishihara village shows a 350 cm/s peak at periods of 3-4 s, similar to several stations in the Kathmandu basin reported by Takai et al. (2016) for the 2015 Gorkha earthquake in Nepal. Ground motions at several stations in Oita exceed the ground motion prediction equations due to an earthquake induced by the Mw 7.1 event. Peak ground acceleration at K-NET Yufuin (OIT009) reached 90 cm/s/s for the Mw 7

  14. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

    USGS Publications Warehouse

    Coordinated by Bakun, William H.; Prescott, William H.

    1993-01-01

Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were: * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, indicating that a potential for another damaging earthquake in the Santa Cruz Mountains in the near future may still exist. * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults. * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake. * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it. * Geological and geophysical mapping indicate that, although the San Andreas Fault can be mapped as a through-going fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

  15. Pore-fluid migration and the timing of the 2005 M8.7 Nias earthquake

    USGS Publications Warehouse

    Hughes, K.L.H.; Masterlark, Timothy; Mooney, W.D.

    2011-01-01

Two great earthquakes have occurred recently along the Sunda Trench, the 2004 M9.2 Sumatra-Andaman earthquake and the 2005 M8.7 Nias earthquake. These earthquakes ruptured over 1600 km of adjacent crust within 3 mo of each other. We quantitatively present poroelastic deformation analyses suggesting that postseismic fluid flow and recovery induced by the Sumatra-Andaman earthquake advanced the timing of the Nias earthquake. Simple back-slip simulations indicate that the megapascal (MPa)-scale pore-pressure recovery is equivalent to 7 yr of interseismic Coulomb stress accumulation near the Nias earthquake hypocenter, implying that pore-pressure recovery of the Sumatra-Andaman earthquake advanced the timing of the Nias earthquake by ~7 yr. That is, in the absence of postseismic pore-pressure recovery, we predict that the Nias earthquake would have occurred in 2011 instead of 2005. © 2011 Geological Society of America.
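The clock-advance reasoning in this abstract is simple division: a coseismically induced stress change divided by the interseismic stressing rate gives the advance in failure time. Both numbers below are assumptions chosen only to reproduce the ~7 yr scale the abstract states; the study's actual values are not quoted here.

```python
# Back-of-envelope clock-advance arithmetic (assumed illustrative numbers).
pore_pressure_recovery_mpa = 0.35    # assumed MPa-scale postseismic stress change
interseismic_rate_mpa_per_yr = 0.05  # assumed tectonic Coulomb stressing rate

# Stress change equivalent to N years of accumulation advances failure by N years.
clock_advance_yr = pore_pressure_recovery_mpa / interseismic_rate_mpa_per_yr
```

With these assumed values the advance is ~7 yr, matching the abstract's statement that the Nias earthquake occurred in 2005 rather than a predicted 2011.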

  16. Stress drops of induced and tectonic earthquakes in the central United States are indistinguishable.

    PubMed

    Huang, Yihe; Ellsworth, William L; Beroza, Gregory C

    2017-08-01

Induced earthquakes currently pose a significant hazard in the central United States, but there is considerable uncertainty about the severity of their ground motions. We measure stress drops of 39 moderate-magnitude induced and tectonic earthquakes in the central United States and eastern North America. Induced earthquakes, more than half of which are shallower than 5 km, show a comparable median stress drop to tectonic earthquakes in the central United States that are dominantly strike-slip but a lower median stress drop than that of tectonic earthquakes in eastern North America that are dominantly reverse-faulting. This suggests that ground motion prediction equations developed for tectonic earthquakes can be applied to induced earthquakes if the effects of depth and faulting style are properly considered. Our observation leads to the notion that, similar to tectonic earthquakes, induced earthquakes are driven by tectonic stresses.

  17. Earthquake Potential in Myanmar

    NASA Astrophysics Data System (ADS)

    Aung, Hla Hla

The Myanmar region is generally believed to be an area of high earthquake potential, although its seismic activity has been low compared with surrounding regions like Indonesia, China, and Pakistan. Geoscientists and seismologists have predicted earthquakes to occur in the area north of the Sumatra-Andaman Islands, i.e. the southwest and west part of Myanmar. The tectonic setting of Myanmar relative to East and SE Asia is rather peculiar and unique, with different plate tectonic models, but is similar to the setting of the western part of North America. Myanmar crustal blocks are caught between the two lithospheric plates of India and Indochina, experiencing oblique subduction with major dextral strike-slip faulting along the Sagaing fault. Seismic tomography and the thermal structure of the India plate along the Sunda subduction zone vary from south to north. Strong partitioning occurs in the central Andaman basin, where crustal fragmentation and northward dispersion of the Burma plate by a back-arc spreading mechanism have been operating since the Neogene. Northward motion of the Burma plate relative to SE Asia would dock it against the major continent further north and might cause the accumulation of strain, which in turn will be released as earthquakes in the future.

  18. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  19. Proposal as to Efficient Collection and Exploitation of Earthquake Damage Information and Verification by Field Experiment at Toyohashi City

    NASA Astrophysics Data System (ADS)

    Zama, Shinsaku; Endo, Makoto; Takanashi, Ken'ichi; Araiba, Kiminori; Sekizawa, Ai; Hosokawa, Masafumi; Jeong, Byeong-Pyo; Hisada, Yoshiaki; Murakami, Masahiro

Based on the earlier finding that damage information can be gathered quickly in a municipality with a smaller population, it is proposed that damage information be gathered and analyzed using an area roughly equivalent to a primary school district as the basic unit. The introduction of this type of decentralized system is expected to quickly gather important information on each area. The information gathered by these communal disaster-prevention bases is sent to the disaster-prevention headquarters, which in turn feeds back more extensive information over a wider area to the communal disaster-prevention bases. Concrete systems have been developed according to this framework, and we performed large-scale experiments simulating disaster information collection, transmission, and utilization for smooth responses to an earthquake disaster, in collaboration with Toyohashi City, Aichi Prefecture, which is considered very likely to suffer extensive damage from the anticipated Tokai and Tonankai Earthquakes. Using disaster information collection/transmission equipment composed of a long-distance wireless LAN, a notebook computer, a Web camera and an IP telephone, city staff could easily input and transmit information, such as fires, collapsed houses and impassable roads, collected by the inhabitants who participated in the experiment. Headquarters could confirm such information on an automatically plotted map, as well as the state of each disaster-prevention facility, by means of Web cameras and IP telephones. Based on the damage information, fire-spreading, evacuation, and traffic simulations were automatically executed at the disaster countermeasure office, and their results were displayed on a large screen for use in making decisions such as residents' evacuation. These simulated results were simultaneously displayed at each disaster-prevention facility and served to make people understand the

  20. Fault failure with moderate earthquakes

    USGS Publications Warehouse

    Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

    1987-01-01

High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987.

  1. Statistical earthquake focal mechanism forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2014-04-01

Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each gridpoint. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
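One simple way to realize a distance-weighted combination of focal mechanisms, as described in this abstract, is to sum normalized moment tensors of nearby events with distance-decaying weights and renormalize. This is an illustrative sketch only: the 1/(d + r) kernel and the smoothing scale are assumptions, not the Kagan & Jackson weighting, and no tangent-plane rotation correction is included.

```python
import numpy as np

def weighted_mechanism(tensors, dists_km, r_km=200.0):
    """Distance-weighted, renormalized sum of 3x3 moment tensors (sketch).

    tensors  : list of 3x3 symmetric moment tensor arrays
    dists_km : distance of each event from the forecast gridpoint
    r_km     : assumed smoothing scale damping the weight at short distance
    """
    total = np.zeros((3, 3))
    for m, d in zip(tensors, dists_km):
        total += (m / np.linalg.norm(m)) / (d + r_km)  # assumed 1/(d + r) weight
    return total / np.linalg.norm(total)
```

Normalizing each tensor before summing keeps a single large nearby event from dominating purely by magnitude; when all contributing mechanisms agree, the combination returns that shared mechanism.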

  2. Izmit, Turkey 1999 Earthquake Interferogram

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas lie outside the SAR image or are covered by water (seas and lakes). The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul.
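The fringe-to-displacement arithmetic in the caption can be reproduced directly: ERS operates at a C-band wavelength of about 56.6 mm, so each fringe is half a wavelength of line-of-sight (LOS) motion, and converting LOS to horizontal motion requires an assumed incidence angle (about 23° for ERS; the exact value varies across the swath). A sketch under those assumptions:

```python
import math

# ERS C-band radar wavelength (~56.6 mm); each interferometric fringe
# corresponds to half a wavelength of motion along the line of sight (LOS).
WAVELENGTH_MM = 56.6
LOS_PER_FRINGE_MM = WAVELENGTH_MM / 2.0   # ~28.3 mm, the "28 mm" in the caption

# Assumed ERS incidence angle; for purely horizontal motion the LOS change
# is the horizontal displacement projected onto the look direction.
INCIDENCE_DEG = 23.0

def fringes_to_displacement_mm(n_fringes):
    """LOS and approximate horizontal displacement for a given fringe count."""
    los = n_fringes * LOS_PER_FRINGE_MM
    horizontal = los / math.sin(math.radians(INCIDENCE_DEG))
    return los, horizontal

los_1, horiz_1 = fringes_to_displacement_mm(1)  # one full color cycle
```

With these values one fringe maps to about 28 mm toward the satellite and roughly 70 mm of horizontal motion, matching the caption's numbers.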

  3. Investigating Lushan Earthquake Victims' Individual Behavior Response and Rescue Organization.

    PubMed

    Kang, Peng; Lv, Yipeng; Deng, Qiangyu; Liu, Yuan; Zhang, Yi; Liu, Xu; Zhang, Lulu

    2017-12-11

    Research concerning the impact of earthquake victims' individual behavior and its association with earthquake-related injuries is lacking. This study examined this relationship along with the effectiveness of earthquake rescue measures. The six most severely destroyed townships during the Lushan earthquake were examined; 28 villages and three earthquake victims' settlement camp areas were selected as research areas. Inclusion criteria comprised living in Lushan county for a long time, living in Lushan county during the 2013 Lushan earthquake, and having one's home destroyed. Earthquake victims with an intellectual disability or communication problems were excluded. The earthquake victims (N = 5165; 2396 male) completed a questionnaire (response rate: 94.7%). Among them, 209 were injured (5.61%). Teachers (p < 0.0001, odds ratio (OR) = 3.33) and medical staff (p = 0.001, OR = 4.35) were more vulnerable to the earthquake than were farmers. Individual behaviors, such as the first reaction after the earthquake and fear, were directly related to injuries. There is an obvious connection between earthquake-related injury and individual behavior characteristics. It is strongly suggested that victims receive mental health support from medical practitioners and the government to minimize negative effects. The initial reaction after an earthquake also played a vital role in victims' trauma; therefore, earthquake-related experience and education may prevent injuries. Self-aid and mutual help played key roles in emergency medical rescue efforts.
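The reported odds ratios come from standard 2×2-table arithmetic: the odds of injury in one group divided by the odds in a reference group. The counts below are hypothetical, purely to show the computation, and are not the study's data:

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio from a 2x2 table: odds of injury in one group over a reference group."""
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# Hypothetical counts (NOT the study's data), e.g. injured/uninjured teachers
# versus injured/uninjured farmers:
or_example = odds_ratio(10, 90, 5, 195)
```

An OR above 1 (as for the teachers and medical staff in the abstract) means higher injury odds than the reference occupation.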

  4. Dense Array Studies of Volcano-Tectonic and Long-Period Earthquakes Beneath Mount St. Helens

    NASA Astrophysics Data System (ADS)

    Glasgow, M. E.; Hansen, S. M.; Schmandt, B.; Thomas, A.

    2017-12-01

    An array of 904 single-component 10-Hz geophones deployed within 15 km of Mount St. Helens (MSH) in 2014 recorded continuously for two weeks. Automated reverse-time imaging (RTI) was used to generate a catalog of 212 earthquakes. Among these, two distinct types of upper crustal (<8 km) earthquakes were classified. Volcano-tectonic (VT) and long-period (LP) earthquakes were identified using analysis of array spectrograms, envelope functions, and velocity waveforms. To remove analyst subjectivity, quantitative classification criteria were developed based on the ratio of power in high and low frequency bands and coda duration. Prior to the 2014 experiment, upper crustal LP earthquakes had only been reported at MSH during volcanic activity. Subarray beamforming was used to distinguish between LP earthquakes and surface generated LP signals, such as rockfall. This method confirmed 16 LP signals with horizontal velocities exceeding that of upper crustal P-wave velocities, which requires a subsurface hypocenter. LP and VT locations overlap in a cluster slightly east of the summit crater from 0-5 km below sea level. LP displacement spectra are similar to simple theoretical predictions for shear failure except that they have lower corner frequencies than VT earthquakes of similar magnitude. The results indicate a distinct non-resonant source for LP earthquakes which are located in the same source volume as some VT earthquakes (within hypocenter uncertainty of 1 km or less). To further investigate MSH microseismicity mechanisms, an array of 142 three-component (3-C) 5-Hz geophones will record continuously for one month at MSH in Fall 2017, providing a unique dataset for a volcano earthquake source study. This array will help determine if LP occurrence in 2014 was transient or if it is still ongoing. Unlike the 2014 array, approximately 50 geophones will be deployed in the MSH summit crater directly over the majority of seismicity. RTI will be used to detect and locate earthquakes by
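A band-power-ratio classifier of the kind described (high-frequency versus low-frequency power plus a threshold) can be sketched as follows; the 5 Hz split, the threshold, and the naive DFT are illustrative choices, not the study's calibrated criteria:

```python
import cmath, math

def band_power(signal, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi) Hz via a naive DFT (adequate for short windows)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power

def classify_event(signal, fs, split_hz=5.0, ratio_threshold=1.0):
    """'VT' if high-frequency power dominates, else 'LP'.  The split frequency
    and threshold are illustrative, not the study's calibrated values."""
    low = band_power(signal, fs, 0.5, split_hz)
    high = band_power(signal, fs, split_hz, fs / 2.0)
    return 'VT' if high / (low + 1e-20) > ratio_threshold else 'LP'

# Synthetic test signals sampled at 100 Hz:
fs, n = 100.0, 256
lp_like = [math.sin(2 * math.pi * 2.0 * t / fs) for t in range(n)]   # 2 Hz tone
vt_like = [math.sin(2 * math.pi * 10.0 * t / fs) for t in range(n)]  # 10 Hz tone
```

A production version would also use the coda duration mentioned in the abstract as a second discriminant.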

  5. Earthquake Preparedness Among Japanese Hemodialysis Patients in Prefectures Heavily Damaged by the 2011 Great East Japan Earthquake.

    PubMed

    Sugisawa, Hidehiro; Shimizu, Yumiko; Kumagai, Tamaki; Sugisaki, Hiroaki; Ohira, Seiji; Shinoda, Toshio

    2017-08-01

    The purpose of this study was to explore the factors related to earthquake preparedness in Japanese hemodialysis patients. We focused on three aspects of the related factors: health condition factors, social factors, and the experience of disasters. A mail survey of all the members of the Japan Association of Kidney Disease Patients in three Japanese prefectures (N = 4085) was conducted in March 2013. We obtained 1841 valid responses for analysis. The health factors covered were: activities of daily living (ADL), mental distress, primary renal diseases, and the duration of dialysis. The social factors were: socioeconomic status, family structure, informational social support, and the provision of information regarding earthquake preparedness from dialysis facilities. The results show that the average percentage of participants that had met each criterion of earthquake preparedness in 2013 was 53%. Hemodialysis patients without disabled ADL, without mental distress, and with longer durations of dialysis were likely to meet more of the earthquake preparedness criteria. Hemodialysis patients who had received informational social support from family or friends, lived with a spouse and children (compared with living alone), and had obtained information regarding earthquake preparedness from dialysis facilities were also likely to meet more of the earthquake preparedness criteria. © 2017 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  6. Directivity in NGA earthquake ground motions: Analysis using isochrone theory

    USGS Publications Warehouse

    Spudich, P.; Chiou, B.S.J.

    2008-01-01

    We present correction factors that may be applied to the ground motion prediction relations of Abrahamson and Silva, Boore and Atkinson, Campbell and Bozorgnia, and Chiou and Youngs (all in this volume) to model the azimuthally varying distribution of the GMRotI50 component of ground motion (commonly called 'directivity') around earthquakes. Our correction factors may be used for planar or nonplanar faults having any dip or slip rake (faulting mechanism). Our correction factors predict directivity-induced variations of spectral acceleration that are roughly half of the strike-slip variations predicted by Somerville et al. (1997), and use of our factors reduces record-to-record sigma by about 2-20% at 5 sec or greater period. © 2008, Earthquake Engineering Research Institute.

  7. Prompt gravity signal induced by the 2011 Tohoku-Oki earthquake

    PubMed Central

    Montagner, Jean-Paul; Juhel, Kévin; Barsuglia, Matteo; Ampuero, Jean Paul; Chassande-Mottin, Eric; Harms, Jan; Whiting, Bernard; Bernard, Pascal; Clévédé, Eric; Lognonné, Philippe

    2016-01-01

    Transient gravity changes are expected to occur at all distances during an earthquake rupture, even before the arrival of seismic waves. Here we report on the search for such a prompt gravity signal in data recorded by a superconducting gravimeter and broadband seismometers during the 2011 Mw 9.0 Tohoku-Oki earthquake. During the earthquake rupture, a signal exceeding the background noise is observed with a statistical significance higher than 99% and an amplitude of a fraction of μGal, consistent in sign and order of magnitude with theoretical predictions from a first-order model. While prompt gravity signal detection with state-of-the-art gravimeters and seismometers is challenged by background seismic noise, its robust detection with gravity gradiometers under development could open new directions in earthquake seismology, and overcome fundamental limitations of current earthquake early-warning systems imposed by the propagation speed of seismic waves. PMID:27874858

  8. Prompt gravity signal induced by the 2011 Tohoku-Oki earthquake.

    PubMed

    Montagner, Jean-Paul; Juhel, Kévin; Barsuglia, Matteo; Ampuero, Jean Paul; Chassande-Mottin, Eric; Harms, Jan; Whiting, Bernard; Bernard, Pascal; Clévédé, Eric; Lognonné, Philippe

    2016-11-22

    Transient gravity changes are expected to occur at all distances during an earthquake rupture, even before the arrival of seismic waves. Here we report on the search for such a prompt gravity signal in data recorded by a superconducting gravimeter and broadband seismometers during the 2011 Mw 9.0 Tohoku-Oki earthquake. During the earthquake rupture, a signal exceeding the background noise is observed with a statistical significance higher than 99% and an amplitude of a fraction of μGal, consistent in sign and order of magnitude with theoretical predictions from a first-order model. While prompt gravity signal detection with state-of-the-art gravimeters and seismometers is challenged by background seismic noise, its robust detection with gravity gradiometers under development could open new directions in earthquake seismology, and overcome fundamental limitations of current earthquake early-warning systems imposed by the propagation speed of seismic waves.

  9. Prompt gravity anomaly induced by the 2011 Tohoku-Oki earthquake

    NASA Astrophysics Data System (ADS)

    Montagner, Jean-Paul; Juhel, Kevin; Barsuglia, Matteo; Ampuero, Jean-Paul; Harms, Jan; Chassande-Mottin, Eric; Whiting, Bernard; Bernard, Pascal; Clévédé, Eric; Lognonné, Philippe

    2017-04-01

    Transient gravity changes are expected to occur at all distances during an earthquake rupture, even before the arrival of seismic waves. Here we report on the search for such a prompt gravity signal in data recorded by a superconducting gravimeter and broadband seismometers during the 2011 Mw 9.0 Tohoku-Oki earthquake. During the earthquake rupture, a signal exceeding the background noise is observed with a statistical significance higher than 99% and an amplitude of a fraction of μGal, consistent in sign and order-of-magnitude with theoretical predictions from a first-order model. While prompt gravity signal detection with state-of-the-art gravimeters and seismometers is challenged by background seismic noise, its robust detection with gravity gradiometers under development could open new directions in earthquake seismology, and overcome fundamental limitations of current earthquake early-warning systems (EEWS) imposed by the propagation speed of seismic waves.

  10. Real Time Earthquake Information System in Japan

    NASA Astrophysics Data System (ADS)

    Doi, K.; Kato, T.

    2003-12-01

    The Japan Meteorological Agency (JMA) monitors earthquake data and analyzes earthquake activity and tsunami occurrence round-the-clock on a real-time basis. In addition, JMA has for a decade been developing a Nowcast Earthquake Information system that can notify users of an earthquake's occurrence before strong ground motion arrives. The Earthquake Research Institute of the University of Tokyo, in collaboration with JMA, is preparing a demonstrative experiment on better utilization of Nowcast Earthquake Information to apply practical measures that reduce earthquake disasters caused by strong ground motion.

  11. A moment in time: emergency nurses and the Canterbury earthquakes.

    PubMed

    Richardson, S; Ardagh, M; Grainger, P; Robinson, V

    2013-06-01

    To outline the impact of the Canterbury, New Zealand (NZ) earthquakes on Christchurch Hospital, and the experiences of emergency nurses during this time. NZ has experienced earthquakes and aftershocks centred in the Canterbury region of the South Island. The location of these, around and within the major city of Christchurch, was unexpected and associated with previously unknown fault lines. While the highest magnitude quake occurred in September 2010, registering 7.1 on the Richter scale, it was the magnitude 6.3 event on 22 February 2011 that was associated with the greatest injury burden and loss of life. Staff working in the only emergency department in the city were faced with an external emergency while also being directly affected as part of the disaster. SOURCES OF EVIDENCE: This paper developed following interviews with nurses who worked during this period, and draws on literature related to healthcare responses to earthquakes and natural disasters. The establishment of an injury database allowed an accurate picture of the injury burden to emerge, and each of the authors was present and worked in a clinical capacity during the earthquake. Nurses played a significant role in the response to the earthquakes and their aftermath. However, little is known regarding the impact of this, either in personal or professional terms. This paper presents an overview of the earthquakes and the experiences of nurses working during this time, identifying a range of issues that will benefit from further exploration and research. It seeks to provide a sense of the experiences and the potential meanings that were derived from being part of this 'moment in time'. Examples of innovations in practice emerged during the earthquake response and a number of recommendations for nursing practice are identified. © 2013 The Authors. International Nursing Review © 2013 International Council of Nurses.

  12. Robust Satellite Techniques (RST) for monitoring earthquake prone areas by satellite TIR observations: The case of 1999 Chi-Chi earthquake (Taiwan)

    NASA Astrophysics Data System (ADS)

    Genzano, N.; Filizzola, C.; Paciello, R.; Pergola, N.; Tramutoli, V.

    2015-12-01

    For more than 13 years, a multi-temporal data-analysis method named Robust Satellite Techniques (RST) has been applied to satellite Thermal InfraRed (TIR) monitoring of seismically active regions. It gives a clear definition of a TIR anomaly within a validation/confutation scheme devoted to verifying whether detected anomalies can be associated with the time and location of major earthquakes. In this scheme, the confutation part (i.e. verifying that similar anomalies do not occur in the absence of significant seismic activity) assumes a role even more important than the usual validation component, which verifies the presence of anomalous signal transients before (or in association with) specific seismic events. Since 2001, the RST approach has been used to study tens of earthquakes with a wide range of magnitudes (from 4.0 to 7.9) that occurred on different continents and in various geo-tectonic settings. In this paper such long-term experience is exploited to give a quantitative definition of a significant sequence of TIR anomalies (SSTA) in terms of the required space-time continuity constraints (persistence), also identifying the different typologies of known spurious sequences of TIR anomalies that have to be excluded from the subsequent validation steps. On the same basis, and taking into account the physical models proposed to justify a correlation between TIR anomalies and earthquake occurrence, specific validation rules (in line with the ones used by the Collaboratory for the Study of Earthquake Predictability - CSEP - Project) have been defined to drive the validation process.
In this work, such an approach is applied for the first time to a long-term dataset of night-time GMS-5/VISSR (Geostationary Meteorological Satellite/Visible and Infrared Spin-Scan Radiometer) TIR measurements, comparing SSTAs and earthquakes with M > 4 which occurred in a wide area around Taiwan, in the month of September of
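The core of an RST-style anomaly test is a normalized departure of the current TIR signal from its historical behaviour at the same place and calendar period, flagged when it exceeds a few standard deviations. The values and threshold below are illustrative, and the exact RST index differs in detail:

```python
import statistics

def rst_index(current_value, reference_values):
    """Normalized departure of the current TIR signal from its historical
    mean at the same place and calendar period (an ALICE-like index; the
    exact RST formulation differs in detail)."""
    mu = statistics.mean(reference_values)
    sigma = statistics.stdev(reference_values)
    return (current_value - mu) / sigma

# Illustrative brightness-temperature departures (K) from past Septembers
# at one pixel (hypothetical numbers, not GMS-5/VISSR data):
reference = [0.1, -0.3, 0.2, 0.0, -0.1, 0.3, -0.2, 0.1]
index = rst_index(1.8, reference)
is_anomaly = index > 2.0   # threshold is illustrative
```

Persistence (the SSTA requirement) would then be imposed by demanding that such exceedances cluster in space and recur over consecutive acquisitions.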

  13. Global risk of big earthquakes has not recently increased.

    PubMed

    Shearer, Peter M; Stark, Philip B

    2012-01-17

    The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences--if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past.
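One way to check a declustered catalog against a homogeneous Poisson process, in the spirit of the tests described, is a Kolmogorov-Smirnov comparison of inter-event times with an exponential distribution. This sketch is not the authors' procedure; note also that fitting the rate from the same data makes the standard critical values only approximate (the Lilliefors effect):

```python
import math, random

def ks_statistic_exponential(times):
    """KS distance between inter-event times and a fitted exponential.

    Estimating the rate from the same data biases the usual KS critical
    values (Lilliefors effect); this is a sketch, not the authors' test."""
    gaps = sorted(t2 - t1 for t1, t2 in zip(times, times[1:]))
    rate = 1.0 / (sum(gaps) / len(gaps))
    d, n = 0.0, len(gaps)
    for i, g in enumerate(gaps):
        cdf = 1.0 - math.exp(-rate * g)
        d = max(d, abs(cdf - i / n), abs(cdf - (i + 1) / n))
    return d

# Synthetic declustered catalog: a homogeneous Poisson process in time.
random.seed(0)
t, events = 0.0, []
for _ in range(500):
    t += random.expovariate(1.0)   # mean inter-event time of 1 "year"
    events.append(t)

d_stat = ks_statistic_exponential(events)
critical_5pct = 1.36 / math.sqrt(len(events) - 1)   # asymptotic 5% KS value
```

For a truly Poissonian catalog the KS distance stays well below the critical value, which is the sense in which the declustered global catalog is "not distinguishable" from Poisson.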

  14. Global risk of big earthquakes has not recently increased

    PubMed Central

    Shearer, Peter M.; Stark, Philip B.

    2012-01-01

    The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences—if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past. PMID:22184228

  15. Next-Level ShakeZoning for Earthquake Hazard Definition in Nevada

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Savran, W. H.; Flinchum, B. A.; Dudley, C.; Prina, N.; Pullammanappallil, S.; Pancha, A.

    2011-12-01

    We are developing "Next-Level ShakeZoning" procedures tailored for defining earthquake hazards in Nevada. The current federally sponsored tools (the USGS hazard maps, ShakeMap, and FEMA HAZUS) were developed as statistical summaries to match earthquake data from California, Japan, and Taiwan. The 2008 Wells and Mogul events in Nevada showed in particular that the generalized statistical approach taken by ShakeMap cannot match actual data on shaking from earthquakes in the Intermountain West, even to first order. Next-Level ShakeZoning relies on physics and geology, rather than statistics, to define earthquake shaking hazards. It follows theoretical and computational developments made over the past 20 years to capitalize on detailed and specific local data sets, to more accurately model the propagation and amplification of earthquake waves through the multiple geologic basins of the Intermountain West. Excellent new data sets are now available for Las Vegas Valley. Clark County, Nevada has completed the nation's first effort to map earthquake hazard class systematically through an entire urban area using Optim's SeisOpt ReMi technique, which was adapted for large-scale data collection. Using the new Parcel Map in computing shaking in the Valley for scenario earthquakes is crucial for obtaining realistic predictions of ground motions. In an educational element of the project, a dozen undergraduate students have been computing 50 separate earthquake scenarios affecting Las Vegas Valley, using the Next-Level ShakeZoning process. Despite characterizing only the upper 30 meters, the Vs30 geotechnical shear-velocity from the Parcel Map shows clear effects on 3-d shaking predictions computed so far at frequencies from 0.1 Hz up to 1.0 Hz. The effect of the Parcel Map on even the 0.1-Hz waves is prominent despite the large mismatch of wavelength to geotechnical depths. Amplifications and de-amplifications affected by the Parcel Map exceed a factor of two, and are
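Vs30, the quantity mapped by the Parcel Map, is the time-averaged shear-wave velocity over the top 30 m: 30 m divided by the vertical shear-wave travel time through the layers above 30 m depth. A minimal sketch with an illustrative two-layer profile:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.

    `layers` is a list of (thickness_m, vs_m_per_s) from the surface down;
    the profile is truncated at 30 m depth."""
    remaining, travel_time = 30.0, 0.0
    for thickness, vs in layers:
        h = min(thickness, remaining)
        travel_time += h / vs
        remaining -= h
        if remaining <= 0.0:
            break
    return 30.0 / travel_time

# Illustrative profile: 10 m of soft soil over stiffer sediments.
site_velocity = vs30([(10.0, 200.0), (30.0, 400.0)])   # 300 m/s
```

The travel-time (harmonic-style) average is what makes a thin soft layer pull Vs30 down disproportionately, which is why shallow geotechnical data affect even long-wavelength shaking predictions.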

  16. Understanding dynamic friction through spontaneously evolving laboratory earthquakes

    PubMed Central

    Rubino, V.; Rosakis, A. J.; Lapusta, N.

    2017-01-01

    Friction plays a key role in how ruptures unzip faults in the Earth’s crust and release waves that cause destructive shaking. Yet dynamic friction evolution is one of the biggest uncertainties in earthquake science. Here we report on novel measurements of evolving local friction during spontaneously developing mini-earthquakes in the laboratory, enabled by our ultrahigh speed full-field imaging technique. The technique captures the evolution of displacements, velocities and stresses of dynamic ruptures, with rupture speeds ranging from sub-Rayleigh to supershear. The observed friction has complex evolution, featuring initial velocity strengthening followed by substantial velocity weakening. Our measurements are consistent with rate-and-state friction formulations supplemented with flash heating but not with widely used slip-weakening friction laws. This study develops a new approach for measuring local evolution of dynamic friction and has important implications for understanding earthquake hazard since laws governing frictional resistance of faults are vital ingredients in physically-based predictive models of the earthquake source. PMID:28660876
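The rate-and-state comparison rests on the steady-state friction law mu_ss = mu_0 + (a - b) ln(V/V_0): velocity strengthening when a > b, weakening when a < b. Parameter values below are generic laboratory-scale numbers, not the paper's fits:

```python
import math

def steady_state_friction(v, mu0=0.6, a=0.010, b=0.015, v0=1e-6):
    """Steady-state rate-and-state friction: mu = mu0 + (a - b) * ln(v / v0).

    Illustrative parameters only.  With a < b the interface is velocity
    weakening, the behaviour observed at high slip rates in the experiments."""
    return mu0 + (a - b) * math.log(v / v0)

mu_slow = steady_state_friction(1e-6)  # at the reference velocity: exactly mu0
mu_fast = steady_state_friction(1e-1)  # five decades faster: lower friction
```

Flash heating adds a further high-velocity weakening term on top of this logarithmic dependence, which is the supplement the abstract refers to.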

  17. Stress drops of induced and tectonic earthquakes in the central United States are indistinguishable

    PubMed Central

    Huang, Yihe; Ellsworth, William L.; Beroza, Gregory C.

    2017-01-01

    Induced earthquakes currently pose a significant hazard in the central United States, but there is considerable uncertainty about the severity of their ground motions. We measure stress drops of 39 moderate-magnitude induced and tectonic earthquakes in the central United States and eastern North America. Induced earthquakes, more than half of which are shallower than 5 km, show a median stress drop comparable to that of tectonic earthquakes in the central United States, which are dominantly strike-slip, but lower than that of tectonic earthquakes in eastern North America, which are dominantly reverse-faulting. This suggests that ground motion prediction equations developed for tectonic earthquakes can be applied to induced earthquakes if the effects of depth and faulting style are properly considered. Our observation leads to the notion that, similar to tectonic earthquakes, induced earthquakes are driven by tectonic stresses. PMID:28782040
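Stress-drop measurements of this kind are commonly based on the Brune (1970) circular-crack model, which converts a corner frequency into a source radius and then a stress drop. The sketch below uses that standard estimate with generic parameter values, not necessarily this study's exact procedure:

```python
import math

def brune_stress_drop(m0, fc, beta=3500.0):
    """Stress drop (Pa) from seismic moment m0 (N*m) and corner frequency fc (Hz)
    under the Brune (1970) circular-crack model; beta is shear-wave speed (m/s)."""
    radius = 2.34 * beta / (2.0 * math.pi * fc)   # source radius (m)
    return 7.0 * m0 / (16.0 * radius ** 3)

# Mw 4.0 event: m0 = 10**(1.5 * Mw + 9.1) N*m
m0 = 10 ** (1.5 * 4.0 + 9.1)
delta_sigma = brune_stress_drop(m0, fc=2.0)   # on the order of a few MPa
```

Typical tectonic stress drops fall in the 0.1-100 MPa range, so a few MPa for a moderate event is an unremarkable, mid-range value.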

  18. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. 
They should

  19. Scaling Relations of Earthquakes on Inland Active Mega-Fault Systems

    NASA Astrophysics Data System (ADS)

    Murotani, S.; Matsushima, S.; Azuma, T.; Irikura, K.; Kitagawa, S.

    2010-12-01

    Since 2005, the Headquarters for Earthquake Research Promotion (HERP) has been publishing the 'National Seismic Hazard Maps for Japan' to provide useful information for disaster prevention countermeasures for the country and local public agencies, as well as to promote public awareness of earthquake disaster prevention. In the course of making the 2009 version of the map, which commemorates the tenth anniversary of the settlement of the Comprehensive Basic Policy, methods to evaluate earthquake magnitude, predict strong ground motion, and construct underground structure models were investigated by the Earthquake Research Committee and its subcommittees. In order to predict the magnitude of earthquakes occurring on mega-fault systems, we examined the scaling relations for mega-fault systems using 11 earthquakes whose source processes were analyzed by waveform inversion and whose surface rupture information was investigated. As a result, we found that the data fit between the scaling relations of seismic moment and rupture area by Somerville et al. (1999) and Irikura and Miyake (2001). We also found that the maximum surface-rupture displacement is two to three times the average slip on the seismic fault, and that the surface fault length equals the length of the source fault. Furthermore, compiled data from the source faults show that displacement saturates at 10 m when fault length (L) exceeds 100 km. By assuming a fault width (W) of 18 km, the average for inland earthquakes in Japan, and displacement saturation at 10 m for lengths over 100 km, we derived a new scaling relation between source area and seismic moment, S [km^2] = 1.0 x 10^-17 M0 [Nm], for mega-fault systems whose seismic moment (M0) exceeds 1.8 × 10^20 Nm.
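The derived scaling relation is easy to exercise numerically. Because S grows linearly with M0, combining it with M0 = mu * D * S implies a roughly constant average slip, about 3 m for an assumed rigidity mu = 3.3 × 10^10 Pa (our assumption, not a value from the abstract):

```python
def rupture_area_km2(m0_nm):
    """HERP-style scaling for mega-fault systems (valid for M0 > 1.8e20 N*m):
    S [km^2] = 1.0e-17 * M0 [N*m]."""
    return 1.0e-17 * m0_nm

def average_slip_m(m0_nm, mu=3.3e10):
    """Average slip implied by M0 = mu * D * S (mu is an assumed rigidity, Pa)."""
    s_m2 = rupture_area_km2(m0_nm) * 1.0e6   # km^2 -> m^2
    return m0_nm / (mu * s_m2)

area = rupture_area_km2(3.6e20)   # 3600 km^2
slip = average_slip_m(3.6e20)     # ~3 m, independent of M0 in this regime
```

The constant implied slip is the signature of the L-model regime, in which displacement has saturated while rupture length keeps growing.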

  20. Development of a borehole stress meter for studying earthquake predictions and rock mechanics, and stress seismograms of the 2011 Tohoku earthquake (M 9.0)

    NASA Astrophysics Data System (ADS)

    Ishii, Hiroshi; Asai, Yasuhiro

    2015-02-01

    Although precursory signs of an earthquake can occur before the event, it is difficult to observe such signs with precision, especially on the Earth's surface where artificial noise and other factors complicate signal detection. One possible solution to this problem is to install monitoring instruments in deep bedrock where earthquakes are likely to begin. When evaluating earthquake occurrence, it is necessary to elucidate the processes of stress accumulation in a medium and its subsequent release as a fault (crack) is generated, and to do so, the stress must be observed continuously. However, continuous observations of stress have not yet been implemented in earthquake monitoring programs. Strain is a secondary physical quantity whose value depends on the elastic coefficients of the medium, and it can yield potentially valuable information as well. This article describes the development of a borehole stress meter that is capable of recording both stress and strain continuously at a depth of about 1 km. Specifically, this paper introduces the design principles of the stress meter as well as its actual structure. It also describes a newly developed calibration procedure and the results obtained to date for stress and strain studies of deep boreholes at three locations in Japan. As examples of the observations, records of stress seismic waveforms generated by the 2011 Tohoku earthquake (M 9.0) are presented. The results demonstrate that the stress meter data have sufficient precision and reliability.

  1. On the reported ionospheric precursor of the Hector Mine, California earthquake

    USGS Publications Warehouse

    Thomas, J.N.; Love, J.J.; Komjathy, A.; Verkhoglyadova, O.P.; Butala, M.; Rivera, N.

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identified as being anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  2. Challenges to communicate risks of human-caused earthquakes

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2014-12-01

    Awareness of natural hazards has been trending upward in recent years. In particular, this is true for earthquakes, which are increasing in frequency and magnitude in regions that normally do not experience seismic activity. In fact, one of the major concerns for many communities and businesses is that humans today seem to cause earthquakes through large-scale shale gas production, dewatering and flooding of mines, and deep geothermal power production. Accordingly, without opposing any of these technologies, it should be a priority of earth scientists who research natural hazards to communicate earthquake risks. This presentation discusses the challenges earth scientists face in properly communicating earthquake risks, in light of the fact that human-caused earthquakes are an environmental change affecting only some communities and businesses. Communication channels may range from research papers, books and classroom lectures to outreach events and programs, popular media events, or even social media networks.

  3. Geoelectric precursors to strong earthquakes in China

    NASA Astrophysics Data System (ADS)

    Yulin, Zhao; Fuye, Qian

    1994-05-01

    The main results of searching for electrical precursors to strong earthquakes in China over the last 25 yr are presented. These comprise: the continuous twenty-year resistivity record before and after the great Tangshan earthquake of 1976; spatial and temporal variations in resistivity anomalies observed at more than 6 stations within 150 km of the Tangshan earthquake epicenter; the travel-time curve for the front of the resistivity precursor; and a method of intersection for predicting the epicenter location. These results reveal a number of interesting facts: (1) Resistivity measurements with accuracies of 0.5% or better for over 20 yr show that resistivity decreases of several percent, which began approximately 3 yr prior to the Tangshan earthquake, were larger than the background fluctuations and hence statistically significant. An outstanding example of an intermediate-term resistivity precursor is given. (2) The intermediate-term resistivity decrease before the Tangshan earthquake is such a pervasive phenomenon that the mean decrease, in percent, can be contoured on a map of the Beijing-Tianjin-Tangshan region. This shows the maximum decrease centered over the epicenter. (3) The anomalies in resistivity and self-potential, which began 2-0.5 months before the Tangshan main shock, had periods equal to those of the tidal waves M2 and MSf, respectively, so the associated anomalies can be identified as impending-earthquake precursors, and a model related to stress-displacement weakening is proposed.

  4. Earthquake stress drop and laboratory-inferred interseismic strength recovery

    USGS Publications Warehouse

    Beeler, N.M.; Hickman, S.H.; Wong, T.-F.

    2001-01-01

    We determine the scaling relationships between earthquake stress drop and recurrence interval t_r that are implied by laboratory-measured fault strength. We assume that repeating earthquakes can be simulated by stick-slip sliding using a spring and slider block model. Simulations with static/kinetic strength, time-dependent strength, and rate- and state-variable-dependent strength indicate that the relationship between loading velocity and recurrence interval can be adequately described by the power law V_L ∝ t_r^n, where n = −1. Deviations from n = −1 arise from second-order effects on strength, with n > −1 corresponding to apparent time-dependent strengthening and n < −1 corresponding to weakening. Simulations with rate- and state-variable equations show that dynamic shear stress drop Δτ_d scales with recurrence as dΔτ_d/dln t_r ∝ σ_e(b − a), where σ_e is the effective normal stress, μ = τ/σ_e, and (a − b) = dμ_ss/dln V is the steady-state slip-rate dependence of strength. In addition, accounting for seismic energy radiation, we suggest that the static shear stress drop Δτ_s scales as dΔτ_s/dln t_r ∝ σ_e(1 + η)(b − a), where η is the fractional overshoot. The variation of Δτ_s with ln t_r for earthquake stress drops is somewhat larger than implied by room-temperature laboratory values of η and b − a. However, the uncertainty associated with the seismic data is large and the discrepancy between the seismic observations and the rate of strengthening predicted by room-temperature experiments is less than an order of magnitude. Copyright 2001 by the American Geophysical Union.
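
The simplest case described in the abstract, a spring-and-slider block with constant static/kinetic friction, can be sketched numerically; with a rate-independent stress drop the reload time is inversely proportional to loading velocity, recovering n = −1 in V_L ∝ t_r^n. All parameter values below are hypothetical illustrations, not the authors' values.

```python
import numpy as np

def recurrence_interval(v_load, k=1.0e6, sigma_e=50e6, mu_s=0.60, mu_k=0.55):
    """Time between slip events for a slider loaded at v_load (m/s).

    With static/kinetic strength the stress drop
    delta_tau = (mu_s - mu_k) * sigma_e is constant, so the spring
    (stiffness k, in Pa/m) must reload delta_tau at rate k * v_load,
    giving t_r = delta_tau / (k * v_load).
    """
    delta_tau = (mu_s - mu_k) * sigma_e
    return delta_tau / (k * v_load)

# Sample several plate-rate-like loading velocities (m/s, hypothetical).
velocities = np.array([1e-9, 1e-8, 1e-7])
t_r = np.array([recurrence_interval(v) for v in velocities])

# The slope of log V_L versus log t_r is the exponent n; here it is -1.
slope = np.polyfit(np.log(t_r), np.log(velocities), 1)[0]
```

Second-order strengthening or weakening effects, as the abstract notes, would bend this log-log line away from a slope of exactly −1.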

  5. Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S. A.; Spencer, B. D.

    2015-12-01

    The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well they actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted ground motion should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that the factors controlling map performance are complicated, and that learning more about how maps perform, and why, would be valuable in making more effective policy.
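
The two performance metrics contrasted in the abstract are easy to state computationally: the fraction of sites where observed shaking exceeds the mapped value, and the mean squared misfit between observed maxima and predictions. The sketch below uses synthetic placeholder data, not the Japanese shaking record, purely to make the two metrics concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical maximum observed shaking at 1000 sites (arbitrary units).
observed_max = rng.lognormal(mean=0.0, sigma=0.5, size=1000)

# A "hazard map" tuned so shaking exceeds it at ~10% of sites, and a
# uniform map set to the mean observation (illustrative constructions).
hazard_map = np.full(1000, np.quantile(observed_max, 0.9))
uniform_map = np.full(1000, observed_max.mean())

def exceedance_fraction(pred, obs):
    """Metric implicit in the maps: fraction of sites where obs > pred."""
    return np.mean(obs > pred)

def squared_misfit(pred, obs):
    """Alternative metric: mean squared difference between obs and pred."""
    return np.mean((obs - pred) ** 2)

f_haz = exceedance_fraction(hazard_map, observed_max)
```

Note that a constant map placed at the mean minimizes squared misfit among all constant maps, while a map placed at a high quantile minimizes exceedance; the two metrics reward different map designs, which is one way to read the abstract's mixed results.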

  6. Earthquakes for Kids

    MedlinePlus

    ... across a fault to learn about past earthquakes. Science Fair Projects A GPS instrument measures slow movements of the ground. Become an Earthquake Scientist Cool Earthquake Facts Today in Earthquake History A scientist stands in ...

  7. Statistical distributions of earthquake numbers: consequence of branching process

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2010-03-01

    We discuss various statistical distributions of earthquake numbers. Previously, we derived several discrete distributions to describe earthquake numbers for the branching model of earthquake occurrence: these distributions are the Poisson, geometric, logarithmic and the negative binomial (NBD). The theoretical model is the `birth and immigration' population process. The first three distributions above can be considered special cases of the NBD. In particular, a point branching process along the magnitude (or log seismic moment) axis with independent events (immigrants) explains the magnitude/moment-frequency relation and the NBD of earthquake counts in large time/space windows, as well as the dependence of the NBD parameters on the magnitude threshold (magnitude of an earthquake catalogue completeness). We discuss applying these distributions, especially the NBD, to approximate event numbers in earthquake catalogues. There are many different representations of the NBD. Most can be traced either to the Pascal distribution or to the mixture of the Poisson distribution with the gamma law. We discuss advantages and drawbacks of both representations for statistical analysis of earthquake catalogues. We also consider applying the NBD to earthquake forecasts and describe the limits of the application for the given equations. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrence, the NBD has two parameters. The second parameter can be used to characterize clustering or overdispersion of a process. We determine the parameter values and their uncertainties for several local and global catalogues, and their subdivisions in various time intervals, magnitude thresholds, spatial windows, and tectonic categories. The theoretical model of how the clustering parameter depends on the corner (maximum) magnitude can be used to predict future earthquake number distribution in regions where very large earthquakes have not yet occurred.
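
The gamma-mixed Poisson representation of the NBD mentioned in the abstract can be checked with a short simulation: drawing Poisson rates from a gamma law yields overdispersed counts (variance greater than the mean), which is exactly what the NBD's second parameter captures relative to the one-parameter Poisson. The parameter values are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(42)

# NBD parameters (r failures, success probability p) -- hypothetical values.
r, p = 2.0, 0.3

# Mixture construction: lambda ~ Gamma(shape=r, scale=(1-p)/p),
# counts | lambda ~ Poisson(lambda)  =>  counts ~ NBD(r, p).
lam = rng.gamma(shape=r, scale=(1 - p) / p, size=200_000)
counts = rng.poisson(lam)

# Theoretical NBD moments for comparison with the simulated counts.
mean_th = r * (1 - p) / p      # = 4.666...
var_th = mean_th / p           # overdispersed: variance = mean / p > mean
```

A Poisson model with the same mean would force variance = mean; the ratio var/mean here is 1/p, so p (or its reciprocal) serves as the clustering/overdispersion parameter the abstract describes.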

  8. Two critical tests for the Critical Point earthquake

    NASA Astrophysics Data System (ADS)

    Tzanis, A.; Vallianatos, F.

    2003-04-01

    areas of negative stress transfer (stress shadows), i.e. the reverse effect, which should be observed if energy were extracted from a fault system. In both cases the critical exponent of the accelerating sequence at the positive-stress-transfer regions is very close to 0.25, consistent with the view of the fault network as a Self-Organizing Spinodal moving toward a first-order phase transition. The reported observations are consistent with almost all of the theoretical predictions and expectations made in terms of the critical point / stress transfer model of seismogenesis. However, there are reservations as to whether they comprise bona fide predictions. Time-to-failure modelling of accelerated seismicity is a relatively new field of study with few case histories from which to draw experience, most of which in fact comprise retrospective analyses of past earthquakes. Still, very little is known about the development of real-time situations and their probability of success or failure. Also, the power-law scaling is essentially the result of a renormalisation, in which the process of failure at a small spatial scale and temporally far from a global event can be remapped to the process of failure at a larger scale and closer to the global event. In consequence, when new elements are added (i.e. large foreshocks), the sequence is renormalized and the predicted parameters may change, sometimes significantly. Yet another difficulty arises from the fact that even if a full-scale self-organising process is active in the critical area, it is not at all necessary that a large earthquake will occur as soon as the system enters the critical state. The critical point model merely predicts that past this time an earthquake is possible but not certain. The time of the large event may depend on several uncertain factors pertaining to the nucleation process, which may have significant time dependence of their own.
Moreover, the stored energy may be dissipated with aseismic (low moment

  9. Characteristics of strong motions and damage implications of M S6.5 Ludian earthquake on August 3, 2014

    NASA Astrophysics Data System (ADS)

    Xu, Peibin; Wen, Ruizhi; Wang, Hongwei; Ji, Kun; Ren, Yefei

    2015-02-01

    The Ludian County of Yunnan Province in southwestern China was struck by an MS6.5 earthquake on August 3, 2014, another destructive event following the MS8.0 Wenchuan earthquake in 2008, the MS7.1 Yushu earthquake in 2010, and the MS7.0 Lushan earthquake in 2013. The National Strong-Motion Observation Network System of China collected 74 strong-motion recordings; the maximum peak ground acceleration, recorded by station 053LLT in Longtoushan Town, was 949 cm/s2 in the E-W component. The observed PGAs and spectral ordinates were compared with a ground-motion prediction equation for China and with NGA-West2, developed by the Pacific Earthquake Engineering Research Center. This earthquake is considered the first case for testing the applicability of NGA-West2 in China. Results indicate that the observed PGAs and the 5 % damped pseudo-response spectral accelerations are significantly lower than the predicted ones. A field survey around some typical strong-motion stations verified that the earthquake damage was consistent with the official isoseismal map issued by the China Earthquake Administration.
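
The comparison described in the abstract amounts to computing residuals between observed PGAs and a ground-motion prediction equation (GMPE). The sketch below uses a generic illustrative functional form, ln(PGA) = a + b·M − c·ln(R + d); its coefficients and the station distances are hypothetical, not those of the Chinese model or NGA-West2, so no conclusion about the sign of the residuals follows from it.

```python
import numpy as np

def gmpe_ln_pga(mag, r_km, a=-2.0, b=1.2, c=1.6, d=10.0):
    """Generic GMPE form with made-up coefficients (illustration only)."""
    return a + b * mag - c * np.log(r_km + d)

# Observed PGAs (cm/s^2): the first value is the 053LLT record quoted in
# the abstract; the others, and all distances, are hypothetical.
obs_pga = np.array([949.0, 120.0, 35.0])
r_km = np.array([5.0, 30.0, 80.0])

# Residual in log space: positive means observation above prediction.
resid = np.log(obs_pga) - gmpe_ln_pga(6.5, r_km)
```

With a real GMPE the mean residual, binned by distance and period, is what reveals the systematic under- or over-prediction the abstract reports.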

  10. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grained materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  11. A review on remotely sensed land surface temperature anomaly as an earthquake precursor

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Anshuman; Singh, Shaktiman; Sam, Lydia; Joshi, P. K.; Bhardwaj, Akanksha; Martín-Torres, F. Javier; Kumar, Rajesh

    2017-12-01

    The low predictability of earthquakes and the high uncertainty associated with their forecasts make earthquakes one of the worst natural calamities, capable of causing instant loss of life and property. Here, we discuss the studies reporting the observed anomalies in the satellite-derived Land Surface Temperature (LST) before an earthquake. We compile the conclusions of these studies and evaluate the use of remotely sensed LST anomalies as precursors of earthquakes. The arrival times and the amplitudes of the anomalies vary widely, thus making it difficult to consider them as universal markers to issue earthquake warnings. Based on the randomness in the observations of these precursors, we support employing a global-scale monitoring system to detect statistically robust anomalous geophysical signals prior to earthquakes before considering them as definite precursors.

  12. Experiences of municipal public health nurses following Japan's earthquake, tsunami, and nuclear disaster.

    PubMed

    Kayama, Mami; Akiyama, Tsuyoshi; Ohashi, Akiko; Horikoshi, Naoko; Kido, Yoshifumi; Murakata, Tazuko; Kawakami, Norito

    2014-01-01

    The purpose of this study was to explore the experiences of municipal public health nurses in the wake of the March 2011 massive earthquake and tsunami and resulting nuclear accident in Fukushima, Japan, from the time of the disaster until December 2013. Thirty-two public health nurses working in three cities in Fukushima prefecture were divided into four focus groups and took part in interviews, which were analyzed using a qualitative descriptive method. Two major themes were extracted: (1) experiences of difficulties and dilemmas, and (2) professional challenges and the meaning of excellence as a public health nurse. Subjects recounted their experiences based on the timeline of events. The process of overcoming various dilemmas--between prescribed roles and actual needs on the ground, being both civil servants and private citizens with families, and having to be publicly accountable while lacking adequate information--caused participants to reexamine the meaning of excellence in the practice of public health. The strenuous and complex demands of extended disaster management caused subjects to grow professionally. Helping them process their emotions should also help these nurses give focus to their posttraumatic growth, and strengthen their sense of professionalism. © 2014 Wiley Periodicals, Inc.

  13. Radon anomaly in soil gas as an earthquake precursor.

    PubMed

    Miklavcić, I; Radolić, V; Vuković, B; Poje, M; Varga, M; Stanić, D; Planinić, J

    2008-10-01

    The mechanical processes of earthquake preparation are always accompanied by deformations; afterwards, complex short- or long-term precursory phenomena can appear. Anomalies of radon concentrations in soil gas are registered a few weeks or months before many earthquakes. Radon concentrations in soil gas were continuously measured by LR-115 nuclear track detectors at site A (Osijek) during a 4-year period, as well as by a Barasol semiconductor detector at site B (Kasina) during 2 years. We investigated the influence of the meteorological parameters on the temporal radon variations, and we determined the equation of the multiple regression that enabled the reduction (deconvolution) of the radon variation caused by barometric pressure, rainfall and temperature. The pre-earthquake radon anomalies at site A indicated 46% of the seismic events meeting the criterion M ≥ 3, R < 200 km, and 21% at site B. Empirical equations relating earthquake magnitude, epicentral distance and precursor time enabled estimation or prediction of an earthquake that will occur at epicentral distance R from the monitoring site within the expected precursor time T.
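
The multiple-regression "deconvolution" the abstract describes, removing the part of the radon signal explained by barometric pressure, rainfall and temperature so that residual anomalies stand out, can be sketched with an ordinary least-squares fit. The time series below are synthetic placeholders with made-up response coefficients, not the Osijek or Kasina data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 365  # one year of daily values (hypothetical)

# Synthetic meteorological drivers.
pressure = 1013 + 5 * rng.standard_normal(n)                 # hPa
rainfall = rng.exponential(2.0, n)                           # mm/day
temperature = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 365)  # deg C

# Synthetic radon series responding to the drivers plus noise
# (coefficients -0.5, 0.8, 0.3 are invented for illustration).
radon = (30 - 0.5 * (pressure - 1013) + 0.8 * rainfall
         + 0.3 * temperature + rng.standard_normal(n))

# Multiple regression: radon ~ intercept + pressure + rainfall + temperature.
X = np.column_stack([np.ones(n), pressure, rainfall, temperature])
coef, *_ = np.linalg.lstsq(X, radon, rcond=None)

# Residual series with the meteorological contribution removed; precursory
# anomalies would be sought as excursions in this residual.
residual = radon - X @ coef
```

The point of the procedure is visible in the variances: the residual fluctuates far less than the raw series, so a genuine pre-earthquake excursion is easier to flag against it.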

  14. Did you feel it? Community-made earthquake shaking maps

    USGS Publications Warehouse

    Wald, D.J.; Wald, L.A.; Dewey, J.W.; Quitoriano, Vince; Adams, Elisabeth

    2001-01-01

    Since the early 1990s, the magnitude and location of an earthquake have been available within minutes on the Internet. Now, as a result of work by the U.S. Geological Survey (USGS) and with the cooperation of various regional seismic networks, people who experience an earthquake can go online and share information about its effects to help create a map of shaking intensities and damage. Such 'Community Internet Intensity Maps' (CIIMs) contribute greatly to quickly assessing the scope of an earthquake emergency, even in areas lacking seismic instruments.

  15. Recovering from the ShakeOut earthquake

    USGS Publications Warehouse

    Wein, Anne; Johnson, Laurie; Bernknopf, Richard

    2011-01-01

    Recovery from an earthquake like the M7.8 ShakeOut Scenario will be a major endeavor taking many years to complete. Hundreds of Southern California municipalities will be affected; most lack recovery plans or previous disaster experience. To support recovery planning, this paper 1) extends the regional ShakeOut Scenario analysis into the recovery period using a recovery model, 2) localizes analyses to identify longer-term impacts and issues in two communities, and 3) considers the regional context of local recovery. Key community insights about preparing for post-disaster recovery include the need to: geographically diversify city procurement; set earthquake mitigation priorities for critical infrastructure (e.g., the airport); plan to replace mobile homes with earthquake safety measures in mind; consider post-earthquake redevelopment opportunities ahead of time; and develop post-disaster recovery management and governance structures. This work also showed that communities with minor damage are still sensitive to regional infrastructure damage and its potential long-term impacts on community recovery. This highlights the importance of community and infrastructure resilience strategies as well.

  16. What is the earthquake fracture energy?

    NASA Astrophysics Data System (ADS)

    Di Toro, G.; Nielsen, S. B.; Passelegue, F. X.; Spagnuolo, E.; Bistacchi, A.; Fondriest, M.; Murphy, S.; Aretusini, S.; Demurtas, M.

    2016-12-01

    The energy budget of an earthquake is one of the main open questions in earthquake physics. During seismic rupture propagation, the elastic strain energy stored in the rock volume that bounds the fault is converted into (1) gravitational work (relative movement of the wall rocks bounding the fault), (2) in- and off-fault damage of the fault zone rocks (due to rupture propagation and frictional sliding), (3) frictional heating and, of course, (4) seismic radiated energy. The difficulty in determining the budget arises from the measurement of some parameters (e.g., the temperature increase in the slipping zone, which constrains the frictional heat), from the poorly constrained size of the energy sinks (e.g., how large is the rock volume involved in off-fault damage?) and from the continuous exchange of energy between different sinks (for instance, fragmentation and grain size reduction may result from both the passage of the rupture front and frictional heating). Field geology studies, microstructural investigations, experiments and modelling may yield some hints. Here we discuss (1) the discrepancies arising from comparing the fracture energy measured in experiments reproducing seismic slip with that estimated from seismic inversion for natural earthquakes and (2) the off-fault damage induced by the diffusion of frictional heat during simulated seismic slip in the laboratory. Our analysis suggests, for instance, that the so-called earthquake fracture energy (1) is mainly frictional heat for small slips and (2), with increasing slip, is controlled by the geometrical complexity and other plastic processes occurring in the damage zone. As a consequence, because faults are rapidly and efficiently lubricated upon fast slip initiation, the dominant dissipation mechanism in large earthquakes may not be friction but rather the off-fault damage due to fault segmentation and stress concentrations in a growing region around the fracture tip.

  17. A Modified Split Hopkinson Pressure Bar Approach for Mimicking Dynamic Oscillatory Stress Fluctuations During Earthquake Rupture

    NASA Astrophysics Data System (ADS)

    Braunagel, M. J.; Griffith, W. A.

    2017-12-01

    Past experimental work has demonstrated that rock failure at high strain rates occurs by fragmentation rather than discrete fracture and is accompanied by a dramatic increase in rock strength. However, these observations are difficult to reconcile with the assertion that pulverized rocks in fault zones are the product of impulsive stresses during the passage of earthquake ruptures, as some pulverized rocks lie too far from the principal slip zones for the fragmentation transition to have been exceeded. One suggested explanation for this paradox is that repeated loading over the course of multiple earthquake ruptures may gradually reduce the pulverization threshold, in terms of both strain rate and strength. We propose that oscillatory loading during a single earthquake rupture may further lower these pulverization thresholds, and that traditional dynamic experimental approaches, such as the Split Hopkinson Pressure Bar (SHPB), wherein load is applied as a single, smooth, sinusoidal compressive wave, may not reflect natural loading conditions. To investigate the effects of oscillatory compressive loading expected during earthquake rupture propagation, we develop a controlled cyclic loading method on a SHPB apparatus utilizing two striker bars connected by an elastic spring. Unlike traditional SHPB experiments, which use a gas gun to fire a projectile bar and generate a single compressive wave on impact with the incident bar, our modified striker bar assembly oscillates while moving down the gun barrel and generates two separate compressive pulses separated by a lag time. By modeling the modified assembly as a mass-spring-mass system accelerating under the force of the released gas, we can predict the compression time of the spring upon impact and therefore the time delay between the generation of the first and second compressive waves. This allows us to predictably control load cycles with durations of only a few hundred microseconds.
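
For the idealized mass-spring-mass picture in the abstract, the spring stays in contact for half an oscillation period of the reduced mass of the two strikers, which sets the lag between the two compressive pulses. The sketch below uses hypothetical masses and stiffness, not the authors' apparatus values, but shows how the lag lands in the few-hundred-microsecond range the abstract mentions.

```python
import numpy as np

def pulse_lag(m1, m2, k):
    """Spring contact time on impact: half period of the reduced-mass
    oscillator, t = pi * sqrt(m_red / k), for a linear spring of
    stiffness k (N/m) joining strikers of mass m1 and m2 (kg)."""
    m_red = m1 * m2 / (m1 + m2)   # reduced mass of the two-striker system
    return np.pi * np.sqrt(m_red / k)

# Hypothetical striker masses and spring stiffness.
lag = pulse_lag(m1=2.0, m2=2.0, k=4.0e7)   # seconds
```

Stiffer springs or lighter strikers shorten the lag, which is the tuning knob for the cycle durations quoted at the end of the abstract.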
Initial

  18. Constraints on behaviour of a mining‐induced earthquake inferred from laboratory rock mechanics experiments

    USGS Publications Warehouse

    McGarr, Arthur F.; Johnston, Malcolm J.; Boettcher, M.; Heesakkers, V.; Reches, Z.

    2013-01-01

    On December 12, 2004, an earthquake of magnitude 2.2, located in the TauTona Gold Mine at a depth of about 3.65 km in the ancient Pretorius fault zone, was recorded by the in-mine borehole seismic network, yielding an excellent set of ground motion data recorded at hypocentral distances of several km. From these data, the seismic moment tensor, indicating mostly normal faulting with a small implosive component, and the radiated energy were measured; the deviatoric component of the moment tensor was estimated to be M0 = 2.3×1012 N·m and the radiated energy ER = 5.4×108 J. This event caused extensive damage along tunnels within the Pretorius fault zone. What rendered this earthquake of particular interest was the underground investigation of the complex pattern of exposed rupture surfaces combined with laboratory testing of rock samples retrieved from the ancient fault zone (Heesakkers et al. 2011a, 2011b). The 12 December 2004 event was the result of fault slip across at least four nonparallel fault surfaces; 25 mm of slip was measured at one location on the rupture segment that is most parallel with a fault plane inferred from the seismic moment tensor, suggesting that this segment accounted for much of the total seismic deformation. By applying a recently developed technique based on biaxial stick-slip friction experiments (McGarr 2012, 2013) to the seismic results, together with the 25 mm slip observed underground, we estimated a maximum slip rate of at least 6.6 m/s, which is consistent with the observed damage to tunnels in the rupture zone. Similarly, the stress drop and apparent stress were found to be correspondingly high at 21.9 MPa and 6.6 MPa, respectively. The ambient state of stress, measured at the approximate depth of the earthquake but away from the influence of mining, in conjunction with laboratory measurements of the strength of the fault zone cataclasites, indicates that during rupture of the M 2.2 event, the normal stress acting on the large-slip fault

  19. Earthquake Hazard Assessment: Basics of Evaluation

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA) is not an easy task; it implies a delicate application of statistics to data of limited size and differing accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or centuries in just a few rare cases). Evidently, all this complicates reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic within the most popular objectivists' viewpoint on probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of the SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management. 
These basics of SHA evaluation are briefly exemplified with a few examples, which are analysed in more detail in a poster of

  20. Urban Policies and Earthquake Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Sarlo, Antonella

    2008-07-01

    The paper offers some reflections on recent research on earthquake risk mitigation that combines mitigation policies and actions with urban development strategies. The objective was to go beyond the classical methodological approach, which aims at a "technical" evaluation of earthquake risk through a procedure correlating the three components of hazard, exposure and vulnerability. These research projects experiment, in terms of methodology and application, with a new interpretive and strategic category: the so-called Struttura Urbana Minima (minimum urban structure). The introduction of the Struttura Urbana Minima establishes a different approach to safety in the field of earthquake risk, since it leads to a wider viewpoint, combining the building-level aspects of the issue with the purely urban ones, involving not only town planning but also social and managerial implications. In this sense the underlying logic of this research rests on two fundamental issues: - the social awareness of earthquakes; - the inclusion of mitigation policies in the ordinary strategies for town and territory management. Three main aspects of the first point, the "social awareness of earthquakes", characterize this issue and demand consideration within a prevention policy: - the central role of risk as a social production; - the central role of local community consent; - the central role of the local community's capacity to plan. Therefore, consent, understood not only as acceptance but above all as participation in the elaboration and implementation of choices, plays a crucial role in the wider issue of prevention policies. 
As far as the second point is concerned, the inclusion of preventive mitigation policies in ordinary strategies for town and territory management demands the identification of criteria of choice and priorities of intervention and

  1. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    problems that began to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, earthquake-induced shock may be a common mechanism behind the simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that ‘… the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin, the crust is a system in which fractured zones and zones of seismic and volcanic activity interact. Darwin thus formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his work on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. Material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly cohesive materials also support his observations and ideas. In addition, we have developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory allows us to explain the most important aspects of Darwin's earthquake reports. This is achieved by simplifying the fundamental governing equations of the problems considered to strongly nonlinear wave equations. Solutions of these equations are constructed with the help of analytic and numerical techniques. The solutions can model different strongly nonlinear wave phenomena that arise in a variety of physical contexts. A comparison with relevant experimental observations is also presented.

  2. Development and use of a master health facility list: Haiti's experience during the 2010 earthquake response.

    PubMed

    Rose-Wood, Alyson; Heard, Nathan; Thermidor, Roody; Chan, Jessica; Joseph, Fanor; Lerebours, Gerald; Zugaldia, Antonio; Konkel, Kimberly; Edwards, Michael; Lang, Bill; Torres, Carmen-Rosa

    2014-08-01

    Master health facility lists (MHFLs) are gaining attention as a standards-based means to uniquely identify health facilities and to link facility-level data. The ability to reliably communicate information about specific health facilities can support an array of health system functions, such as routine reporting and emergency response operations. MHFLs support the alignment of donor-supported health information systems with country-owned systems. Recent World Health Organization draft guidance promotes the utility of MHFLs and outlines a process for list development and governance. Although the potential benefits of MHFLs are numerous and may seem obvious, there are few documented cases of MHFL construction and use. The international response to the 2010 Haiti earthquake provides an example of how governments, nongovernmental organizations, and others can collaborate within a framework of standards to build a more complete and accurate list of health facilities. Prior to the earthquake, the Haitian Ministry of Health (Ministère de la Santé Publique et de la Population [MSPP]) maintained a list of public-sector health facilities but lacked information on privately managed facilities. Following the earthquake, the MSPP worked with a multinational group to expand the completeness and accuracy of the list of health facilities, including information on post-quake operational status. This list later proved useful in the response to the cholera epidemic and is now incorporated into the MSPP's routine health information system. Haiti's experience demonstrates the utility of MHFL formation and use in crisis as well as in the routine function of the health information system.

  3. Development and use of a master health facility list: Haiti's experience during the 2010 earthquake response

    PubMed Central

    Rose-Wood, Alyson; Heard, Nathan; Thermidor, Roody; Chan, Jessica; Joseph, Fanor; Lerebours, Gerald; Zugaldia, Antonio; Konkel, Kimberly; Edwards, Michael; Lang, Bill; Torres, Carmen-Rosa

    2014-01-01

    Master health facility lists (MHFLs) are gaining attention as a standards-based means to uniquely identify health facilities and to link facility-level data. The ability to reliably communicate information about specific health facilities can support an array of health system functions, such as routine reporting and emergency response operations. MHFLs support the alignment of donor-supported health information systems with country-owned systems. Recent World Health Organization draft guidance promotes the utility of MHFLs and outlines a process for list development and governance. Although the potential benefits of MHFLs are numerous and may seem obvious, there are few documented cases of MHFL construction and use. The international response to the 2010 Haiti earthquake provides an example of how governments, nongovernmental organizations, and others can collaborate within a framework of standards to build a more complete and accurate list of health facilities. Prior to the earthquake, the Haitian Ministry of Health (Ministère de la Santé Publique et de la Population [MSPP]) maintained a list of public-sector health facilities but lacked information on privately managed facilities. Following the earthquake, the MSPP worked with a multinational group to expand the completeness and accuracy of the list of health facilities, including information on post-quake operational status. This list later proved useful in the response to the cholera epidemic and is now incorporated into the MSPP's routine health information system. Haiti's experience demonstrates the utility of MHFL formation and use in crisis as well as in the routine function of the health information system. PMID:25276595

  4. IR spectral analysis for the diagnostics of crust earthquake precursors

    NASA Astrophysics Data System (ADS)

    Umarkhodgaev, R. M.; Liperovsky, V. A.; Mikhailin, V. V.; Meister, C.-V.; Naumov, D. Ju

    2012-04-01

    In regions of future earthquakes, a few days before the seismic shock, the emanation of radon and hydrogen is observed, which causes clouds of increased ionisation in the atmosphere. In the present work the possible diagnostics of these clouds using infrared (IR) spectroscopy is considered, which may be important and useful for the general geophysical system of earthquake prediction and for the observation of industrial emissions of radioactive materials into the atmosphere. Some possible physical processes are analysed which cause, under the condition of additional ionisation in a pre-breakdown electrical field, emissions in the IR interval. In doing so, the transparency region of the IR spectrum at wavelengths of 7-15 μm is taken into account. This transparency region corresponds to spectral lines of minor atmospheric constituents such as CH4, CO2, N2O, NO2, NO, and O3. The possible intensities of the IR emissions observable in laboratories and in nature are estimated. The acceleration of the electrons in the pre-breakdown electrical field before their attachment to molecules is analysed. Laboratory equipment for the investigation of the IR absorption spectrum is constructed for the cases of normal and decreased atmospheric pressures. The syntheses of ozone and nitrous oxides are performed in the barrier discharge. It is studied whether the products of the syntheses may be used to model atmospheric processes in which these components take part. Spectra of the products of the syntheses in the wavelength region of 2-10 μm are observed and analysed. A device is created for the synthesis and accumulation of nitrous oxides. Experiments to observe the IR spectra of ozone and nitrous oxides during the syntheses and during the further evolution of these molecules are performed.
For practical earthquake prediction, the investigation of emission spectra is most important, but during the laboratory experiments, the radiation of the excited molecules is shifted by a

  5. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  6. "ABC's Earthquake" (Experiments and models in seismology)

    NASA Astrophysics Data System (ADS)

    Almeida, Ana

    2017-04-01

    Ana Almeida, Escola Básica e Secundária Dr. Vieira de Carvalho, Moreira da Maia, Portugal. The purpose of this presentation, in poster format, is to describe an activity planned and carried out by the author in a school in the north of Portugal, using a kit of simple, easy-to-use materials: the sismo-box. The activity "ABC's Earthquake" was developed within the discipline of Natural Sciences, with 7th-grade students, geoscience teachers, and teachers from other areas. Working with the sismo-box was seen as an exciting and promising opportunity to promote science, and seismology in particular; to do science by using the models in the box and applying the scientific method with them; to work on and consolidate content and skills in Natural Sciences; and to share these materials with classmates and with teachers from different areas. Throughout the activity, with both students and teachers, it was possible to observe admiration for the models in the earthquake box, as well as interest and enthusiasm in wanting to handle them and understand the results of the procedure proposed in the script. With this activity we managed to promote educational success in this subject; a "school culture" of active participation, with quality, rules, discipline, and citizenship values; the full integration of students with special educational needs; a strengthening of the school's role as a cultural, informational, and formative institution; up-to-date and innovative activities; and knowledge of "being and doing", contributing to a moment of joy and discovery. Learn by doing!

  7. Groundwater oxygen isotope anomaly before the M6.6 Tottori earthquake in Southwest Japan.

    PubMed

    Onda, Satoki; Sano, Yuji; Takahata, Naoto; Kagoshima, Takanori; Miyajima, Toshihiro; Shibata, Tomo; Pinti, Daniele L; Lan, Tefang; Kim, Nak Kyu; Kusakabe, Minoru; Nishio, Yoshiro

    2018-03-19

    Geochemical monitoring of groundwater in seismically active regions has been carried out since the 1970s. Precursors were well documented but were often criticized as anecdotal or fragmentary signals lacking a clear physico-chemical explanation. Here we report, as a potential seismic precursor, an oxygen isotopic ratio anomaly of +0.24‰ relative to the local background measured in groundwater a few months before the Tottori earthquake (M 6.6) in Southwest Japan. Samples were deep groundwater from 5 km west of the epicenter, bottled and distributed as drinking water between September 2015 and July 2017, a time frame covering the pre- and post-event periods. A small but significant increase of 0.07‰ was observed soon after the earthquake. Laboratory crushing experiments on the aquifer rock, aimed at simulating rock deformation under strain and tensile stresses, were carried out. The measured helium degassing from the rock and the 18O shift suggest that the co-seismic oxygen anomalies are directly related to volumetric strain changes. The findings provide a plausible physico-chemical basis for geochemical anomalies in water and may be useful in future earthquake prediction research.

  8. Earthquakes drive focused denudation along a tectonically active mountain front

    NASA Astrophysics Data System (ADS)

    Li, Gen; West, A. Joshua; Densmore, Alexander L.; Jin, Zhangdong; Zhang, Fei; Wang, Jin; Clark, Marin; Hilton, Robert G.

    2017-08-01

    Earthquakes cause widespread landslides that can increase erosional fluxes observed over years to decades. However, the impact of earthquakes on denudation over the longer timescales relevant to orogenic evolution remains elusive. Here we assess erosion associated with earthquake-triggered landslides in the Longmen Shan range at the eastern margin of the Tibetan Plateau. We use the Mw 7.9 2008 Wenchuan and Mw 6.6 2013 Lushan earthquakes to evaluate how seismicity contributes to the erosional budget from short timescales (annual to decadal, as recorded by sediment fluxes) to long timescales (kyr to Myr, from cosmogenic nuclides and low temperature thermochronology). Over this wide range of timescales, the highest rates of denudation in the Longmen Shan coincide spatially with the region of most intense landsliding during the Wenchuan earthquake. Across sixteen gauged river catchments, sediment flux-derived denudation rates following the Wenchuan earthquake are closely correlated with seismic ground motion and the associated volume of Wenchuan-triggered landslides (r2 > 0.6), and to a lesser extent with the frequency of high intensity runoff events (r2 = 0.36). To assess whether earthquake-induced landsliding can contribute importantly to denudation over longer timescales, we model the total volume of landslides triggered by earthquakes of various magnitudes over multiple earthquake cycles. We combine models that predict the volumes of landslides triggered by earthquakes, calibrated against the Wenchuan and Lushan events, with an earthquake magnitude-frequency distribution. The long-term, landslide-sustained "seismic erosion rate" is similar in magnitude to regional long-term denudation rates (∼0.5-1 mm yr-1). The similar magnitude and spatial coincidence suggest that earthquake-triggered landslides are a primary mechanism of long-term denudation in the frontal Longmen Shan. We propose that the location and intensity of seismogenic faulting can contribute to

  9. Landslides and Earthquake Lakes from the Wenchuan, China Earthquake - Can it Happen in the U.S.?

    NASA Astrophysics Data System (ADS)

    Stenner, H.; Cydzik, K.; Hamilton, D.; Cattarossi, A.; Mathieson, E.

    2008-12-01

    The May 12, 2008 M7.9 Wenchuan, China earthquake destroyed five million homes and schools, causing over 87,650 deaths. Landslides, a secondary effect of the shaking, caused much of the devastation. Debris flows buried homes, rock falls crushed cars, and landslides dammed rivers. Blocked roads greatly impeded emergency access, delaying response. Our August 2008 field experience in the affected area reminded us that the western United States faces serious risks posed by earthquake-induced landslides. The topography of the western U.S. is less extreme than that near Wenchuan, but earthquakes may still cause devastating landslides, damming rivers and blocking access to affected areas. After the Wenchuan earthquake, lakes rapidly rose behind landslide dams, threatening millions of lives. One landslide above Beichuan City created Tangjiashan Lake, a massive body of water upstream of Mianyang, an area with 5.2 million people, 30,000 of whom were killed in the quake. Potential failure of the landslide dam put thousands more people at risk from catastrophic flooding. In 1959, the M7.4 Hebgen Lake earthquake in Montana caused a large landslide, which killed 19 people and dammed the Madison River. The Army Corps excavated sluices to keep the dam from failing catastrophically. The Hebgen Lake earthquake ultimately caused 28 deaths, mostly from landslides, but the affected region was sparsely populated. Slopes prone to strong earthquake shaking and landslides in California, Washington, and Oregon have much larger populations at risk. Landslide hazards continue after the earthquake due to the effect strong shaking has on hillslopes, particularly when subjected to subsequent rain. These hazards must be taken into account. Once a landslide blocks a river, rapid and thoughtful action is needed. The Chinese government quickly and safely mitigated landslide dams that posed the greatest risk to people downstream. 
It took expert geotechnical advice, the speed and resources of the army

  10. A slow earthquake sequence on the San Andreas fault

    USGS Publications Warehouse

    Linde, A.T.; Gladwin, M.T.; Johnston, M.J.S.; Gwyther, R.L.; Bilham, R.G.

    1996-01-01

    EARTHQUAKES typically release stored strain energy on timescales of the order of seconds, limited by the velocity of sound in rock. Over the past 20 years, observations and laboratory experiments have indicated that rupture can also occur more slowly, with durations up to hours. Such events may be important in earthquake nucleation and in accounting for the excess of plate convergence over seismic slip in subduction zones. The detection of events with larger timescales requires near-field deformation measurements. In December 1992, two borehole strainmeters close to the San Andreas fault in California recorded a slow strain event of about a week in duration, and we show here that the strain changes were produced by a slow earthquake sequence (equivalent magnitude 4.8) with complexity similar to that of regular earthquakes. The largest earthquakes associated with these slow events were small (local magnitude 3.7) and contributed negligible strain release. The importance of slow earthquakes in the seismogenic process remains an open question, but these observations extend the observed timescale for slow events by two orders of magnitude.

  11. Limiting the effects of earthquakes on gravitational-wave interferometers

    USGS Publications Warehouse

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Buchanan, Christopher; Coughlin, Eric; Donovan, Fred; Fee, Jeremy; Gabbard, Hunter; Guy, Michelle; Mukund, Nikhil; Perry, Matthew

    2017-01-01

    Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system, with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low-latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake, depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. Using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation from 40 to 100 earthquake events in a 6-month time-period.
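The "90% of predictions within a factor of 5" statistic above can be reproduced by scoring each prediction against its measurement using the larger of the two ratios. The helper below is a minimal illustrative sketch, not part of the USGS/NOAA alert pipeline described in the record.

```python
def fraction_within_factor(predicted, measured, factor=5.0):
    """Fraction of (prediction, measurement) pairs that agree to within
    the given multiplicative factor, as in the 90%-within-a-factor-of-5
    statistic quoted for ground-motion amplitude predictions."""
    hits = sum(1 for p, m in zip(predicted, measured)
               if max(p / m, m / p) <= factor)
    return hits / len(predicted)
```

For example, a prediction of 10 against a measurement of 1 misses the factor-of-5 band, while 1 against 4 falls inside it.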

  12. Limiting the effects of earthquakes on gravitational-wave interferometers

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Buchanan, Christopher; Coughlin, Eric; Donovan, Fred; Fee, Jeremy; Gabbard, Hunter; Guy, Michelle; Mukund, Nikhil; Perry, Matthew

    2017-02-01

    Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system, with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low-latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake, depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. Using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation from 40 to 100 earthquake events in a 6-month time-period.

  13. Analog earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  14. Ionospheric anomalies detected by ionosonde and possibly related to crustal earthquakes in Greece

    NASA Astrophysics Data System (ADS)

    Perrone, Loredana; De Santis, Angelo; Abbattista, Cristoforo; Alfonsi, Lucilla; Amoruso, Leonardo; Carbone, Marianna; Cesaroni, Claudio; Cianchini, Gianfranco; De Franceschi, Giorgiana; De Santis, Anna; Di Giovambattista, Rita; Marchetti, Dedalo; Pavòn-Carrasco, Francisco J.; Piscini, Alessandro; Spogli, Luca; Santoro, Francesca

    2018-03-01

    Ionosonde data and crustal earthquakes with magnitude M ≥ 6.0 observed in Greece during the 2003-2015 period were examined to check whether the relationships obtained earlier between precursory ionospheric anomalies and earthquakes in Japan and central Italy are also valid for Greek earthquakes. The ionospheric anomalies are identified from the observed variations of the sporadic E-layer parameters (h'Es, foEs) and foF2 at the ionospheric station of Athens. The corresponding empirical relationships between the seismo-ionospheric disturbances and the earthquake magnitude and epicentral distance are obtained and found to be similar to those previously published for other case studies. The large lead times found for the occurrence of the ionospheric anomalies may confirm a rather long earthquake preparation period. The possibility of using the relationships obtained for earthquake prediction is finally discussed.

  15. Predictive Feedback and Conscious Visual Experience

    PubMed Central

    Panichello, Matthew F.; Cheung, Olivia S.; Bar, Moshe

    2012-01-01

    The human brain continuously generates predictions about the environment based on learned regularities in the world. These predictions actively and efficiently facilitate the interpretation of incoming sensory information. We review evidence that, as a result of this facilitation, predictions directly influence conscious experience. Specifically, we propose that predictions enable rapid generation of conscious percepts and bias the contents of awareness in situations of uncertainty. The possible neural mechanisms underlying this facilitation are discussed. PMID:23346068

  16. From Data-Sharing to Model-Sharing: SCEC and the Development of Earthquake System Science (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2009-12-01

    Earthquake system science seeks to construct system-level models of earthquake phenomena and use them to predict emergent seismic behavior—an ambitious enterprise that requires a high degree of interdisciplinary, multi-institutional collaboration. This presentation will explore model-sharing structures that have been successful in promoting earthquake system science within the Southern California Earthquake Center (SCEC). These include disciplinary working groups to aggregate data into community models; numerical-simulation working groups to investigate system-specific phenomena (process modeling) and further improve the data models (inverse modeling); and interdisciplinary working groups to synthesize predictive system-level models. SCEC has developed a cyberinfrastructure, called the Community Modeling Environment, that can distribute the community models; manage large suites of numerical simulations; vertically integrate the hardware, software, and wetware needed for system-level modeling; and promote the interactions among working groups needed for model validation and refinement. Various socio-scientific structures contribute to successful model-sharing. Two of the most important are “communities of trust” and collaborations between government and academic scientists on mission-oriented objectives. The latter include improvements of earthquake forecasts and seismic hazard models and the use of earthquake scenarios in promoting public awareness and disaster management.

  17. Modeling the behavior of an earthquake base-isolated building.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coveney, V. A.; Jamil, S.; Johnson, D. E.

    1997-11-26

    Protecting a structure against earthquake excitation by supporting it on laminated elastomeric bearings has become a widely accepted practice. The ability to perform accurate simulation of the system, including FEA of the bearings, would be desirable--especially for key installations. In this paper attempts to model the behavior of elastomeric earthquake bearings are outlined. Attention is focused on modeling highly-filled, low-modulus, high-damping elastomeric isolator systems; comparisons are made between standard triboelastic solid model predictions and test results.

  18. Citizen Seismology Provides Insights into Ground Motions and Hazard from Injection-Induced Earthquakes

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2014-12-01

    The US Geological Survey "Did You Feel It?" (DYFI) system is a highly successful example of citizen seismology. Users around the world now routinely report felt earthquakes via the Web; this information is used to determine Community Decimal Intensity values. These data can be enormously valuable for helping address a key issue that has arisen recently: quantifying the shaking/hazard associated with injection-induced earthquakes. I consider the shaking from 11 moderate (Mw3.9-5.7) earthquakes in the central and eastern United States that are believed to be induced by fluid injection. The distance decay of intensities for all events is consistent with that observed for regional tectonic earthquakes, but for all of the events intensities are lower than values predicted from an intensity prediction equation derived using data from tectonic events. I introduce an effective intensity magnitude, MIE, defined as the magnitude that on average would generate a given intensity distribution. For all 11 events, MIE is lower than the event magnitude by 0.4-1.3 units, with an average difference of 0.8 units. This suggests that stress drops of injection-induced earthquakes are lower than those of tectonic earthquakes by a factor of 2-10. However, relatively limited data suggest that intensities for epicentral distances less than 10 km are more commensurate with expectations for the event magnitude, which can be explained by the shallow focal depth of the events. The results suggest that damage from injection-induced earthquakes will be especially concentrated in the immediate epicentral region. These results further suggest a potential new discriminant for the identification of induced events. For example, while systematic analysis of California earthquakes remains to be done, DYFI data from the 2014 Mw5.1 La Habra, California, earthquake reveal no evidence for unusually low intensities, adding to a growing volume of evidence that this was a natural tectonic event.
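An effective intensity magnitude of the kind described above can be estimated by grid-searching for the magnitude whose predicted intensities best fit the observations. The sketch below assumes a simple hypothetical intensity prediction equation of the form I = a + b·M - c·log10(R); the coefficients are illustrative placeholders, not those used in the study.

```python
import numpy as np

def effective_intensity_magnitude(dists_km, intensities,
                                  trial_mags=np.arange(3.0, 7.0, 0.01)):
    """Grid-search the magnitude whose predicted intensities best fit the
    observed ones, under an assumed (hypothetical) intensity prediction
    equation I = a + b*M - c*log10(R)."""
    a, b, c = 1.0, 1.0, 3.0  # illustrative coefficients only
    return min(trial_mags,
               key=lambda m: np.sum((a + b * m - c * np.log10(dists_km)
                                     - intensities) ** 2))
```

Applied to intensities generated by a tectonic-event calibration, an induced event would return an MIE several tenths of a unit below its catalog magnitude, as reported in the record.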

  19. The ShakeOut earthquake source and ground motion simulations

    USGS Publications Warehouse

    Graves, R.W.; Houston, Douglas B.; Hudnut, K.W.

    2011-01-01

    The ShakeOut Scenario is premised upon the detailed description of a hypothetical Mw 7.8 earthquake on the southern San Andreas Fault and the associated simulated ground motions. The main features of the scenario, such as its endpoints, magnitude, and gross slip distribution, were defined through expert opinion and incorporated information from many previous studies. Slip at smaller length scales, rupture speed, and rise time were constrained using empirical relationships and experience gained from previous strong-motion modeling. Using this rupture description and a 3-D model of the crust, broadband ground motions were computed over a large region of Southern California. The largest simulated peak ground acceleration (PGA) and peak ground velocity (PGV) generally range from 0.5 to 1.0 g and 100 to 250 cm/s, respectively, with the waveforms exhibiting strong directivity and basin effects. Use of a slip-predictable model results in a high static stress drop event and produces ground motions somewhat higher than median level predictions from NGA ground motion prediction equations (GMPEs).

  20. Deep focus earthquakes in the laboratory

    NASA Astrophysics Data System (ADS)

    Schubnel, Alexandre; Brunet, Fabrice; Hilairet, Nadège; Gasc, Julien; Wang, Yanbin; Green, Harry W., II

    2014-05-01

    While the existence of deep earthquakes has been known since the 1920s, the essential mechanical process responsible for them is still poorly understood and remains one of the outstanding unsolved problems of geophysics and rock mechanics. Indeed, deep-focus earthquakes occur in an environment fundamentally different from that of shallow (<100 km) earthquakes. As pressure and temperature increase with depth, intra-crystalline plasticity starts to dominate the deformation regime, so that rocks yield by plastic flow rather than by brittle fracturing. Olivine phase transitions have provided an attractive alternative mechanism for deep-focus earthquakes: the Earth's mantle transition zone (410-700 km) is the locus of two successive polymorphic transitions of olivine. Such a scenario, however, runs into the conceptual barrier of initiating failure in a pressure (P) and temperature (T) regime where deviatoric stress relaxation is expected to be achieved through plastic flow. Here, we performed laboratory deformation experiments on germanium olivine (Mg2GeO4) under differential stress at high pressure (P = 2-5 GPa) and within a narrow temperature range (T = 1000-1250 K). We find that fractures nucleate at the onset of the olivine-to-spinel transition. These fractures propagate dynamically (i.e., at a non-negligible fraction of the shear wave velocity), so that intense acoustic emissions are generated. Similar to deep-focus earthquakes, these acoustic emissions arise from pure shear sources and obey the Gutenberg-Richter law without following Omori's law. Microstructural observations prove that dynamic weakening likely involves superplasticity of the nanocrystalline spinel reaction product at seismic strain rates. Although in our experiments the absolute stress value remains high compared to stresses expected within the cold core of subducted slabs, the observed stress drops are broadly consistent with those calculated for deep earthquakes.
Constant differential

  1. Global observation of Omori-law decay in the rate of triggered earthquakes

    NASA Astrophysics Data System (ADS)

    Parsons, T.

    2001-12-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 events in El Salvador. In this study, earthquakes with M greater than 7.0 from the Harvard CMT catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near the main shocks are associated with calculated shear stress increases, while ~39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, triggered earthquakes obey an Omori-law rate decay that lasts ~7-11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main-shock centroid. Earthquakes triggered by smaller quakes (foreshocks) also obey Omori's law, which is one of the few time-predictable patterns evident in the global occurrence of earthquakes. These observations indicate that earthquake probability calculations which include interactions from previous shocks should incorporate a transient Omori-law decay with time. In addition, a very simple model using the observed global rate change with time and spatial distribution of triggered earthquakes can be applied to immediately assess the likelihood of triggered earthquakes following large events, and can remain in place until more sophisticated analyses are conducted.
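    The Omori-law rate decay described in this record can be sketched numerically. The parameters K, c, and p below are illustrative placeholders, not values fitted to the CMT catalog:

    ```python
    import numpy as np

    def omori_rate(t, K=100.0, c=0.1, p=1.0):
        """Modified Omori law: triggered-event rate K/(t + c)**p, t in years.

        K, c, p are illustrative values only, not fits to the CMT catalog.
        """
        return K / (t + c) ** p

    # Numerically integrate the rate curve over the ~7-11 year decay window
    # and ask what fraction of the total triggered activity has occurred
    # within the first 7 years.
    t = np.linspace(0.0, 11.0, 100001)
    rate = omori_rate(t)
    cum = np.cumsum(rate) * (t[1] - t[0])          # crude running integral
    frac_by_7yr = cum[np.searchsorted(t, 7.0)] / cum[-1]
    print(round(frac_by_7yr, 3))
    ```

    With these placeholder parameters (p = 1), about 90% of the integrated triggered activity falls within the first 7 years, consistent with a decay that has largely run its course on the ~7-11 year timescale the study reports.
    
    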

  2. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  3. On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake

    USGS Publications Warehouse

    Thomas, Jeremy N.; Love, Jeffrey J.; Komjathy, Attila; Verkhoglyadova, Olga P.; Butala, Mark; Rivera, Nicholas

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identify as anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  4. Earthquake source properties from pseudotachylite

    USGS Publications Warehouse

    Beeler, Nicholas M.; Di Toro, Giulio; Nielsen, Stefan

    2016-01-01

    The motions radiated from an earthquake contain information that can be interpreted as displacements within the source and therefore related to stress drop. Except in a few notable cases, the source displacements cannot easily be related to the absolute stress level or fault strength, nor attributed to a particular physical mechanism. In contrast, paleo-earthquakes recorded by exhumed pseudotachylite have a known dynamic mechanism whose properties constrain the co-seismic fault strength. Pseudotachylite can also be used to directly address a longstanding discrepancy between seismologically measured static stress drops, which are typically a few MPa, and much larger dynamic stress drops expected from thermal weakening during localized slip at seismic speeds in crystalline rock [Sibson, 1973; McKenzie and Brune, 1969; Lachenbruch, 1980; Mase and Smith, 1986; Rice, 2006], as have been observed recently in laboratory experiments at high slip rates [Di Toro et al., 2006a]. This note places pseudotachylite-derived estimates of fault strength and inferred stress levels within the context and broader bounds of naturally observed earthquake source parameters: apparent stress, stress drop, and overshoot, including consideration of roughness of the fault surface, off-fault damage, fracture energy, and the 'strength excess'. The analysis, which assumes stress drop is related to corner frequency by the Madariaga [1976] source model, is restricted to the intermediate sized earthquakes of the Gole Larghe fault zone in the Italian Alps where the dynamic shear strength is well-constrained by field and laboratory measurements. We find that radiated energy exceeds the shear-generated heat and that the maximum strength excess is ~16 MPa. 
More generally, these events have inferred earthquake source parameters that are rare; for instance, only a few percent of the global earthquake population has stress drops as large, unless fracture energy is routinely greater than existing models allow

  5. U.S. Geological Survey (USGS) Earthquake Web Applications

    NASA Astrophysics Data System (ADS)

    Fee, J.; Martinez, E.

    2015-12-01

    USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, either near-real time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open source projects on GitHub, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
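    The composite catalog described above is exposed through the ANSS ComCat web service. As a sketch, the snippet below builds a standard fdsnws-event query URL and parses a trimmed-down GeoJSON response; the sample record is made up for illustration, and real responses carry many more fields per event:

    ```python
    import json
    from urllib.parse import urlencode

    # Real endpoint of the ComCat web service; the parameters below
    # (format, minmagnitude, starttime, endtime) are standard
    # fdsnws-event query parameters.
    COMCAT = "https://earthquake.usgs.gov/fdsnws/event/1/query"

    def comcat_url(min_magnitude, starttime, endtime):
        params = {"format": "geojson", "minmagnitude": min_magnitude,
                  "starttime": starttime, "endtime": endtime}
        return COMCAT + "?" + urlencode(params)

    # A trimmed-down GeoJSON feature collection in the shape the feeds
    # return; this single hypothetical event stands in for a real response.
    sample = json.loads("""{"type": "FeatureCollection", "features": [
      {"type": "Feature", "properties": {"mag": 5.8, "place": "offshore"},
       "geometry": {"type": "Point", "coordinates": [142.4, 38.3, 29.0]}}]}""")

    mags = [f["properties"]["mag"] for f in sample["features"]]
    print(comcat_url(5.0, "2015-01-01", "2015-12-31"))
    print(mags)
    ```

    Fetching the generated URL with any HTTP client returns the same GeoJSON structure for real events.
    
    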

  6. Izmit, Turkey 1999 Earthquake Interferogram

    NASA Image and Video Library

    2001-03-30

    This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul. http://photojournal.jpl.nasa.gov/catalog/PIA00557

  7. Limiting the Effects of Earthquake Shaking on Gravitational-Wave Interferometers

    NASA Astrophysics Data System (ADS)

    Perry, M. R.; Earle, P. S.; Guy, M. R.; Harms, J.; Coughlin, M.; Biscans, S.; Buchanan, C.; Coughlin, E.; Fee, J.; Mukund, N.

    2016-12-01

    Second-generation ground-based gravitational-wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to high-amplitude waves from teleseismic events, which can cause the detectors to fall out of mechanical lock (lockloss). This renders the data useless for gravitational-wave detection around the time of the seismic arrivals and for several hours thereafter, while the detector stabilizes enough to return to the locked state. The down time can be reduced if advance warning of impending shaking is received and its impact is suppressed in the isolation system, with the goal of maintaining lock even at the expense of increased instrumental noise. Here we describe an early warning system for modern gravitational-wave observatories. The system relies on near-real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Hypocenter and magnitude information is typically available within 5 to 20 minutes of the origin time of significant earthquakes, generally before the arrival of high-amplitude waves from these teleseisms at LIGO. These alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 94% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival-time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal, with about 90% of the events falling within a factor of 2 of the final predicted value. Using a machine learning algorithm, we develop a lockloss prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent lockloss for 40-100 earthquake events in a 6-month time period.

  8. Development of an Earthquake Impact Scale

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Marano, K. D.; Jaiswal, K. S.

    2009-12-01

    With the advent of the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, domestic (U.S.) and international earthquake responders are reconsidering their automatic alert and activation levels as well as their response procedures. To help facilitate rapid and proportionate earthquake response, we propose and describe an Earthquake Impact Scale (EIS) founded on two alerting criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is more appropriate for most global events. Simple thresholds, derived from the systematic analysis of past earthquake impact and response levels, turn out to be quite effective in communicating predicted impact and response level of an event, characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses exceeding $1M, $10M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness dominate in countries where vernacular building practices typically lend themselves to high collapse and casualty rates, and it is these impacts that set prioritization for international response. In contrast, it is often financial and overall societal impacts that trigger the level of response in regions or countries where prevalent earthquake resistant construction practices greatly reduce building collapse and associated fatalities. Any newly devised alert protocols, whether financial or casualty based, must be intuitive and consistent with established lexicons and procedures. 
In this analysis, we make an attempt
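    The dual alerting criteria quoted in this record (fatality thresholds of 1/100/1000 and loss thresholds of $1M/$10M/$1B) can be expressed as a small classifier. This is a sketch of the thresholds as stated in the abstract, not PAGER code; the function name, interface, and the rule of taking the more severe of the two levels are assumptions of the sketch:

    ```python
    def eis_alert(fatalities=None, loss_usd=None):
        """Alert color from the EIS thresholds quoted in the abstract.

        Fatality cut-offs: 1 / 100 / 1000 (yellow / orange / red);
        loss cut-offs: $1M / $10M / $1B. Taking the more severe of the
        two levels when both inputs are given is an assumption here,
        not part of the actual PAGER system.
        """
        ladder = ["green", "yellow", "orange", "red"]

        def level(value, cuts):
            # Count how many thresholds the value meets or exceeds.
            return ladder[sum(value >= c for c in cuts)]

        candidates = ["green"]
        if fatalities is not None:
            candidates.append(level(fatalities, (1, 100, 1000)))
        if loss_usd is not None:
            candidates.append(level(loss_usd, (1e6, 1e7, 1e9)))
        return max(candidates, key=ladder.index)

    print(eis_alert(fatalities=150))   # orange by the fatality criterion
    print(eis_alert(loss_usd=2e9))     # red by the loss criterion
    ```
    
    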

  9. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    NASA Astrophysics Data System (ADS)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with the tremor rate increasing exponentially with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile, and the 2011 Tohoku-oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. This suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
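    The link between the b-value and the relative abundance of large events can be illustrated with the standard Aki (1965) maximum-likelihood estimator. The synthetic catalogs below are purely illustrative, not the catalogues used in the study:

    ```python
    import numpy as np

    def b_value(mags, m_c):
        """Aki (1965) maximum-likelihood b-value for magnitudes >= m_c."""
        m = np.asarray(mags)
        m = m[m >= m_c]
        return np.log10(np.e) / (m.mean() - m_c)

    # Synthetic Gutenberg-Richter catalogs: above completeness m_c,
    # magnitudes are exponentially distributed with scale log10(e)/b.
    rng = np.random.default_rng(0)

    def synth(b, n=50000, m_c=2.0):
        return m_c + rng.exponential(scale=np.log10(np.e) / b, size=n)

    # A lower b-value means a larger fraction of big events, the pattern
    # the abstract reports for periods of high tidal shear stress.
    print(b_value(synth(1.0), 2.0), b_value(synth(0.8), 2.0))
    ```
    
    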

  10. Analysis of the Earthquake Impact towards water-based fire extinguishing system

    NASA Astrophysics Data System (ADS)

    Lee, J.; Hur, M.; Lee, K.

    2015-09-01

    Recently, fire-extinguishing systems installed in buildings have become subject to separate seismic performance requirements: they must keep functioning during an earthquake, before any collapse of the building itself. In particular, automatic sprinkler systems must keep their piping watertight even after a large earthquake. In this study, we carried out shaking-table experiments to determine the earthquake response of the piping of a water-based fire-extinguishing system installed in a building. Test structures were assembled step by step according to seismic construction practice for such systems and then subjected to simulated earthquakes, and the earthquake response of the extinguishing-system piping was measured. The acceleration and displacement of the piping caused by the applied vibration were measured and compared with the response data from the shaking table, in order to analyze where the piping needs seismic strengthening. Seismic design categories (SDC) were defined for four groups of building structures designed to the seismic criteria (KBC2009), according to the importance of the occupancy group and the earthquake seismic intensity. The seismic analysis indicates that for buildings in seismic design categories A and B, current fire-fighting facilities already provide the required seismic performance in a real earthquake, whereas for buildings in seismic design categories C and D, seismic retrofit design at the required level is needed to preserve the extinguishing function.

  11. Earthquake Safety Tips in the Classroom

    NASA Astrophysics Data System (ADS)

    Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M.

    2014-12-01

    The catastrophes induced by earthquakes are among the most devastating, causing a high number of human losses and economic damages. But we have to keep in mind that earthquakes don't kill people; buildings do. Earthquakes can't be predicted, and the only way of dealing with their effects is to teach society how to be prepared for them and how to deal with their consequences. In spite of being exposed to moderate and large earthquakes, most of the Portuguese are little aware of seismic risk, mainly due to the long recurrence intervals between strong events. The acquisition of safe and correct attitudes before, during and after an earthquake is relevant for human security. Children play a determinant role in the establishment of a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children adopt correct behaviours, their relatives often change their own incorrect behaviours to mimic those of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5-6 years old and 9-10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start with a short introduction to the Earth and to earthquakes through storytelling, using simple science activities to trigger children's curiosity. For safety purposes, we focus on how crucial it is for them to know basic information about themselves and to define, with their families, an emergency communications plan in case family members are separated. Using a shaking table, we teach them how to protect themselves during an earthquake. We then finish with the preparation of an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in

  12. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  13. Relocation of earthquakes at southwestern Indian Ocean Ridge and its tectonic significance

    NASA Astrophysics Data System (ADS)

    Luo, W.; Zhao, M.; Haridhi, H.; Lee, C. S.; Qiu, X.; Zhang, J.

    2015-12-01

    The southwest Indian Ridge (SWIR) is a typical ultraslow-spreading ridge (Dick et al., 2003) and a plate boundary where earthquakes occur frequently. Because of the lack of seismic stations on the SWIR, the locations of earthquakes and micro-earthquakes there are not accurate. An Ocean Bottom Seismometer (OBS) experiment was carried out for the first time at 49°39'E on the SWIR from January to March 2010 (Zhao et al., 2013). The deployed OBSs also recorded earthquake waveforms during the experiment. Two earthquakes occurred on 7 and 9 February 2010, both with magnitude 4.4 mb. These two earthquakes were relocated using the HYPOSAT software, based on spectral analysis, band-pass (3-5 Hz) filtering, and picking of the Pn and Sn travel times. The hypocentral determinations show that the location error is decreased significantly when the OBS recordings are included. This study not only provides experience for the next step of deploying long-term broadband OBSs, but also deepens understanding of the structure of the SWIR and helps clarify the nature of plate tectonic motion. This research was supported by the Natural Science Foundation of China (41176053, 91028002, 91428204). Keywords: southwest Indian Ridge (SWIR), relocation of earthquakes, Ocean Bottom Seismometers (OBS), HYPOSAT. References: [1] Dick, H. J. B., Lin, J., Schouten, H., 2003. An ultraslow-spreading class of ocean ridge. Nature, 426(6965): 405-412. [2] Zhao, M. H., et al., 2013. Three-dimensional seismic structure of the Dragon Flag oceanic core complex at the ultraslow spreading Southwest Indian Ridge (49°39'E). Geochemistry Geophysics Geosystems, 14(10): 4544-4563.
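    The travel-time principle behind such a relocation can be illustrated with a toy S-P time grid search. This is not HYPOSAT: the velocities, station geometry, and noise-free setup below are all illustrative assumptions:

    ```python
    import numpy as np

    # Toy epicentral location by grid search over S-P times. Rough Pn/Sn
    # velocities in km/s; a real relocation uses a layered velocity model.
    VP, VS = 8.0, 4.6

    stations = np.array([[0.0, 0.0], [120.0, 0.0], [0.0, 150.0]])  # km
    true_epi = np.array([60.0, 80.0])                              # hypothetical event
    d = np.linalg.norm(stations - true_epi, axis=1)
    sp_obs = d / VS - d / VP   # "observed" S-P times, noise-free for clarity

    # Exhaustive 1-km grid search for the point whose predicted S-P times
    # best match the observations (least squares misfit).
    xs = ys = np.arange(0.0, 200.0, 1.0)
    best, best_mis = None, np.inf
    for x in xs:
        for y in ys:
            dd = np.linalg.norm(stations - np.array([x, y]), axis=1)
            mis = np.sum((dd / VS - dd / VP - sp_obs) ** 2)
            if mis < best_mis:
                best, best_mis = (x, y), mis
    print(best)  # recovers (60.0, 80.0)
    ```

    With three non-collinear stations and noise-free S-P times, the misfit is zero only at the true epicenter, which is why adding the OBS stations (improving the geometry) shrinks the location error.
    
    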

  14. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures have unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a "physics-based" approach is ground-motion syntheses derived from physics and an understanding of the earthquake process. This is an overview paper and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. 
Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description rather than prediction

  15. Earthquake clustering in modern seismicity and its relationship with strong historical earthquakes around Beijing, China

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Main, Ian G.; Musson, Roger M. W.

    2017-11-01

    Beijing, China's capital city, is located in a typical intraplate seismic belt, with relatively high-quality instrumental catalogue data available since 1970. The Chinese historical earthquake catalogue contains six strong historical earthquakes of Ms ≥ 6 around Beijing, the earliest in 294 AD. This poses a significant potential hazard to one of the most densely populated and economically active parts of China. In some intraplate areas, persistent clusters of events associated with historical events can occur over centuries, for example, the ongoing sequence in the New Madrid zone of the eastern US. Here we will examine the evidence for such persistent clusters around Beijing. We introduce a metric known as the `seismic density index' that quantifies the degree of clustering of seismic energy release. For a given map location, this multi-dimensional index depends on the number of events, their magnitudes, and the distances to the locations of the surrounding population of earthquakes. We apply the index to modern instrumental catalogue data between 1970 and 2014, and identify six clear candidate zones. We then compare these locations to earthquake epicentre and seismic intensity data for the six largest historical earthquakes. Each candidate zone contains one of the six historical events, and the location of peak intensity is within 5 km or so of the reported epicentre in five of these cases. In one case, the great Ms 8 earthquake of 1679, the peak is closer to the area of strongest shaking (Intensity XI or more) than the reported epicentre. The present-day event rates are similar to those predicted by the modified Omori law, but there is no evidence of ongoing decay in event rates. Accordingly, the index is more likely to be picking out the location of persistent weaknesses in the lithosphere. Our results imply that zones of high seismic density index could be used in principle to indicate the location of unrecorded historical or palaeoseismic events, in China and

  16. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes

    NASA Astrophysics Data System (ADS)

    Williams, Jack G.; Rosser, Nick J.; Kincey, Mark E.; Benjamin, Jessica; Oven, Katie J.; Densmore, Alexander L.; Milledge, David G.; Robinson, Tom R.; Jordan, Colm A.; Dijkstra, Tom A.

    2018-01-01

    Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days) humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM) can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  17. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated at M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough using a probabilistic approach, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSA) that ERC (2013) defined. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom-friction terms by a finite-difference method, including run-up computation on land. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with T and alpha = 0.24 as its aperiodicity. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the sub-group. Note that this re-distribution of the probability is only tentative, because present seismology cannot yet constrain it well; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coasts by integrating the information about 30 years occurrence
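    Step (3) of the record above, a conditional 30-year probability under a BPT renewal model, can be sketched by numerical integration. The mean T = 88.2 yr and aperiodicity alpha = 0.24 are from the abstract, but the elapsed time of 66 years used below is a hypothetical value for illustration only (the abstract states P30 = 67% rather than the elapsed time):

    ```python
    import numpy as np

    def bpt_pdf(t, mu, alpha):
        """Brownian Passage Time density with mean mu and aperiodicity alpha."""
        return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
            np.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

    def cond_prob(elapsed, window, mu, alpha, dt=0.001):
        """P(event within `window` yr | quiet for `elapsed` yr), by
        numerical integration of the BPT density (renewal-process hazard)."""
        t = np.arange(dt, elapsed + window + dt, dt)
        cdf = np.cumsum(bpt_pdf(t, mu, alpha)) * dt   # running integral
        F_elapsed = cdf[np.searchsorted(t, elapsed)]
        return (cdf[-1] - F_elapsed) / (1.0 - F_elapsed)

    # mu and alpha from the abstract; elapsed = 66 yr is hypothetical.
    print(round(cond_prob(66.0, 30.0, 88.2, 0.24), 3))
    ```

    As expected for a renewal model with alpha < 1, the conditional probability rises as the elapsed quiet time grows toward and beyond the mean recurrence interval.
    
    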

  18. A Science Teacher Experience in the Sumatra Earthquake and Tsunami Offshore Survey Expedition of May 2005

    NASA Astrophysics Data System (ADS)

    Moran, K.; Holt, S.; Grilli, S.

    2005-12-01

    Through the NSF-funded ARMADA Project, K-12 teachers can participate in scientific expeditions to gain a first-hand, and usually exciting, research experience. ARMADA Master Teachers translate this research opportunity, which includes data collection and experimentation, into methodology and technology for use in their classrooms. Their experiences have broader impact because each teacher mentors other teachers in their school district and presents at the National Science Teachers Association Annual Convention, sharing the knowledge with an even broader educational audience. A science teacher, Susan Holt (from Arcadia High School in Phoenix, Arizona), participated as part of an international scientific party on a recent cruise to study the seafloor in the area of the December 26th Great Sumatra earthquake and tsunami: the Sumatra Earthquake And Tsunami Offshore Survey (SEATOS). She participated in all aspects of the expedition: geophysical surveys, Remotely Operated Vehicle (ROV) "watch", sample preparation and recovery, science planning and review meetings, and interaction with the expert ship's crew. Susan posted reports regularly on a website and prepared a daily log that was useful not only for her students, but also for other teachers in the Scottsdale Unified School District in Arizona and the Montgomery County School District in Tennessee, science team members' families, friends, and the local press. Overall, the experience benefited all parties: the teacher, by learning and experiencing a shipboard geophysical operation; the scientists, through Susan's fresh perspective, which encouraged everyone to re-examine their first assumptions and interpretations; the SEATOS expedition, through Susan's assistance in science operations; and the shipboard environment, where she was able to break down the typical artificial barriers between the science `crew' and the ship's crew through frank and open dialogue. We present a summary of the SEATOS expedition, the

  19. Two components of postseismic gravity changes of megathrust earthquakes from satellite gravimetry

    NASA Astrophysics Data System (ADS)

    Tanaka, Y.; Heki, K.

    2013-12-01

    Several studies have reported observations of gravity changes due to megathrust earthquakes using data from the Gravity Recovery And Climate Experiment (GRACE) satellites. We analyzed the co- and postseismic gravity changes of three magnitude-9-class earthquakes, the 2004 Sumatra-Andaman, the 2010 Chile (Maule), and the 2011 Tohoku-Oki earthquakes, using the newly released data set (Release 05). In addition to the coseismic steps, these earthquakes showed a common feature: the postseismic changes include two components with different polarity and time constants, i.e., rapid decreases over a few months, followed by slow increases lasting for years. This is shown in the auxiliary figure of this abstract, in which the white circles are the data with seasonal and secular changes removed, and the vertical translucent lines denote the earthquake occurrence times. All three earthquakes suggest the existence of two postseismic gravity-change components with distinct time constants. The first (short-term) component showed a geographical distribution similar to the coseismic changes, but the position of the largest gravity decrease shifted toward the trench. The short-term components can be related to afterslip, but their time constants and distributions deviate significantly from the gravity changes predicted by afterslip models. The second (long-term) components are characterized by positive gravity changes peaking close to the trench axis. The long-term components should be attributed to different or multiple mechanisms, e.g., viscous relaxation of rocks in the upper mantle [Han and Simons, 2008; Panet et al., 2007] and diffusion of supercritical water around the down-dip end of the ruptured fault [Ogawa and Heki, 2007]. Both mechanisms can explain the postseismic gravity increase on this timescale to some extent, but there has been no decisive evidence to prove or disprove either one. But generally speaking
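
    The two-component behavior described above can be captured by a simple phenomenological model: a coseismic step, a fast-decaying negative term, and a slowly growing positive term. The functional form and the parameter values below are illustrative assumptions, not fits from the study.

```python
import math

def postseismic_gravity(t, g_co, a_short, tau_short, a_long, tau_long):
    """Gravity change t years after the earthquake: coseismic step g_co,
    short-term decrease of amplitude a_short with time constant tau_short,
    and long-term increase of amplitude a_long with time constant tau_long."""
    return (g_co
            - a_short * (1.0 - math.exp(-t / tau_short))
            + a_long * (1.0 - math.exp(-t / tau_long)))
```

    With tau_short of a few months and tau_long of years, the curve reproduces the reported shape: an initial dip below the coseismic level followed by a slow recovery above it.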

  20. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  1. Source time function properties indicate a strain drop independent of earthquake depth and magnitude.

    PubMed

    Vallée, Martin

    2013-01-01

    The movement of tectonic plates leads to strain build-up in the Earth, which can be released during earthquakes when one side of a seismic fault suddenly slips with respect to the other. The amount of seismic strain release (or 'strain drop') is thus a direct measurement of a basic earthquake property, that is, the ratio of seismic slip over the dimension of the ruptured fault. Here the analysis of a new global catalogue, containing ~1,700 earthquakes with magnitude larger than 6, suggests that strain drop is independent of earthquake depth and magnitude. This invariance implies that deep earthquakes are even more similar to their shallow counterparts than previously thought, a puzzling finding as shallow and deep earthquakes are believed to originate from different physical mechanisms. More practically, this property contributes to our ability to predict the damaging waves generated by future earthquakes.
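
    The quoted strain drop, the ratio of seismic slip to rupture dimension, can be illustrated for an assumed circular rupture; the shear modulus and rupture radius below are illustrative values, not catalog measurements.

```python
import math

MU = 3.0e10  # shear modulus in Pa (assumed typical crustal value)

def moment_from_mw(mw):
    """Seismic moment in N*m from moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.1)

def strain_drop(mw, radius_m):
    """Strain drop ~ slip / rupture dimension for a circular rupture of
    the given radius, using M0 = mu * area * slip."""
    slip = moment_from_mw(mw) / (MU * math.pi * radius_m ** 2)
    return slip / (2.0 * radius_m)
```

    The invariance reported in the abstract corresponds to this ratio staying near a constant value across depth and magnitude, which requires the rupture dimension to scale with the cube root of seismic moment.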

  2. Sensitivity analysis of tall buildings in Semarang, Indonesia due to fault earthquakes with maximum 7 Mw

    NASA Astrophysics Data System (ADS)

    Partono, Windu; Pardoyo, Bambang; Atmanto, Indrastono Dwi; Azizah, Lisa; Chintami, Rouli Dian

    2017-11-01

    Faults are among the most dangerous earthquake sources and can cause building failure. Many buildings collapsed in the Yogyakarta (2006) and Pidie (2016) fault-source earthquakes, which had maximum magnitudes of 6.4 Mw. Following the research conducted by the Team for Revision of Seismic Hazard Maps of Indonesia in 2010 and 2016, the Lasem, Demak, and Semarang faults are the three earthquake sources closest to Semarang. The ground motion from these three sources should be taken into account in structural design and evaluation. Most tall buildings in Semarang, at least 40 meters high, were designed and constructed following the 2002 and 2012 Indonesian Seismic Codes. This paper presents the results of a sensitivity analysis emphasizing the prediction of deformation and inter-story drift of existing tall buildings within the city under fault earthquakes. The analysis was performed by conducting dynamic structural analyses of 8 (eight) tall buildings using modified acceleration time histories, calculated for the three fault earthquakes with magnitudes from 6 Mw to 7 Mw; modified time histories were used because recorded time-history data for these three faults are inadequate. The sensitivity of a building to an earthquake can be assessed by comparing the surface response spectrum calculated using the seismic code with the surface response spectrum calculated from the acceleration time histories of a specific earthquake event. If the code-based surface response spectrum exceeds the event-based spectrum, the structure should be stable enough to resist the earthquake force.
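
    The two quantities the analysis tracks, inter-story drift and the code-versus-event spectrum comparison, reduce to a few lines. This is a hedged sketch; the function names and inputs are hypothetical, not from the study.

```python
def interstory_drift_ratios(floor_disp, story_heights):
    """Inter-story drift ratio per story from peak floor displacements
    (ground level first), e.g. from a dynamic time-history analysis."""
    return [(floor_disp[i + 1] - floor_disp[i]) / story_heights[i]
            for i in range(len(story_heights))]

def code_spectrum_envelopes(sa_code, sa_event):
    """Stability check from the abstract: the code (design) response
    spectrum is at or above the event-derived spectrum at every period."""
    return all(c >= e for c, e in zip(sa_code, sa_event))
```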

  3. Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education in Turkey Part3

    NASA Astrophysics Data System (ADS)

    Kaneda, Yoshiyuki; Ozener, Haluk; Meral Ozel, Nurcan; Kalafat, Dogan; Ozgur Citak, Seckin; Takahashi, Narumi; Hori, Takane; Hori, Muneo; Sakamoto, Mayumi; Pinar, Ali; Oguz Ozel, Asim; Cevdet Yalciner, Ahmet; Tanircan, Gulum; Demirtas, Ahmet

    2017-04-01

    There have been many destructive earthquakes and tsunamis in the world.The recent events are, 2011 East Japan Earthquake/Tsunami in Japan, 2015 Nepal Earthquake and 2016 Kumamoto Earthquake in Japan, and so on. And very recently a destructive earthquake occurred in Central Italy. In Turkey, the 1999 Izmit Earthquake as the destructive earthquake occurred along the North Anatolian Fault (NAF). The NAF crosses the Sea of Marmara and the only "seismic gap" remains beneath the Sea of Marmara. Istanbul with high population similar to Tokyo in Japan, is located around the Sea of Marmara where fatal damages expected to be generated as compound damages including Tsunami and liquefaction, when the next destructive Marmara Earthquake occurs. The seismic risk of Istanbul seems to be under the similar risk condition as Tokyo in case of Nankai Trough earthquake and metropolitan earthquake. It was considered that Japanese and Turkish researchers can share their own experiences during past damaging earthquakes and can prepare for the future large earthquakes in cooperation with each other. Therefore, in 2013 the two countries, Japan and Turkey made an agreement to start a multidisciplinary research project, MarDiM SATREPS. The Project runs researches to aim to raise the preparedness for possible large-scale earthquake and Tsunami disasters in Marmara Region and it has four research groups with the following goals. 1) The first one is Marmara Earthquake Source region observational research group. This group has 4 sub-groups such as Seismicity, Geodesy, Electromagnetics and Trench analyses. Preliminary results such as seismicity and crustal deformation on the sea floor in Sea of Marmara have already achieved. 2) The second group focuses on scenario researches of earthquake occurrence along the North Anatolia Fault and precise tsunami simulation in the Marmara region. Research results from this group are to be the model of earthquake occurrence scenario in Sea of Marmara and the

  4. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
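
    A rough felt-area map of the kind described, built from the first minute of geolocated tweets, can be sketched as a centroid plus a radius containing most of the reports. The summary statistic is an assumption for illustration; the USGS products are more sophisticated.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0
    p1, p2 = math.radians(a[0]), math.radians(b[0])
    dphi = p2 - p1
    dlmb = math.radians(b[1] - a[1])
    h = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2.0 * r * math.asin(math.sqrt(h))

def felt_area_summary(tweet_locs):
    """Centroid and 90th-percentile radius of geolocated 'earthquake'
    tweets from the first minute: a crude felt-area estimate."""
    n = len(tweet_locs)
    centroid = (sum(p[0] for p in tweet_locs) / n,
                sum(p[1] for p in tweet_locs) / n)
    dists = sorted(haversine_km(centroid, p) for p in tweet_locs)
    return centroid, dists[int(0.9 * (n - 1))]
```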

  5. Finding positives after disaster: Insights from nurses following the 2010-2011 Canterbury, NZ earthquake sequence.

    PubMed

    Johal, Sarbjit S; Mounsey, Zoe R

    2015-11-01

    This paper identifies positive aspects of nurses' experiences during the Canterbury 2010-2011 earthquake sequence and the subsequent recovery process. Qualitative semi-structured interviews were undertaken with 11 nurses from the Christchurch area to explore the challenges they faced during and following the earthquakes. The interviews took place three years after the start of the earthquake experience to enable exploration of the longer-term recovery process. The interview transcripts were analysed and coded using a grounded theory approach. The data analysis identified that, despite the many challenges faced by the nurses during and following the earthquakes, they were able to identify positives from their experience. A number of themes related to posttraumatic growth were identified, including: improvement in relationships with others, change in perspective/values, changed views of self, and acknowledgement of the value of the experience. The research indicates that nurses were able to identify positive aspects of their experiences of the earthquakes and recovery process, suggesting that both positive and negative impacts on wellbeing can co-exist. These insights have value for employers designing support processes following disasters, as focusing on positive elements could enhance nurse wellbeing during stressful times. Copyright © 2015 College of Emergency Nursing Australasia Ltd. Published by Elsevier Ltd. All rights reserved.

  6. Earthquake Signal Visible in GRACE Data

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Figure 1

    This figure shows the effect of the December 2004 great Sumatra earthquake on the Earth's gravity field as observed by GRACE. The signal is expressed in terms of the relative acceleration of the two GRACE satellites, in this case a few nanometers per second squared, or about 1 billionth of the acceleration we experience every day at the Earth's surface. GRACE observations show comparable signals in the region of the earthquake.

    Other natural variations are also apparent in the expected places, whereas no other significant change would be expected in the region of the earthquake.

    GRACE, twin satellites launched in March 2002, are making detailed measurements of Earth's gravity field which will lead to discoveries about gravity and Earth's natural systems. These discoveries could have far-reaching benefits to society and the world's population.

  7. Ground-motion modeling of the 1906 San Francisco earthquake, part I: Validation using the 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Aagaard, Brad T.; Brocher, T.M.; Dolenc, D.; Dreger, D.; Graves, R.W.; Harmsen, S.; Hartzell, S.; Larsen, S.; Zoback, M.L.

    2008-01-01

    We compute ground motions for the Beroza (1991) and Wald et al. (1991) source models of the 1989 magnitude 6.9 Loma Prieta earthquake using four different wave-propagation codes and recently developed 3D geologic and seismic velocity models. In preparation for modeling the 1906 San Francisco earthquake, we use this well-recorded earthquake to characterize how well our ground-motion simulations reproduce the observed shaking intensities and the amplitudes and durations of recorded motions throughout the San Francisco Bay Area. All of the simulations generate ground motions consistent with the large-scale spatial variations in shaking associated with rupture directivity and the geologic structure. We attribute the small variations among the synthetics to the minimum shear-wave speed permitted in the simulations and how they accommodate topography. Our long-period simulations, on average, underpredict shaking intensities by about one-half modified Mercalli intensity (MMI) unit (25%-35% in peak velocity), while our broadband simulations, on average, underpredict the shaking intensities by one-fourth MMI unit (16% in peak velocity). Discrepancies with observations arise from errors in the source models and geologic structure. The consistency in the synthetic waveforms across the wave-propagation codes for a given source model suggests the uncertainty in the source parameters tends to exceed the uncertainty in the seismic velocity structure. In agreement with earlier studies, we find that a source model with slip more evenly distributed northwest and southeast of the hypocenter would be preferable to both the Beroza and Wald source models. Although the new 3D seismic velocity model improves upon previous velocity models, we identify two areas needing improvement. Nevertheless, we find that the seismic velocity model and the wave-propagation codes are suitable for modeling the 1906 earthquake and scenario events in the San Francisco Bay Area.

  8. Conceptualizing the Abstractions of Earthquakes Through an Instructional Sequence Using SeisMac and the Rapid Earthquake Viewer

    NASA Astrophysics Data System (ADS)

    Taber, J.; Hubenthal, M.; Wysession, M.

    2007-12-01

    Newsworthy earthquakes provide an engaging hook for students in Earth science classes, particularly when discussing their effects on people and the landscape. However, engaging students in an analysis of earthquakes that extends beyond death and damage is frequently hampered by the abstraction of recorded ground-motion data in the form of raw seismograms and by the inability of most students to personally relate to ground accelerations. To overcome these challenges, an educational sequence has been developed using two software tools: SeisMac by Daniel Griscom, and the Rapid Earthquake Viewer (REV) developed by the University of South Carolina in collaboration with IRIS and DLESE. This sequence presents a unique opportunity for Earth Science teachers to "create" foundational experiences for students as they construct a framework for understanding abstract concepts. The first activity is designed to introduce the concept of a three-component seismogram and to directly address the very abstract nature of seismograms through a kinesthetic experience. Students first learn to take the pulse of their classroom through a guided exploration of SeisMac, which displays the output of the laptop's built-in Sudden Motion Sensor (a 3-component accelerometer). This exploration allows students to view a 3-component seismogram as they move or tap the laptop and encourages them to propose and carry out experiments to explain the meaning of the 3-component seismogram. Once this is completed, students are asked to apply their new knowledge to a real 3-component seismogram printed from REV. Next, the activity guides students through the process of identifying P and S waves, using SeisMac to connect the physical motion of the laptop to the "wiggles" they see on the SeisMac display, and then comparing those to the "wiggles" they see on their seismogram. At this point students are more fully prepared to engage in an S-P location exercise such as those included in many state standards.
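
    The S-P location exercise mentioned at the end rests on one relation: the S-minus-P arrival-time difference grows with distance because S waves travel more slowly than P waves. A minimal sketch, assuming uniform average crustal velocities (the values are illustrative):

```python
def sp_distance_km(sp_time_s, vp=6.0, vs=3.5):
    """Epicentral distance from the S-minus-P time difference, assuming
    uniform P and S velocities (km/s) along the whole path."""
    return sp_time_s * (vp * vs) / (vp - vs)
```

    With vp = 6.0 km/s and vs = 3.5 km/s, each second of S-P time corresponds to 8.4 km of distance; distances from three or more stations then allow triangulation of the epicenter.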

  9. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records dating back more than 1,000 years and an updated, homogenized, and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using a maximum-likelihood algorithm that accounts for the incompleteness of the catalog. To compute the hazard, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree, accounting for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years, assuming bedrock conditions. The resulting PGA and SA maps show significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
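
    The Gutenberg-Richter b value mentioned above is commonly estimated with the Aki (1965) maximum-likelihood formula. The sketch below checks it on a synthetic catalog; the catalog, completeness magnitude, and true b value are assumptions for illustration, not the study's data.

```python
import math
import random

def b_value_mle(mags, mc):
    """Aki (1965) maximum-likelihood Gutenberg-Richter b value for
    continuous magnitudes complete above mc; for a binned catalog,
    replace mc with mc - bin_width / 2 (Utsu correction)."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - mc)

# synthetic catalog: exceedances above mc = 4.0 are exponential with
# rate b * ln(10), i.e. a true b value of 1.0
random.seed(42)
mc, b_true = 4.0, 1.0
mags = [mc + random.expovariate(b_true * math.log(10)) for _ in range(10000)]
b_hat = b_value_mle(mags, mc)
```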

  10. Can an earthquake prediction and warning system be developed?

    USGS Publications Warehouse

    N.N, Ambraseys

    1990-01-01

    Over the last 20 years, natural disasters have killed nearly 3 million people and disrupted the lives of over 800 million others. In 2 years there were more than 50 serious natural disasters, including landslides in Italy, France, and Colombia; a typhoon in Korea; wildfires in China and the United States; a windstorm in England; grasshopper plagues in Africa's horn and the Sahel; tornadoes in Canada; devastating earthquakes in Soviet Armenia and Tadzhikistan; infestations in Africa; landslides in Brazil; and tornadoes in the United States.

  11. Reducing process delays for real-time earthquake parameter estimation - An application of KD tree to large databases for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Yin, Lucy; Andrews, Jennifer; Heaton, Thomas

    2018-05-01

    Earthquake parameter estimation using nearest-neighbor searches over a large database of observations can lead to reliable predictions. However, in the real-time application of Earthquake Early Warning (EEW) systems, the accuracy gained from a large database is penalized by a significant delay in processing time. We propose using a multidimensional binary search tree (KD tree) data structure to organize large seismic databases and reduce the processing time of nearest-neighbor searches for prediction. We evaluated the performance of the KD tree on the Gutenberg Algorithm, a database-searching algorithm for EEW. We constructed an offline test to predict peak ground motions using a database with feature sets of waveform filter-bank characteristics, and compared the results with the observed seismic parameters. We concluded that a large database provides more accurate predictions of ground-motion parameters, such as peak ground acceleration, velocity, and displacement (PGA, PGV, PGD), than of source parameters, such as hypocentral distance. Using the KD tree to organize the database reduced the average search time by 85% relative to the exhaustive method, making the approach feasible for real-time implementation. The algorithm is straightforward, and the results will reduce the overall warning-delivery time for EEW.
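
    A minimal KD tree illustrating the speedup described: build the tree once, then answer nearest-neighbor queries while pruning subtrees that cannot contain a closer match. This is a generic sketch, not the Gutenberg Algorithm's implementation; real EEW feature vectors would replace the random points used in testing.

```python
import math

def build_kdtree(points, depth=0):
    """Build a KD tree from (vector, payload) pairs by recursively
    splitting on the median along a cycling axis."""
    if not points:
        return None
    k = len(points[0][0])
    axis = depth % k
    points = sorted(points, key=lambda p: p[0][axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(node, target, best=None):
    """Return (distance, (vector, payload)) of the nearest stored point."""
    if node is None:
        return best
    vec, _ = node["point"]
    d = math.dist(vec, target)
    if best is None or d < best[0]:
        best = (d, node["point"])
    axis = node["axis"]
    diff = target[axis] - vec[axis]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, target, best)
    # Descend the far branch only if the splitting plane is closer than
    # the current best distance -- the pruning that gives the speedup.
    if abs(diff) < best[0]:
        best = nearest(far, target, best)
    return best
```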

  12. Earthquake experience interference effects in a modified Stroop task: an ERP study.

    PubMed

    Wei, Dongtao; Qiu, Jiang; Tu, Shen; Tian, Fang; Su, Yanhua; Luo, Yuejia

    2010-05-03

    The effects of a modified Stroop task on ERPs were investigated in 20 subjects who had experienced the Sichuan earthquake and in a matched control group. ERP data showed that incongruent stimuli elicited a more negative ERP deflection (N300-450) than congruent stimuli between 300 and 450 ms post-stimulus in the earthquake group, but not in the control group; the N300-450 might reflect conflict monitoring (the color and meaning information do not match) in the early phase of perceptual identification, due to sensitivity to the external stimulus. Incongruent stimuli then elicited a more negative ERP deflection than congruent stimuli between 450 and 650 ms post-stimulus in both groups. Dipole source analysis showed that in the control group the N450-650 was mainly generated in the ACC, which might be related to conflict monitoring and resolution. In the earthquake group, however, the N450-650 was generated in the thalamus, which might be involved in inhibiting and compensating for the ACC during the conflict-resolution process. 2010 Elsevier Ireland Ltd. All rights reserved.

  13. Faith after an Earthquake: A Longitudinal Study of Religion and Perceived Health before and after the 2011 Christchurch New Zealand Earthquake

    PubMed Central

    Sibley, Chris G.; Bulbulia, Joseph

    2012-01-01

    On 22 February 2011, Christchurch New Zealand (population 367,700) experienced a devastating earthquake, causing extensive damage and killing one hundred and eighty-five people. The earthquake and aftershocks occurred between the 2009 and 2011 waves of a longitudinal probability sample conducted in New Zealand, enabling us to examine how a natural disaster of this magnitude affected deeply held commitments and global ratings of personal health, depending on earthquake exposure. We first investigated whether the earthquake-affected were more likely to believe in God. Consistent with the Religious Comfort Hypothesis, religious faith increased among the earthquake-affected, despite an overall decline in religious faith elsewhere. This result offers the first population-level demonstration that secular people turn to religion at times of natural crisis. We then examined whether religious affiliation was associated with differences in subjective ratings of personal health. We found no evidence for superior buffering from having religious faith. Among those affected by the earthquake, however, a loss of faith was associated with significant subjective health declines. Those who lost faith elsewhere in the country did not experience similar health declines. Our findings suggest that religious conversion after a natural disaster is unlikely to improve subjective well-being, yet upholding faith might be an important step on the road to recovery. PMID:23227147

  14. Estimating earthquake magnitudes from reported intensities in the central and eastern United States

    USGS Publications Warehouse

    Boyd, Oliver; Cramer, Chris H.

    2014-01-01

    A new macroseismic intensity prediction equation is derived for the central and eastern United States and is used to estimate the magnitudes of the 1811–1812 New Madrid, Missouri, and 1886 Charleston, South Carolina, earthquakes. This work improves upon previous derivations of intensity prediction equations by including additional intensity data, correcting magnitudes in the intensity datasets to moment magnitude, and accounting for the spatial and temporal population distributions. The new relation leads to moment magnitude estimates for the New Madrid earthquakes that are toward the lower range of previous studies. Depending on the intensity dataset to which the new macroseismic intensity prediction equation is applied, mean estimates for the 16 December 1811, 23 January 1812, and 7 February 1812 mainshocks, and 16 December 1811 dawn aftershock range from 6.9 to 7.1, 6.8 to 7.1, 7.3 to 7.6, and 6.3 to 6.5, respectively. One‐sigma uncertainties on any given estimate could be as high as 0.3–0.4 magnitude units. We also estimate a magnitude of 6.9±0.3 for the 1886 Charleston, South Carolina, earthquake. We find a greater range of magnitude estimates when also accounting for multiple macroseismic intensity prediction equations. The inability to accurately and precisely ascertain magnitude from intensities increases the uncertainty of the central United States earthquake hazard by nearly a factor of two. Relative to the 2008 national seismic hazard maps, our range of possible 1811–1812 New Madrid earthquake magnitudes increases the coefficient of variation of seismic hazard estimates for Memphis, Tennessee, by 35%–42% for ground motions expected to be exceeded with a 2% probability in 50 years and by 27%–35% for ground motions expected to be exceeded with a 10% probability in 50 years.
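
    Inverting a macroseismic intensity prediction equation for magnitude, as done above, can be sketched for an assumed equation of the form I = c0 + c1*M + c2*log10(R). The coefficients below are placeholders for illustration, not the study's calibrated values.

```python
import math

def magnitude_from_intensities(observations, c0=1.5, c1=1.4, c2=-3.0):
    """Invert an assumed IPE I = c0 + c1*M + c2*log10(R) for M by
    averaging the per-site estimates; observations is a list of
    (distance_km, intensity) pairs."""
    ests = [(i - c0 - c2 * math.log10(r)) / c1 for r, i in observations]
    return sum(ests) / len(ests)
```

    In practice the uncertainty of each per-site estimate, the spatial distribution of reports, and the choice among competing IPEs dominate the error budget, which is the point the abstract makes.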

  15. Prospective Evaluation of the Global Earthquake Activity Rate Model (GEAR1) Earthquake Forecast: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Strader, Anne; Schorlemmer, Danijel; Beutin, Thomas

    2017-04-01

    The Global Earthquake Activity Rate Model (GEAR1) is a hybrid seismicity model, constructed from a loglinear combination of smoothed seismicity from the Global Centroid Moment Tensor (CMT) earthquake catalog and geodetic strain rates (Global Strain Rate Map, version 2.1). For the 2005-2012 retrospective evaluation period, GEAR1 outperformed both its parent strain-rate and smoothed-seismicity forecasts. Since 1 October 2015, GEAR1 has been prospectively evaluated by the Collaboratory for the Study of Earthquake Predictability (CSEP) testing center. Here, we present initial one-year test results for the GEAR1, GSRM, and GSRM2.1 forecasts, as well as a localized evaluation of GEAR1 performance. The models were evaluated on the consistency in number (N-test), spatial (S-test), and magnitude (M-test) distribution of forecasted and observed earthquakes, as well as overall data consistency (CL- and L-tests). Performance at target earthquake locations was compared between models using the classical paired T-test and its non-parametric equivalent, the W-test, to determine whether one model could be rejected in favor of another at the 0.05 significance level. For the evaluation period from 1 October 2015 to 1 October 2016, the GEAR1, GSRM, and GSRM2.1 forecasts pass all CSEP likelihood tests. Comparative test results show a statistically significant improvement of GEAR1 over both strain rate-based forecasts, both of which can be rejected in favor of GEAR1. Using point-process residual analysis, we investigate the spatial distribution of differences in GEAR1, GSRM, and GSRM2.1 model performance, to identify regions where the GEAR1 model should be adjusted in ways that could not be inferred from the CSEP test results. Furthermore, we investigate whether the optimal combination of smoothed seismicity and strain rates remains stable over space and time.
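
    The loglinear combination at the heart of GEAR1 is a cell-wise multiplicative blend of two normalized rate maps. The sketch below assumes a simple two-forecast blend with a fixed weight w; GEAR1's actual exponent was calibrated retrospectively, so the value here is illustrative.

```python
def loglinear_hybrid(rate_a, rate_b, w=0.6):
    """Cell-wise loglinear (multiplicative) blend of two earthquake-rate
    maps, renormalized so the blended rates sum to 1."""
    blended = [(a ** w) * (b ** (1.0 - w)) for a, b in zip(rate_a, rate_b)]
    total = sum(blended)
    return [x / total for x in blended]
```

    Because the blend is multiplicative, a cell scores highly only where both parents assign non-negligible rate, which is one intuition for why the hybrid can beat each parent alone.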

  16. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
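    The STA/LTA trigger described above can be sketched on a per-minute tweet-count series: flag any minute where the short-term average jumps well above the long-term background. The window lengths and threshold below are illustrative placeholders, not the USGS settings:

    ```python
    def sta_lta_detect(counts, sta_len=2, lta_len=20, threshold=5.0):
        """Return indices of `counts` (tweets per minute) where the short-term
        average over the last sta_len samples exceeds `threshold` times the
        long-term average over the last lta_len samples."""
        triggers = []
        for i in range(lta_len, len(counts)):
            sta = sum(counts[i - sta_len:i]) / sta_len
            lta = sum(counts[i - lta_len:i]) / lta_len
            if lta > 0 and sta / lta >= threshold:
                triggers.append(i)
        return triggers

    # A quiet baseline of ~2 tweets/min followed by a felt-event spike
    counts = [2] * 30 + [40, 60, 55] + [2] * 10
    spikes = sta_lta_detect(counts)
    ```

    In a real deployment the ratio would be computed on a stream, and the trigger held off (or de-duplicated) for several minutes after each detection.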

  17. The preparatory phase of the April 6th 2009, Mw 6.3, L’Aquila earthquake: Seismological observations

    NASA Astrophysics Data System (ADS)

    Lucente, F. P.; de Gori, P.; Margheriti, L.; Piccinini, D.; Dibona, M.; Chiarabba, C.; Piana Agostinetti, N.

    2009-12-01

    A few decades ago, the dilatancy-diffusion hypothesis held great promise as a physical basis for developing earthquake prediction techniques, but the potential never became reality, as a result of too few observations consistent with the theory. One of the main problems has been the lack of detailed monitoring records of small earthquake swarms spatio-temporally close to impending major earthquakes. In fact, the recognition of dilatancy-related effects requires a very dense network of three-component seismographs, which, in turn, implies a-priori knowledge of the major earthquake's location, i.e., a paradox. The deterministic prediction of earthquakes remains a distant, hard task to accomplish. Nevertheless, for seismologists, understanding the processes that preside over earthquake nucleation and the mechanics of faulting represents a big step toward the ability to predict earthquakes. Here we describe a set of seismological observations of the foreshock sequence that preceded the April 6th 2009, Mw 6.3, L’Aquila earthquake. On this occasion, the dense configuration of the seismic network in the area gave us a unique opportunity for a detailed reconstruction of the preparatory phase of the main shock. We show that measurable precursory effects, such as changes in seismic wave velocity and in the anisotropic parameters of the crust, occurred before the main shock. From our observations we infer that fluids play a key role in the fault failure process and, most significantly, that the elastic properties of the rock volume surrounding the main shock nucleation area underwent a dramatic change about a week before the main shock occurrence.

  18. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.

  19. Rumours about the Po Valley earthquakes of 20th and 29th May 2012

    NASA Astrophysics Data System (ADS)

    La Longa, Federica; Crescimbene, Massimo; Camassi, Romano; Nostro, Concetta

    2013-04-01

    The history of rumours is as old as human history. Even in remote antiquity, rumours, gossip and hoaxes were always in circulation - in good or bad faith - to influence human affairs. Today, with the development of mass media and the rise of the internet and social networks, rumours are ubiquitous. Earthquakes, because of their strong emotional impact and unpredictability, are among the natural events that most readily cause the birth and spread of rumours. For this reason, the earthquakes that occurred in the Po Valley on 20th and 29th May 2012 generated, and still continue to generate, a wide variety of rumours regarding the earthquake, its effects, its possible causes and future predictions. As had occurred during the L'Aquila earthquake sequence in 2009, following the events of May 2012 in Emilia Romagna a complex training and information initiative was created that, at various stages between May and September 2012, involved the population, partly present in the camps, and then the school staff of the municipalities affected by the earthquake. This experience was organized and managed by the Department of Civil Protection (DPC), the National Institute of Geophysics and Volcanology (INGV) and the Emilia Romagna region, in collaboration with the Network of University Laboratories for Earthquake Engineering (RELUIS), the Emilia Romagna Regional Health Service and voluntary civil protection organizations in the area. Within this initiative, over 240 rumours were collected and catalogued in the period June-September 2012. In this work the rumours of the Po Valley are studied in their specific characteristics, and strategies and methods to fight them are also discussed. This work of collecting and discussing the rumours was particularly important for promoting good communication strategies and for fighting the spread of rumours. Only in this way was it possible to create a full intervention able to support both the local institutions and

  20. A smoothed stochastic earthquake rate model considering seismicity and fault moment release for Europe

    NASA Astrophysics Data System (ADS)

    Hiemer, S.; Woessner, J.; Basili, R.; Danciu, L.; Giardini, D.; Wiemer, S.

    2014-08-01

    We present a time-independent gridded earthquake rate forecast for the European region including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied both to past earthquake locations and to fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumptions that the locations of past seismicity are a good guide to future seismicity, and that future large-magnitude events are more likely to occur in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-values) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is driven purely by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project `Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and of SHARE's area source model (ASM), using the testing algorithms applied in the Collaboratory for the Study of Earthquake Predictability (CSEP). We comparatively tested our model's forecasting skill against the ASM and find a statistically significant better performance for
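    The maximum-likelihood fit of Gutenberg-Richter activity rates mentioned above is, in its simplest form, the classical Aki (1965) b-value estimator. The sketch below is a simplified stand-in (uniform completeness magnitude, no truncation at the maximum magnitude), whereas the paper's approach also accounts for the catalogue's spatially and temporally varying completeness:

    ```python
    import math

    def aki_b_value(mags, mc, dm=0.1):
        """Maximum-likelihood b-value of the Gutenberg-Richter law (Aki, 1965)
        for magnitudes binned at width dm and complete above mc:
        b = log10(e) / (mean(M) - (mc - dm/2))."""
        complete = [m for m in mags if m >= mc]
        mean_m = sum(complete) / len(complete)
        return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

    # Toy catalogue with mean magnitude 1.2 above mc = 1.0
    b = aki_b_value([1.0, 1.2, 1.1, 1.3, 1.4], mc=1.0)
    ```

    With the a-value fixed by the total observed rate above mc, such a fit fully determines the annual rate term of a forecast like the one described here.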