#### Sample records for earthquake numbers consequence

1. Statistical distributions of earthquake numbers: consequence of branching process

NASA Astrophysics Data System (ADS)

Kagan, Yan Y.

2010-03-01

We discuss various statistical distributions of earthquake numbers. Previously, we derived several discrete distributions to describe earthquake numbers for the branching model of earthquake occurrence: these distributions are the Poisson, geometric, logarithmic and the negative binomial (NBD). The theoretical model is the `birth and immigration' population process. The first three distributions above can be considered special cases of the NBD. In particular, a point branching process along the magnitude (or log seismic moment) axis with independent events (immigrants) explains the magnitude/moment-frequency relation and the NBD of earthquake counts in large time/space windows, as well as the dependence of the NBD parameters on the magnitude threshold (the completeness magnitude of an earthquake catalogue). We discuss applying these distributions, especially the NBD, to approximate event numbers in earthquake catalogues. There are many different representations of the NBD. Most can be traced either to the Pascal distribution or to the mixture of the Poisson distribution with the gamma law. We discuss advantages and drawbacks of both representations for statistical analysis of earthquake catalogues. We also consider applying the NBD to earthquake forecasts and describe the limits of the application for the given equations. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrence, the NBD has two parameters. The second parameter can be used to characterize clustering or overdispersion of a process. We determine the parameter values and their uncertainties for several local and global catalogues, and their subdivisions into various time intervals, magnitude thresholds, spatial windows, and tectonic categories. The theoretical model of how the clustering parameter depends on the corner (maximum) magnitude can be used to predict future earthquake number distribution in regions where very large earthquakes have not yet occurred.
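
The equivalence of the two NBD representations mentioned in the abstract, the Pascal form and the Poisson-gamma mixture, can be sketched numerically. The parameter values (`tau`, `theta`) below are illustrative, not fitted to any catalogue:

```python
# Sketch: mixing a Poisson count over a gamma-distributed rate yields
# the negative binomial distribution (NBD) with n = tau, p = 1/(1+theta).
import numpy as np
from scipy.stats import nbinom, gamma, poisson

rng = np.random.default_rng(0)
tau, theta = 2.0, 3.0          # gamma shape and scale (assumed values)

# Poisson-gamma mixture: draw a rate from the gamma law, then a count.
rates = gamma.rvs(a=tau, scale=theta, size=100_000, random_state=rng)
counts = poisson.rvs(rates, random_state=rng)

# Equivalent direct NBD parameterization.
n, p = tau, 1.0 / (1.0 + theta)
print(counts.mean(), nbinom.mean(n, p))   # both close to tau * theta
print(counts.var(), nbinom.var(n, p))     # overdispersed: var > mean
```

The variance exceeding the mean is exactly the overdispersion that the NBD's second parameter captures and the Poisson law cannot.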

2. Earthquake number forecasts testing

NASA Astrophysics Data System (ADS)

Kagan, Yan Y.

2017-10-01

We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
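
The moment comparison described above can be illustrated with a small sketch: for the same mean count, the NBD has larger skewness and kurtosis than the Poisson law, which is why overdispersed catalogues reject the Poisson model. The parameter values are illustrative, not fitted to either catalogue:

```python
# Compare the higher moments of a Poisson and an NBD with equal means.
from scipy.stats import poisson, nbinom

mean_rate = 10.0
lam = mean_rate
n, p = 2.0, 2.0 / (2.0 + mean_rate)   # NBD with the same mean: n*(1-p)/p = 10

for name, dist in [("Poisson", poisson(lam)), ("NBD", nbinom(n, p))]:
    m, v, s, k = (float(x) for x in dist.stats(moments="mvsk"))
    print(f"{name}: mean={m:.1f} var={v:.1f} skew={s:.2f} exkurt={k:.2f}")
```

For a Poisson law skewness is λ^(-1/2), so it also vanishes as the rate grows, consistent with the Gaussian limit noted at the end of the abstract.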

3. Extreme Magnitude Earthquakes and their Economical Consequences

NASA Astrophysics Data System (ADS)

Chavez, M.; Cabrera, E.; Ashworth, M.; Perea, N.; Emerson, D.; Salazar, A.; Moulinec, C.

2011-12-01

The frequency of occurrence of extreme magnitude earthquakes varies from tens to thousands of years, depending on the seismotectonic region of the world considered. However, the human and economic losses, when their hypocenters are located in the neighborhood of heavily populated and/or industrialized regions, can be very large, as recently observed for the 1985 Mw 8.01 Michoacan, Mexico and the 2011 Mw 9 Tohoku, Japan, earthquakes. Herewith, a methodology is proposed to estimate the probability of exceedance of the intensities of extreme magnitude earthquakes (PEI) and of their direct economic consequences (PEDEC). The PEI are obtained by using supercomputing facilities to generate samples of the 3D propagation of plausible extreme earthquake scenarios, and by enlarging those samples through Monte Carlo simulation. The PEDEC are computed by combining appropriate vulnerability functions with the scenario intensity samples, again using Monte Carlo simulation. An example of the application of the methodology to Mexico City, for the potential occurrence of extreme Mw 8.5 subduction earthquakes, is presented.
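
The probability-of-exceedance step of such a methodology can be sketched in a few lines of Monte Carlo. The intensity model below (lognormal ground-motion samples) and the threshold are assumptions made purely for illustration, not the authors' actual scenario simulations:

```python
# Minimal Monte Carlo estimate of a probability of exceedance (PEI-style).
import numpy as np

rng = np.random.default_rng(42)

# Assumed scenario: peak ground acceleration samples (in g), lognormal.
pga = rng.lognormal(mean=np.log(0.15), sigma=0.6, size=200_000)

threshold = 0.3                       # intensity level of interest (g)
pe = (pga > threshold).mean()         # fraction of samples exceeding it
print(f"P(PGA > {threshold} g) ~ {pe:.3f}")
```

A loss-side (PEDEC-style) estimate would apply a vulnerability function to each `pga` sample before taking the exceedance fraction.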

4. Rapid estimation of the economic consequences of global earthquakes

Jaiswal, Kishor; Wald, David J.

2011-01-01

The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensities. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is

5. Earthquakes

MedlinePlus

An earthquake is the sudden, rapid shaking of the earth, ... by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

6. Knowledge base about earthquakes as a tool to minimize strong events consequences

NASA Astrophysics Data System (ADS)

Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

2017-04-01

The paper describes the structure and content of a knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used for calibration of near real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings, and casualty estimates. Such calibration compensates for some of the factors that influence the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains descriptions of past earthquakes' consequences for the area under study. It also includes the distribution of the built environment and population at the time of event occurrence. Computer simulation of the events recorded in the knowledge base allows determination of sets of regional calibration coefficients, including ratings of seismological surveys, peculiarities of shaking intensity attenuation, and changes in building stock and population distribution, in order to minimise the error of damaging-earthquake loss estimates in "emergency" mode. References 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS2010 Conference, Beijing, China, 2010. 3. Frolova, N. I., Larionov, V. I., Bonnin, J., Sushchev, S. P., Ugarov, A. N., Kozlov, M. A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, Journal of the International Society for the Prevention and Mitigation of Natural Hazards, vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653
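
The idea of a regional calibration coefficient chosen against recorded events can be sketched as a least-squares fit. This is an illustrative toy, not the authors' actual procedure, and all numbers are invented:

```python
# Toy calibration: scale a loss model so its output best matches losses
# recorded in the knowledge base for past events.
import numpy as np

recorded = np.array([120.0, 45.0, 300.0, 80.0])    # losses from past events
simulated = np.array([100.0, 50.0, 250.0, 90.0])   # uncalibrated model output

# Least-squares scale coefficient: argmin_k sum (recorded - k*simulated)^2
k = (simulated @ recorded) / (simulated @ simulated)
calibrated = k * simulated
print(f"calibration coefficient k = {k:.3f}")
```

By construction the calibrated residual is never larger than the uncalibrated one, which is the sense in which such coefficients "provide minimum error" in emergency-mode estimates.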

7. Economic consequences of earthquakes: bridging research and practice with HayWired

NASA Astrophysics Data System (ADS)

Wein, A. M.; Kroll, C.

2016-12-01

The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

8. Earthquakes

MedlinePlus

An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

9. Earthquakes.

ERIC Educational Resources Information Center

Pakiser, Louis C.

One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

10. Earthquakes.

ERIC Educational Resources Information Center

Walter, Edward J.

1977-01-01

Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

11. What caused a large number of fatalities in the Tohoku earthquake?

NASA Astrophysics Data System (ADS)

Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.

2012-04-01

The Mw 9.0 earthquake caused 20,000 deaths and missing persons in northeastern Japan. In the 115 years prior to this event, three historical tsunamis struck the region, one of which, a "tsunami earthquake", resulted in a death toll of 22,000. Since then, numerous breakwaters were constructed along the entire northeastern coast, tsunami evacuation drills were carried out, and hazard maps were distributed to local residents in numerous communities. However, despite these constructions and preparedness efforts, the March 11 Tohoku earthquake caused numerous fatalities. The strong shaking lasted three minutes or longer, so all residents recognized that this was the strongest and longest earthquake they had ever experienced in their lives. The tsunami inundated an enormous area of about 560 km2 across 35 cities along the coast of northeast Japan. To find out the reasons behind the high number of fatalities due to the March 11 tsunami, we interviewed 150 tsunami survivors at public evacuation shelters in 7 cities, mainly in Iwate prefecture, in mid-April and early June 2011. Interviews lasted about 30 min or longer and focused on the survivors' evacuation behaviors and those they had observed. On the basis of the interviews, we found that residents' decisions not to evacuate immediately were partly due to or influenced by earthquake science results. Below are some of the factors that affected residents' decisions. 1. Earthquake hazard assessments turned out to be incorrect. Expected earthquake magnitudes and resultant hazards in northeastern Japan, as assessed and publicized by the government, were significantly smaller than the actual Tohoku earthquake. 2. Many residents did not receive accurate tsunami warnings. The first tsunami warnings were too small compared with the actual tsunami heights. 3. Previous frequent warnings with overestimated tsunami heights influenced the behavior of the residents. 4. Many local residents above 55 years old experienced

12. The Long-Run Socio-Economic Consequences of a Large Disaster: The 1995 Earthquake in Kobe.

PubMed

duPont, William; Noy, Ilan; Okuyama, Yoko; Sawada, Yasuyuki

2015-01-01

We quantify the 'permanent' socio-economic impacts of the Great Hanshin-Awaji (Kobe) earthquake in 1995 by employing a large-scale panel dataset of 1,719 cities, towns, and wards from Japan over three decades. In order to estimate the counterfactual--i.e., the Kobe economy without the earthquake--we use the synthetic control method. Three important empirical patterns emerge: First, the population size and especially the average income level in Kobe have been lower than the counterfactual level without the earthquake for over fifteen years, indicating a permanent negative effect of the earthquake. Such a negative impact can be found especially in the central areas which are closer to the epicenter. Second, the surrounding areas experienced some positive permanent impacts in spite of short-run negative effects of the earthquake. Much of this is associated with movement of people to East Kobe, and consequent movement of jobs to the metropolitan center of Osaka, that is located immediately to the East of Kobe. Third, the furthest areas in the vicinity of Kobe seem to have been insulated from the large direct and indirect impacts of the earthquake.

13. Earthquake!

ERIC Educational Resources Information Center

Hernandez, Hildo

2000-01-01

Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and what lessons were learned in handling this emergency are discussed. The problem of loose asbestos is addressed. (GR)

14. The Long-Run Socio-Economic Consequences of a Large Disaster: The 1995 Earthquake in Kobe

PubMed Central

duPont IV, William; Noy, Ilan; Okuyama, Yoko; Sawada, Yasuyuki

2015-01-01

We quantify the ‘permanent’ socio-economic impacts of the Great Hanshin-Awaji (Kobe) earthquake in 1995 by employing a large-scale panel dataset of 1,719 cities, towns, and wards from Japan over three decades. In order to estimate the counterfactual—i.e., the Kobe economy without the earthquake—we use the synthetic control method. Three important empirical patterns emerge: First, the population size and especially the average income level in Kobe have been lower than the counterfactual level without the earthquake for over fifteen years, indicating a permanent negative effect of the earthquake. Such a negative impact can be found especially in the central areas which are closer to the epicenter. Second, the surrounding areas experienced some positive permanent impacts in spite of short-run negative effects of the earthquake. Much of this is associated with movement of people to East Kobe, and consequent movement of jobs to the metropolitan center of Osaka, that is located immediately to the East of Kobe. Third, the furthest areas in the vicinity of Kobe seem to have been insulated from the large direct and indirect impacts of the earthquake. PMID:26426998

15. Extreme Subduction Earthquake Scenarios and their Economical Consequences for Mexico City and Guadalajara, Jalisco, Mexico

NASA Astrophysics Data System (ADS)

Chavez, M.; Cabrera, E.; Perea, N.

2007-05-01

The destructive effects of large magnitude, thrust subduction superficial (TSS) earthquakes on Mexico City (MC) and Guadalajara (G) have been shown in recent centuries. For example, on 7/04/1845 a TSS earthquake with Ms 7+ and an epicentral distance of about 250 km from MC occurred on the coast of the state of Guerrero; a maximum Modified Mercalli Intensity (MMI) of IX-X was reported in MC. Furthermore, on 19/09/1985 a Ms 8.1, Mw 8.01 TSS earthquake with an epicentral distance of about 340 km from MC occurred on the coast of the state of Michoacan; a maximum MMI of IX-X was reported in MC. Also, the largest instrumentally observed TSS earthquake in Mexico, Ms 8.2, occurred in the Colima-Jalisco region on 3/06/1932, with an epicentral distance of the order of 200 km from G in northwestern Mexico. On 9/10/1995 another similar event, Ms 7.4, Mw 8, with an epicentral distance of about 240 km from G, occurred in the same region and produced MMI IX in the epicentral zone and MMI up to VI in G. The frequency of occurrence of large TSS earthquakes in Mexico is poorly known, but it might vary from decades to centuries [1]. On the other hand, the first recordings of strong ground motions in MC date from the early 1960s, and most of them were recorded after the 19/09/1985 earthquake. In G there is only one recording of the latter event, and 13 for the one that occurred on 9/10/1995 [2]. To compensate for the lack of strong ground motion records for large damaging TSS earthquakes, which could have an important economic impact on MC [3] and G, in this work we have modeled the broadband synthetics (obtained with a hybrid model that has already been satisfactorily compared with observations of the 9/10/1995 Colima-Jalisco Mw 8 earthquake [4]) expected in MC and G, associated with extreme magnitude Mw 8.5 TSS scenario earthquakes with epicenters in the so-called Guerrero gap and in the Colima-Jalisco zone, respectively. The proposed scenarios are based on the seismic history and up

16. Education and awareness regarding earthquakes and their consequences within the Cluj-Napoca SEISMOLAB, Romania

NASA Astrophysics Data System (ADS)

Brisan, Nicoleta; Stefanescu, Lucrina; Zaharia, Bogdan; Tataru, Dragos; Costin, Dan; Stefanie, Horatiu

2014-05-01

Education and awareness are efficient methods to mitigate the effects of natural disasters on communities. In this regard, the most receptive target group is the youth, who have the potential to become vectors of information dissemination in their families and communities. In a country with significant seismic potential like Romania, the development of a Seismolab by means of an educational project is welcome. The Seismolab operates within the Faculty of Environmental Science and Engineering at "Babeş-Bolyai" University, Cluj-Napoca, and it hosts activities conducted with the students of the faculty and pupils from Cluj and other schools involved in the RoEduSeis project. The RoEduSeis Project is a research and education project meant to develop the practical skills of primary, secondary and high school students in the field of Earth Sciences. A major objective of the project is the development and validation of new practical training methods for both teachers and students in the field of Earth Sciences. In this context, the Seismolab serves this particular aim through activities such as: training of students and teachers in conducting analyses and processing seismological data obtained from the educational seismographs in the Romanian educational seismic network; hands-on activities for pupils using educational resources developed through the project; documentary 2D and 3D movies and round tables on the topic of earthquakes and other natural risks. The students of the faculty use the databases within curriculum subjects such as: Management of natural risks and disasters, Natural hazards and risks, Management of emergency situations, etc. The seismometer used within the Seismolab will be connected to the above-mentioned educational network, and the interaction between all the schools involved in the project will be conducted by means of an e-learning platform. The results of this cooperation will contribute to a better education and awareness

17. Copy number variants implicate cardiac function and development pathways in earthquake-induced stress cardiomyopathy.

PubMed

Lacey, Cameron J; Doudney, Kit; Bridgman, Paul G; George, Peter M; Mulder, Roger T; Zarifeh, Julie J; Kimber, Bridget; Cadzow, Murray J; Black, Michael A; Merriman, Tony R; Lehnert, Klaus; Bickley, Vivienne M; Pearson, John F; Cameron, Vicky A; Kennedy, Martin A

2018-05-15

The pathophysiology of stress cardiomyopathy (SCM), also known as takotsubo syndrome, is poorly understood. SCM usually occurs sporadically, often in association with a stressful event, but clusters of cases are reported after major natural disasters. There is some evidence that this is a familial condition. We have examined three possible models for an underlying genetic predisposition to SCM. Our primary study cohort consists of 28 women who suffered SCM as a result of two devastating earthquakes that struck the city of Christchurch, New Zealand, in 2010 and 2011. To seek possible underlying genetic factors we carried out exome analysis, genotyping array analysis, and array comparative genomic hybridization on these subjects. The most striking finding was the observation of a markedly elevated rate of rare, heterogeneous copy number variants (CNV) of uncertain clinical significance (in 12/28 subjects). Several of these CNVs impacted on genes of cardiac relevance including RBFOX1, GPC5, KCNRG, CHODL, and GPBP1L1. There is no physical overlap between the CNVs, and the genes they impact do not appear to be functionally related. The recognition that SCM predisposition may be associated with a high rate of rare CNVs offers a novel perspective on this enigmatic condition.

18. Mitigating the consequences of future earthquakes in historical centres: what perspectives from the joined use of past information and geological-geophysical surveys?

NASA Astrophysics Data System (ADS)

Terenzio Gizzi, Fabrizio; Moscatelli, Massimiliano; Potenza, Maria Rosaria; Zotta, Cinzia; Simionato, Maurizio; Pileggi, Domenico; Castenetto, Sergio

2015-04-01

Mitigating the damage caused by earthquakes in urban areas, and particularly in historical centres prone to high seismic hazard, is an important task to be pursued. Seismic history throughout the world informs us that earthquakes have caused deep changes in ancient urban conglomerations due to their high building vulnerability. Furthermore, some quarters can be exposed to increased seismic actions compared with adjacent areas due to the geological and/or topographical features of the site on which the historical centre lies. Usually, strategies aimed at estimating the local seismic hazard make use only of geological-geophysical surveys. Through this approach we do not draw any lesson from what happened as a consequence of past earthquakes. With this in mind, we present the results of a joint use of historical data and the traditional geological-geophysical approach to analyse the effects of possible future earthquakes in historical centres. The research activity discussed here is arranged as a joint collaboration between the Department of Civil Protection of the Presidency of the Council of Ministers, the Institute of Environmental Geology and Geoengineering, and the Institute of Archaeological and Monumental Heritage of the National (Italian) Research Council. To show the results, we discuss the preliminary achievements of the integrated study carried out on two historical towns located in the Southern Apennines, a portion of the Italian peninsula exposed to high seismic hazard. Taking advantage of these two test sites, we also discuss some methodological implications that could be taken as a reference in seismic microzonation studies.

19. Earthquakes & Volcanoes, Volume 21, Number 1, 1989: Featuring the U.S. Geological Survey's National Earthquake Information Center in Golden, Colorado, USA

Spall, Henry; Schnabel, Diane C.

1989-01-01

Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers. The Secretary of the Interior has determined that the publication of this periodical is necessary in the transaction of the public business required by law of this Department. Use of funds for printing this periodical has been approved by the Office of Management and Budget through June 30, 1989. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

20. Analog earthquakes

SciT

Hofmann, R.B.

1995-09-01

Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

1. Earthquakes, November-December 1973

Person, W.J.

1974-01-01

Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria.

2. Materials Physics of Faults in Rapid Shear and Consequences for Earthquake Dynamics (Louis Néel Medal Lecture)

NASA Astrophysics Data System (ADS)

Rice, J. R.

2012-04-01

Field observations of maturely slipped faults show that, despite a generally broad zone of damage by cracking and granulation, large shear deformation, and therefore heat generation, in individual earthquakes takes place with extreme localization to a zone of order 1 mm or less in width within a finely granulated fault core. Relevant fault weakening processes during large crustal events are therefore likely to be thermally influenced, although a constraint to be met, from the scarcity of pseudotachylite, is that melting within fault zones seems relatively rare, at least in the upper crust. Further, given the porosity of damage zones, it seems reasonable to assume in-situ water presence. The lecture reviews current understanding of the materials physics underlying rapid shear of such fault zones, addressing questions like: Why is there severe localization? What are the dynamic relations between shear stress sustained by the fault and its slip history? How do those relations, taken to provide the boundary conditions on a rupturing interface between elastic regions of the earth, control key features of the dynamics of earthquakes? Primary dynamic weakening mechanisms, expected to be active in at least the early phases of nearly all crustal events, are flash heating at highly stressed frictional micro-contacts and thermal pressurization of native fault-zone pore fluid, the latter with a net effect that depends on interactions with dilatancy. Other weakening processes may also become active at large enough T rise, still prior to bulk melting, including endothermic decomposition reactions releasing a CO2 or H2O fluid phase under conditions where the fluid and solid products would, at the same p and T, occupy more volume than the parent rock, so that the pore fluid is forced to undergo a severe pressure increase. The endothermic nature of the reactions buffers against melting because frictional work is absorbed into the enthalpy increase of the reactants. There may also be a contribution

3. Estimation of species extinction: what are the consequences when total species number is unknown?

PubMed

Chen, Youhua

2014-12-01

The species-area relationship (SAR) is known to overestimate species extinction, but the underlying mechanisms remain unclear to a great extent. Here, I show that when the total species number in an area is unknown, the SAR model exaggerates the estimation of species extinction. It is proposed that, to accurately estimate species extinction caused by habitat destruction, one of the principal prerequisites is an accurate count of the species present in the whole study area. One can better evaluate and compare alternative theoretical SAR models on the accurate estimation of species loss only when the exact total species number for the whole area is known. This presents an opportunity for ecologists to stimulate more research on accurately estimating Whittaker's gamma diversity for the purpose of better predicting species loss.
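
The dependence of SAR-based extinction estimates on the assumed total species number can be illustrated with the classic power-law form S = c * A**z. The constants below are illustrative, not taken from the paper:

```python
# Toy SAR extinction estimate: the predicted fractional loss depends
# only on z, but the predicted *number* of extinctions scales with the
# total species count assumed for the full area.
c, z = 20.0, 0.25                 # assumed SAR constants
area_total, area_left = 1000.0, 500.0

def sar_species(area):
    """Power-law species-area relationship S = c * A**z."""
    return c * area ** z

s_total = sar_species(area_total)
s_left = sar_species(area_left)
loss_fraction = 1.0 - (area_left / area_total) ** z
print(f"predicted loss: {s_total - s_left:.1f} species "
      f"({100 * loss_fraction:.1f}% of {s_total:.1f})")
```

If the true gamma diversity of the full area differs from `s_total`, the absolute extinction estimate is wrong by the same factor, which is the point the abstract makes.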

4. Earthquakes and Schools

ERIC Educational Resources Information Center

National Clearinghouse for Educational Facilities, 2008

2008-01-01

Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

5. Earthquakes; January-February 1982

Person, W.J.

1982-01-01

In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine.

6. Consequences of high effective Prandtl number on solar differential rotation and convective velocity

NASA Astrophysics Data System (ADS)

Karak, Bidya Binay; Miesch, Mark; Bekki, Yuto

2018-04-01

Observations suggest that the large-scale convective velocities obtained by solar convection simulations might be over-estimated (the convective conundrum). One plausible solution to this could be the small-scale dynamo, which cannot be fully resolved by global simulations. The small-scale Lorentz force suppresses the convective motions and also the turbulent mixing of entropy between upflows and downflows, leading to a large effective Prandtl number (Pr). We explore this idea in three-dimensional global rotating convection simulations at different thermal conductivities (κ), i.e., at different Pr. In agreement with previous non-rotating simulations, the convective velocity is reduced with increasing Pr as long as the thermal conductive flux is negligible. A subadiabatic layer is formed near the base of the convection zone due to the continuous deposition of low-entropy plumes in low-κ simulations. The most interesting result of our low-κ simulations is that the reduction of convective motions is accompanied by a change in the convection structure, which is increasingly influenced by small-scale plumes. These plumes tend to transport angular momentum radially inward and thus establish an anti-solar differential rotation, in striking contrast to the solar rotation profile. If such low-diffusivity plumes, driven by radiative surface cooling, are present in the Sun, then our results cast doubt on the idea that a high effective Pr may be a viable solution to the solar convective conundrum. Our study also emphasizes that any resolution of the conundrum that relies on the downward plumes must take into account both the angular momentum transport and the heat transport.

7. A classifying method analysis on the number of returns for given pulse of post-earthquake airborne LiDAR data

NASA Astrophysics Data System (ADS)

Wang, Jinxia; Dou, Aixia; Wang, Xiaoqing; Huang, Shusong; Yuan, Xiaoxiang

2016-11-01

Compared to remote sensing imagery, post-earthquake airborne Light Detection And Ranging (LiDAR) point cloud data contains high-precision three-dimensional information on earthquake damage, which can improve the accuracy of identifying destroyed buildings. After an earthquake, however, damaged buildings show so many different characteristics that the most commonly used pre-processing methods cannot distinguish between tree points and damaged-building points. In this study, we analyse the number of returns per pulse for tree and damaged-building point clouds and explore methods to separate the two. We propose a new method that searches a neighbourhood of a given size around each point and computes the ratio (R) of neighbourhood points whose number of returns per pulse is greater than 1, in order to separate trees from buildings. We selected point clouds of typical undamaged buildings, collapsed buildings, and trees as samples, via human-computer interaction, from airborne LiDAR data acquired after the 2010 MW7.0 Haiti earthquake. We determined the R-value that distinguishes trees from buildings and applied it to test areas. The experimental results show that the proposed method can effectively distinguish building points (undamaged and damaged) from tree points, but it is limited in areas where buildings are varied, damage is complex, and trees are dense; the method therefore requires further improvement.
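The neighbourhood return-ratio idea can be sketched as follows. This is a hedged, brute-force illustration only: the function name, the choice of k nearest neighbours, and the input tuple layout are assumptions not taken from the paper, and a real pipeline would use a spatial index (e.g. a k-d tree) over millions of points:

```python
import math

def return_ratio(points, k=10):
    """For each LiDAR point (x, y, z, number_of_returns), compute R:
    the fraction of its k nearest neighbours whose pulse produced more
    than one return. Vegetation lets the pulse return multiple times
    (high R); solid roof surfaces mostly give a single return (low R),
    so thresholding R separates trees from buildings."""
    ratios = []
    for i, (xi, yi, zi, _) in enumerate(points):
        # Brute-force nearest-neighbour search, fine for a small sample.
        dists = sorted(
            (math.dist((xi, yi, zi), (x, y, z)), n)
            for j, (x, y, z, n) in enumerate(points) if j != i
        )
        neighbours = dists[:k]
        multi = sum(1 for _, n in neighbours if n > 1)
        ratios.append(multi / len(neighbours))
    return ratios
```

Classification would then compare each ratio against the R-value calibrated on the hand-labelled samples.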

8. How dynamic number of evacuee affects the multi-objective optimum allocation for earthquake emergency shelters: A case study in the central area of Beijing, China

NASA Astrophysics Data System (ADS)

Ma, Y.; Xu, W.; Zhao, X.; Qin, L.

2016-12-01

Accurate location and allocation of earthquake emergency shelters is a key component of effective urban planning and emergency management. A number of models have been developed to solve this complex location-allocation problem with diverse and strict constraints, but a big gap still remains between the models and the actual situation, because the uncertainty of earthquakes, the damage rate of buildings, and evacuee behaviour have been neglected or excessively simplified in existing models. An innovative model was first developed to estimate the hourly dynamic changes in the number of evacuees under two earthquake damage scenarios, considering these factors at the community level based on location-based service data. This was followed by a multi-objective model for allocating residents to earthquake shelters, using the central area of Beijing, China as a case study. The two objectives of this shelter allocation model were to minimize the total evacuation distance from communities to a specified shelter and to minimize the total area of all the shelters, subject to constraints on shelter capacity and service radius. A modified particle swarm optimization algorithm was used to solve the model. The results show that increasing the shelter area results in a large decrease in the total evacuation distance in all of the schemes of the four scenarios (i.e., Scenarios A and B, in daytime and nighttime respectively). According to the minimum-distance schemes, some communities in the downtown area needed to be reallocated owing to the insufficient capacity of the nearest shelters, and the number of these communities decreased sequentially in scenarios Ad, An, Bd and Bn owing to the decreasing population. According to the minimum-area schemes in each scenario, 27 or 28 shelters, covering a total area of approximately 37 km2, were selected, and the communities evacuated along almost the same routes in the different scenarios. The results can be used as a
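The objective and constraint structure described above can be sketched as a fitness evaluation for one candidate assignment, the piece a particle swarm optimizer would call repeatedly. All names, the data layout, and the 2 km service radius below are illustrative assumptions, not values from the paper:

```python
def evaluate_allocation(communities, shelters, assignment):
    """Score one candidate solution of the two-objective shelter model.

    communities: list of (evacuee_count, {shelter_id: distance_m})
    shelters:    {shelter_id: (capacity, area_m2)}
    assignment:  list of shelter_id, one per community
    Returns (total_weighted_distance, total_open_area), or None if a
    shelter's service radius or capacity constraint is violated.
    """
    SERVICE_RADIUS_M = 2000  # assumed service-radius constraint
    load = {}
    total_distance = 0.0
    for (people, dists), sid in zip(communities, assignment):
        d = dists[sid]
        if d > SERVICE_RADIUS_M:
            return None  # community outside the shelter's service radius
        load[sid] = load.get(sid, 0) + people
        total_distance += people * d
    for sid, people in load.items():
        if people > shelters[sid][0]:
            return None  # shelter capacity exceeded
    # Second objective: total area of the shelters actually opened.
    total_area = sum(shelters[sid][1] for sid in load)
    return total_distance, total_area
```

A multi-objective search would keep the set of assignments whose (distance, area) pairs are mutually non-dominated rather than collapsing the two scores into one number.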

9. Organizational changes at Earthquakes & Volcanoes

Gordon, David W.

1992-01-01

Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

10. Galaxy formation in Lambda greater than 0 Friedmann models: Consequences for the number counts versus redshift test

NASA Technical Reports Server (NTRS)

Martel, Hugo

1994-01-01

We study the effect of the cosmological constant Lambda on galaxy formation using a simple spherical top-hat overdensity model. We consider models with Omega(sub 0) = 0.2, lambda(sub 0) = 0, and Omega(sub 0) = 0.2, lambda(sub 0) = 0.8 (where Omega(sub 0) is the density parameter, and lambda(sub 0) identically equals Lambda/3 H(sub 0 exp 2), where H(sub 0) is the Hubble constant). We adjust the initial power spectrum amplitude so that both models reproduce the same large-scale structures. The galaxy formation era in the lambda(sub 0) = 0 model occurs early (z approximately 6) and is very short, whereas in the lambda(sub 0) = 0.8 model the galaxy formation era starts later (z approximately 4) and lasts much longer, possibly all the way to the present. Consequently, galaxies at low redshift (z less than 1) are significantly more evolved in the lambda(sub 0) = 0 model than in the lambda(sub 0) = 0.8 model. This result implies that previous attempts to determine Lambda using the number counts versus redshift test are probably unreliable.

11. Earthquakes; January-February, 1979

Person, W.J.

1979-01-01

The first major earthquake (magnitude 7.0 to 7.9) of the year struck in southeastern Alaska in a sparsely populated area on February 28. On January 16, Iran experienced the first destructive earthquake of the year, causing a number of casualties and considerable damage. Peru was hit by a destructive earthquake on February 16 that left casualties and damage. A number of earthquakes were experienced in parts of the United States, but only minor damage was reported.

12. Earthquake risk assessment of Alexandria, Egypt

NASA Astrophysics Data System (ADS)

Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

2015-01-01

Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an expected 2.27 % of Alexandria's total constructions (12.9 million, 2006 Census) will be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 Census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low risk. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, where 73 % of the expected damage is concentrated. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damage) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

13. Defeating Earthquakes

NASA Astrophysics Data System (ADS)

Stein, R. S.

2012-12-01

our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just to assess the likelihood of ground shaking, but also to gauge the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing up for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit-or-relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong.

14. IMMEDIATE MENTAL CONSEQUENCES OF THE GREAT EAST JAPAN EARTHQUAKE AND FUKUSHIMA NUCLEAR POWER PLANT ACCIDENT ON MOTHERS EXPERIENCING MISCARRIAGE, ABORTION, AND STILLBIRTH: THE FUKUSHIMA HEALTH MANAGEMENT SURVEY

PubMed Central

YOSHIDA-KOMIYA, HIROMI; GOTO, AYA; YASUMURA, SEIJI; FUJIMORI, KEIYA; ABE, MASAFUMI; FOR THE PREGNANCY AND BIRTH SURVEY GROUP OF THE FUKUSHIMA HEALTH MANAGEMENT SURVEY

2015-01-01

ABSTRACT Background: The Fukushima Pregnancy and Birth Survey was launched to monitor pregnant mothers’ health after the Great East Japan Earthquake and Fukushima Daiichi Nuclear Power Plant (NPP) accident. Several lines of investigations have indicated that a disaster impacts maternal mental health with childbirth. However, there is no research regarding mental health of mothers with fetal loss after a disaster. In this report, we focus on those women immediately after the Great East Japan Earthquake and Fukushima NPP accident and discuss their support needs. Materials and Methods: Data regarding 61 miscarriages, 5 abortions, and 22 stillbirths were analyzed among the women who were pregnant at the time of the accident in the present study. We used a two-item case-finding instrument for depression screening, and compared the childbirth group with the fetal loss groups. We also analyzed mothers’ opinions written as free-form text. Results: Among the three fetal loss groups, the proportion of positive depression screens was significantly higher in the miscarriage and stillbirth group than in the childbirth group. Mothers’ opinions were grouped into six categories, with pregnancy-related items being most common, especially in the miscarriage and stillbirth groups. Conclusion: A higher proportion of Fukushima mothers with fetal loss, especially those with miscarriage and stillbirth, had depressive symptoms compared to those who experienced childbirth. Health care providers need to pay close attention to this vulnerable group and respond to their concerns regarding the effects on their fertility. PMID:26063510

15. IMMEDIATE MENTAL CONSEQUENCES OF THE GREAT EAST JAPAN EARTHQUAKE AND FUKUSHIMA NUCLEAR POWER PLANT ACCIDENT ON MOTHERS EXPERIENCING MISCARRIAGE, ABORTION, AND STILLBIRTH: THE FUKUSHIMA HEALTH MANAGEMENT SURVEY.

PubMed

Yoshida-Komiya, Hiromi; Goto, Aya; Yasumura, Seiji; Fujimori, Keiya; Abe, Masafumi

2015-01-01

The Fukushima Pregnancy and Birth Survey was launched to monitor pregnant mothers' health after the Great East Japan Earthquake and Fukushima Daiichi Nuclear Power Plant (NPP) accident. Several lines of investigations have indicated that a disaster impacts maternal mental health with childbirth. However, there is no research regarding mental health of mothers with fetal loss after a disaster. In this report, we focus on those women immediately after the Great East Japan Earthquake and Fukushima NPP accident and discuss their support needs. Data regarding 61 miscarriages, 5 abortions, and 22 stillbirths were analyzed among the women who were pregnant at the time of the accident in the present study. We used a two-item case-finding instrument for depression screening, and compared the childbirth group with the fetal loss groups. We also analyzed mothers' opinions written as free-form text. Among the three fetal loss groups, the proportion of positive depression screens was significantly higher in the miscarriage and stillbirth group than in the childbirth group. Mothers' opinions were grouped into six categories, with pregnancy-related items being most common, especially in the miscarriage and stillbirth groups. A higher proportion of Fukushima mothers with fetal loss, especially those with miscarriage and stillbirth, had depressive symptoms compared to those who experienced childbirth. Health care providers need to pay close attention to this vulnerable group and respond to their concerns regarding the effects on their fertility.

16. Earthquakes, November-December 1992

Person, W.J.

1993-01-01

There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.

17. Fear based Education or Curiosity based Education as an Example of Earthquake and Natural Disaster Education: Results of Statistical Study in Primary Schools in Istanbul-Turkey

NASA Astrophysics Data System (ADS)

Ozcep, T.; Ozcep, F.

2012-04-01

Natural disaster reduction focuses on the urgent need for prevention activities to reduce loss of life, damage to property, infrastructure and environment, and the social and economic disruption caused by natural hazards. One of the most important factors in reducing the potential damage of earthquakes is trained manpower. Understanding the causes of earthquakes and other natural phenomena (landslides, avalanches, floods, volcanoes, etc.) is one of the pre-conditions for conscious behaviour. The aim of the study is to analyse and investigate how earthquakes and other natural phenomena are perceived by students, the possible consequences of this perception, and its effects on reducing earthquake damage. A crucial question is whether our education system is fear-based or curiosity-based. The damage caused by earthquakes has made them appear a subject of fear; indeed, because of these effects, earthquakes are perceived as scary phenomena. In the first stage of the project, the learning (or perception) levels of primary school students regarding earthquakes and other natural disasters are investigated with a survey. The aim of this survey is to determine whether students take a fear-based or a curiosity-based approach to earthquakes and other natural events. In the second stage of the project, the data obtained from the survey are evaluated statistically. A questionnaire on earthquakes and natural disasters was administered to approximately 700 primary school pupils to measure their curiosity and/or fear levels. The questionnaire consists of 17 questions related to natural disasters, such as: "What is the Earthquake ?", "What is power behind earthquake?", "What is the mental response during the earthquake ?", "Did we take lesson from earthquake's results ?", "Are you afraid of earthquake

18. Turkish Compulsory Earthquake Insurance (TCIP)

NASA Astrophysics Data System (ADS)

Erdik, M.; Durukal, E.; Sesetyan, K.

2009-04-01

Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small; 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in south-east Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about 90 USD for a 100-square-metre reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings), with only a 2% deductible, is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a unit (and consequently the insurance premium) can be assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP could in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.

19. The October 12, 1992, Dahshur, Egypt, Earthquake

Thenhaus, P.C.; Celebi, M.; Sharp, R.V.

1993-01-01

We were part of an international reconnaissance team that investigated the Dahshur earthquake. This article summarizes our findings and points out how even a relatively moderate-sized earthquake can cause widespread damage and a large number of casualties.

20. Economic Effects of 1978 Tabas Earthquake (Iran).

PubMed

Zandian, Elham; Rimaz, Shahnaz; Holakouie Naieni, Kourosh; Nedjat, Saharnaz; Naderimagham, Shohreh; Larijani, Bagher; Farzadfar, Farshad

2016-06-01

Natural disasters are among the most important adverse health events. The earthquake that struck the city of Tabas in 1978 ranked third in number of deaths caused by natural disasters in Iran over the past 100 years. This study aimed to evaluate the economic and human-capital consequences of the earthquake in the Tabas district. We used a two percent random sample of the 2006 Iran Census Dataset to run a difference-in-difference study. The difference-in-difference methodology was used to evaluate (1) the mean changes in variables including years of schooling and wealth, and (2) the changes in the odds of primary school completion and literacy, for people born 5 or 10 years after the event versus 5 or 10 years before it in Tabas, compared with the same values for those born in the same periods in the control districts. The differential increase in years of schooling for being born in the 10 years after the earthquake versus the 10 years before it was one-third of a school year less in Tabas than in the control districts. There were 89.5% and 65.4% decreases in the odds that an individual is literate, and average decreases of 0.26 and 0.104 in the SES index, for those born in Tabas in the 5- and 10-year periods, respectively, compared with the control districts. The Tabas earthquake had negative long-term effects on human capital and wealth. This study can help official authorities promote educational and economic plans and implement comprehensive reforms in earthquake-stricken areas.
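The difference-in-difference logic used above can be sketched in a few lines. This is a minimal illustration of the estimator only; the function name and all cohort means are made-up values chosen to mirror the direction of the reported schooling result:

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences estimate: the change in the treated
    group (e.g. cohorts born after vs. before the earthquake in Tabas)
    minus the change in the control districts over the same cohorts.
    Inputs are lists of outcome values (e.g. years of schooling)."""
    def mean(xs):
        return sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Illustrative (hypothetical) cohort means of years of schooling:
effect = diff_in_diff(
    treat_pre=[6.0, 6.2], treat_post=[7.0, 7.2],  # Tabas: +1.0 years
    ctrl_pre=[6.1, 6.3], ctrl_post=[7.4, 7.6],    # controls: +1.3 years
)
# effect is about -0.3: the treated district gained roughly a third of a
# school year less than the controls, the shape of the abstract's finding.
```

Subtracting the control-group trend removes secular improvements in schooling common to both areas, isolating the earthquake's differential effect.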

1. Earthquakes: Recurrence and Interoccurrence Times

NASA Astrophysics Data System (ADS)

Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.; Yakovlev, G.; Goltz, C.; Newman, W. I.

2008-04-01

The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution. The Weibull distribution is the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) The Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) an example of a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to make definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one million year “Virtual California” simulation of San Andreas earthquakes. Very good agreement with a Weibull distribution is found. We also obtain recurrence statistics for two other model studies. The first is a modified forest-fire model and the second is a slider-block model. In both cases good agreements with Weibull distributions are obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
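The scale-invariance property singled out above can be made concrete. A small sketch (parameter values are arbitrary) of the Weibull hazard rate, showing that rescaling time multiplies the hazard by a fixed factor independent of t, i.e. the hazard is a pure power law:

```python
def weibull_hazard(t, beta, eta):
    """Hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1) of a Weibull
    distribution with shape beta and scale eta: the instantaneous
    earthquake rate at time t given no event has yet occurred."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# Rescaling time by c multiplies the hazard by c**(beta - 1),
# regardless of t or eta -- the scale-invariant (power-law) property.
beta, eta, c = 1.5, 100.0, 2.0
r1 = weibull_hazard(c * 50.0, beta, eta) / weibull_hazard(50.0, beta, eta)
r2 = weibull_hazard(c * 80.0, beta, eta) / weibull_hazard(80.0, beta, eta)
# r1 == r2 == c**(beta - 1), here sqrt(2)
```

With beta > 1 the hazard grows with elapsed time, matching the intuition that a fault section becomes more likely to rupture as stress reaccumulates.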

2. Earthquake Loss Scenarios in the Himalayas

NASA Astrophysics Data System (ADS)

Wyss, M.; Gupta, S.; Rosset, P.; Chamlagain, D.

2017-12-01

We estimate quantitatively that in repeats of the 1555 and 1505 great Himalayan earthquakes the fatalities may range from 51K to 549K, the injured from 157K to 1,700K and the strongly affected population (Intensity≥VI) from 15 to 75 million, depending on the details of the assumed earthquake parameters. For up-dip ruptures in the stressed segments of the M7.8 Gorkha 2015, the M7.9 Subansiri 1947 and the M7.8 Kangra 1905 earthquakes, we estimate 62K, 100K and 200K fatalities, respectively. The numbers of strongly affected people we estimate as 8, 12, 33 million, in these cases respectively. These loss calculations are based on verifications of the QLARM algorithms and data set in the cases of the M7.8 Gorkha 2015, the M7.8 Kashmir 2005, the M6.6 Chamoli 1999, the M6.8 Uttarkashi 1991 and the M7.8 Kangra 1905 earthquakes. The requirement of verification that was fulfilled in these test cases was that the reported intensity field and the fatality count had to match approximately, using the known parameters of the earthquakes. The apparent attenuation factor was a free parameter and ranged within acceptable values. Numbers for population were adjusted for the years in question from the latest census. The hour of day was assumed to be at night with maximum occupation. The assumption that the upper half of the Main Frontal Thrust (MFT) will rupture in companion earthquakes to historic earthquakes in the down-dip half is based on the observations of several meters of displacement in trenches across the MFT outcrop. Among mitigation measures awareness with training and adherence to construction codes rank highest. Retrofitting of schools and hospitals would save lives and prevent injuries. Preparation plans for helping millions of strongly affected people should be put in place. These mitigation efforts should focus on an approximately 7 km wide strip along the MFT on the up-thrown side because the strong motions are likely to be doubled. We emphasize that our estimates

3. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

NASA Astrophysics Data System (ADS)

Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

2008-12-01

Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely fault in the Bay Area (with a 31 percent probability) to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities, including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault, conducted by Bay Area County Offices of Emergency Services; 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within the areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

4. Earthquakes, May-June 1981

Person, W.J.

1981-01-01

The months of May and June were somewhat quiet, seismically speaking. There was one major earthquake (7.0-7.9) off the west coast of South Island, New Zealand. The most destructive earthquake during this reporting period was in southern Iran on June 11 which caused fatalities and extensive damage. Peru also experienced a destructive earthquake on June 22 which caused fatalities and damage. In the United States, a number of earthquakes were experienced, but none caused significant damage.

5. Earthquake Hazards.

ERIC Educational Resources Information Center

Donovan, Neville

1979-01-01

Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

6. Do Earthquakes Shake Stock Markets?

PubMed Central

2015-01-01

This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

7. Do Earthquakes Shake Stock Markets?

PubMed

Ferreira, Susana; Karali, Berna

2015-01-01

This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

8. Earthquakes; July-August 1982

Person, W.J.

1983-01-01

During this reporting period, there were three major (7.0-7.9) earthquakes, all in unpopulated areas. The quakes occurred north of Macquarie Island on July 7, in the Santa Cruz Islands on August 5, and south of Panama on August 19. In the United States, a number of earthquakes occurred, but no damage was reported.

9. Earthquakes, July-August, 1979

Person, W.J.

1980-01-01

In the United States, on August 6, central California experienced a moderately strong earthquake, which injured several people and caused some damage. A number of earthquakes occurred in other parts of the United States but caused very little damage.

10. Earthquake Safety Tips in the Classroom

NASA Astrophysics Data System (ADS)

Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M.

2014-12-01

The catastrophes induced by earthquakes are among the most devastating ones, causing an elevated number of human losses and economic damages. But we have to keep in mind that earthquakes don't kill people, buildings do. Earthquakes can't be predicted, and the only way of dealing with their effects is to teach society how to be prepared for them, and how to deal with their consequences. In spite of being exposed to moderate and large earthquakes, most of the Portuguese are little aware of seismic risk, mainly due to the long recurrence intervals between strong events. The acquisition of safe and correct attitudes before, during and after an earthquake is relevant for human security. Children play a determinant role in the establishment of a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children assume correct behaviors, their relatives often change their own incorrect behaviors to mimic those of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5-6 years old and 9-10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start with a short introduction to the Earth and to earthquakes through storytelling, using simple science activities to trigger children's curiosity. For safety purposes, we focus on how crucial it is for children to know basic information about themselves and to define, with their families, an emergency communications plan in case family members are separated. Using a shaking table, we teach them how to protect themselves during an earthquake. We then finish with the preparation of an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in

11. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

Jaiswal, Kishor; Wald, David J.

2008-01-01

contribution of building stock, its relative vulnerability, and distribution are vital components for determining the extent of casualties during an earthquake. It is evident from large deadly historical earthquakes that the distribution of vulnerable structures and their occupancy level during an earthquake control the severity of human losses. For example, though the number of strong earthquakes in California is comparable to that of Iran, the total earthquake-related casualties in California during the last 100 years are dramatically lower than the casualties from several individual Iranian earthquakes. The relatively low casualty count in California is attributed mainly to the fact that more than 90 percent of the building stock in California is made of wood and is designed to withstand moderate to large earthquakes (Kircher, Seligson and others, 2006). In contrast, the adobe and/or non-engineered masonry building stock in Iran, which makes up 80 percent of the total and has poor lateral-load-resisting systems, succumbs even at moderate levels of ground shaking. Consequently, the heavy death toll of the 2003 Bam, Iran, earthquake, which claimed 31,828 lives (Ghafory-Ashtiany and Mousavi, 2005), is directly attributable to such poorly resistant construction, and future events will produce comparable losses unless practices change. Similarly, multistory, precast-concrete framed buildings caused heavy casualties in the 1988 Spitak, Armenia, earthquake (Bertero, 1989); weaker masonry and reinforced-concrete framed construction designed for gravity loads with soft first stories dominated losses in the Bhuj, India, earthquake of 2001 (Madabhushi and Haigh, 2005); and adobe and weak masonry dwellings controlled the death toll in the Peru earthquake of 2007 (Taucer, J. and others, 2007). 
Spence (2007), after conducting a brief survey of the most lethal earthquakes since 1960, found that building collapse remains a major cause of earthquake mortality and that unreinforced masonry buildings are one of the mos

12. Volcanotectonic earthquakes induced by propagating dikes

NASA Astrophysics Data System (ADS)

Gudmundsson, Agust

2016-04-01

Volcanotectonic earthquakes are of high frequency and mostly generated by slip on faults. During chamber expansion/contraction, earthquakes are distributed in the chamber roof. Following magma-chamber rupture and dike injection, however, earthquakes tend to concentrate around the dike and follow its propagation path, resulting in an earthquake swarm characterised by a number of earthquakes of similar magnitudes. I distinguish between two basic processes by which propagating dikes induce earthquakes. One is due to stress concentration in the process zone at the tip of the dike; the other relates to stresses induced in the walls and surrounding rocks on either side of the dike. As to the first process, some earthquakes generated at the dike tip are related to pure extension fracturing as the tip advances and the dike path forms. Formation of pure extension fractures normally induces non-double-couple earthquakes. There is also shear fracturing in the process zone, however, particularly normal faulting, which produces double-couple earthquakes. The second process relates primarily to slip on existing fractures in the host rock induced by the driving pressure of the propagating dike. Such pressures easily reach 5-20 MPa and induce compressive and shear stresses in the adjacent host rock, which already contains numerous fractures (mainly joints) of different attitudes. In piles of lava flows or sedimentary beds the original joints are primarily vertical and horizontal. Similarly, the contacts between the layers/beds are originally horizontal. As the layers/beds become buried, the joints and contacts become gradually tilted, so that they become oblique to the horizontal compressive stress induced by the driving pressure of the (vertical) dike. Also, most of the hexagonal (or pentagonal) columnar joints in the lava flows are, from the beginning, oblique to an intrusive sheet of any attitude. Consequently, the joints and contacts function as potential shear

13. [The number of deaths by suicide after the Great East Japan Earthquake based on demographic statistics in the coastal and non-coastal areas of Iwate, Miyagi, and Fukushima prefectures].

PubMed

Masaki, Naoko; Hashimoto, Shuji; Kawado, Miyuki; Ojima, Toshiyuki; Takeshima, Tadashi; Matsubara, Miyuki; Mitoku, Kazuko; Ogata, Yukiko

2018-01-01

Objective: The number of deaths by suicide after the Great East Japan Earthquake was surveyed based on demographic statistics. In particular, this study examined whether or not there were excess deaths due to suicide (excluding people who were injured in the earthquake) after the Great East Japan Earthquake disaster. This examination surveyed municipalities in coastal and non-coastal areas of Iwate, Miyagi, and Fukushima prefectures (referred to below as the "three prefectures"). Methods: The demographic statistics questionnaire survey information supplied under Article 33 of the Statistics Act (Ministry of Health, Labour and Welfare's published statistics Vol. 0925 No. 4, September 25th, 2014) was used as the basic data, with particular reference to information on deaths from January 1st, 2010 to March 31st, 2013. The information obtained included the date of death, the municipality where the address of the deceased was registered, the gender of the deceased, age at the time of death, and cause-of-death codes (International Classification of Diseases, 10th revision: ICD-10). Additionally, information was gathered about the population based on the resident register from 2009 to 2013 and the 2010 National Census; the number of deaths by suicide was then totalled by period and area. The areas were classified as municipalities within the three prefectures and those located elsewhere, using the municipality where the address of the deceased was registered. Results: The SMR for suicides did not show a tendency to increase for coastal or non-coastal areas throughout the two-year period after the earthquake disaster (from March 2011 to February 2013). The SMR for the three prefectures 0-1 years after the disaster, compared with the year before the disaster, was 0.92, and for 1-2 years after the disaster it was 0.93. Both these values were significantly low. Looking at both the non-coastal and coastal areas of each of the three prefectures, the SMR for suicides
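The study's central statistic can be sketched numerically. The snippet below computes a standardised mortality ratio (SMR) from illustrative figures chosen only to reproduce the reported value of 0.92; the rate and person-year numbers are assumptions, not the study's data.

```python
# Standardised mortality ratio (SMR): observed post-disaster suicides
# divided by the number expected from the pre-disaster rate.
# All figures below are illustrative assumptions, not the study's data.
observed_suicides = 920
pre_disaster_rate = 25.0 / 100_000   # suicides per person-year (assumed)
person_years = 4_000_000             # population-time at risk (assumed)

expected_suicides = pre_disaster_rate * person_years  # 1000.0
smr = observed_suicides / expected_suicides
print(round(smr, 2))  # -> 0.92, i.e. fewer suicides than expected
```

An SMR below 1 with a confidence interval excluding 1 is what the abstract calls "significantly low".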

14. Earthquake Facts

MedlinePlus

... recordings of large earthquakes, scientists built large spring-pendulum seismometers in an attempt to record the long- ... are moving away from one another. The first “pendulum seismoscope” to measure the shaking of the ground ...

15. Earthquake watch

Hill, M.

1976-01-01

When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response, held in San Francisco and called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, and representatives of the news media.

16. [Earthquakes--a historical review, environmental and health effects, and health care measures].

PubMed

Nola, Iskra Alexandra; Doko Jelinić, Jagoda; Žuškin, Eugenija; Kratohvil, Mladen

2013-06-01

Earthquakes are natural disasters that can occur at any time, regardless of the location. Their frequency is higher in the Circum-Pacific and Mediterranean/Trans-Asian seismic belts. A number of sophisticated methods define their magnitude using the Richter scale and their intensity using the Mercalli-Cancani-Sieberg scale. Recorded data show a number of devastating earthquakes that have killed many people and changed the environment dramatically. Croatia is located in a seismically active area that has endured a series of historical earthquakes, among which several occurred in the Zagreb area. The consequences of an earthquake depend mostly on the population density and the seismic resistance of buildings in the affected area. Environmental consequences often include air, water, and soil pollution. The effects of this kind of pollution can have long-term health effects. The most dramatic health consequences result from the demolition of buildings. Therefore, quick and efficient aid depends on well-organized health professionals as well as on the readiness of the civil defence, fire department, and Mountain Rescue Service members. Good coordination among these services can save many lives. Public health interventions must include effective control measures in the environment as secondary prevention methods for health problems caused by unfavourable environmental factors. The identification and control of long-term hazards can reduce chronic health effects. The reduction of earthquake-induced damage includes setting priorities in building seismically safe buildings.

17. The CATDAT damaging earthquakes database

NASA Astrophysics Data System (ADS)

Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

2011-08-01

The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies from, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17,000 sources of information have been utilised, primarily in the last few years, to present data from over 12,200 damaging earthquakes historically, with over 7,000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, and the trend should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to allow comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

18. Earthquakes for Kids

MedlinePlus

... across a fault to learn about past earthquakes. Science Fair Projects A GPS instrument measures slow movements of the ground. Become an Earthquake Scientist Cool Earthquake Facts Today in Earthquake History A scientist stands in ...

19. 100 years after the Marsica earthquake: contribute of outreach activities

NASA Astrophysics Data System (ADS)

D'Addezio, Giuliana; Giordani, Azzurra; Valle, Veronica; Riposati, Daniela

2015-04-01

Many outreach events have been proposed by the scientific community to celebrate the centenary of the January 13, 1915 earthquake that devastated the Marsica territory, located in the Central Apennines. The Laboratorio Divulgazione Scientifica e Attività Museali of the Istituto Nazionale di Geofisica e Vulcanologia (INGV's Laboratory for Outreach and Museum Activities) in Rome has realised an interactive exhibition in the Castello Piccolomini, Celano (AQ), to retrace the many aspects of the earthquake disaster in a region such as Abruzzo, affected by several destructive earthquakes during its history. The initiatives represent an ideal opportunity for the development of new programs of communication and training on seismic risk, and to spread the culture of prevention. The INGV is accredited with the Servizio Civile Nazionale (National Civic Service), and volunteers have been involved in the project "Science and Outreach: a comprehensive approach to the divulgation of knowledge of Earth Sciences" since 2014. In this context, volunteers had the opportunity to contribute fully to the exhibition, in particular promoting and realising two panels concerning the social and environmental consequences of the Marsica earthquake. By describing the serious consequences of the earthquake, we may raise awareness about natural hazards and about the only effective action for earthquake defence: building with anti-seismic criteria. After studies and research conducted in libraries and via the web, two themes were developed: the serious problem of orphans and the difficult reconstruction. Heavy snowfalls and the presence of wolves coming from the high and wild surrounding mountains complicated the scenario and slowed the rescue of the affected populations. It is important to underline that the earthquake was not the only devastating event in the country in 1915; another dramatic event was, in fact, the First World War. Whole families died, and the still-alive infants and

20. The physics of an earthquake

NASA Astrophysics Data System (ADS)

McCloskey, John

2008-03-01

The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.

1. Spatio-temporal Variations of Characteristic Repeating Earthquake Sequences along the Middle America Trench in Mexico

NASA Astrophysics Data System (ADS)

Dominguez, L. A.; Taira, T.; Hjorleifsdottir, V.; Santoyo, M. A.

2015-12-01

Repeating earthquake sequences are sets of events that are thought to rupture the same area on the plate interface and thus produce nearly identical waveforms. We systematically analyzed seismic records from 2001 through 2014 to identify repeating earthquakes with highly correlated waveforms occurring along the subduction zone of the Cocos plate. Using the correlation coefficient (cc) and spectral coherency (coh) of the vertical components as selection criteria, we found a set of 214 sequences whose waveforms exceed cc ≥ 95% and coh ≥ 95%. Spatial clustering along the trench shows large variations in repeating-earthquake activity. In particular, the rupture zone of the M8.1 1985 earthquake shows an almost complete absence of characteristic repeating earthquakes, whereas the Guerrero Gap zone and the segment of the trench close to the Guerrero-Oaxaca border show a significantly larger number of repeating-earthquake sequences. Furthermore, temporal variations associated with stress changes due to major earthquakes reveal episodes of unlocking and healing of the interface. Understanding the different components that control the location and recurrence time of characteristic repeating sequences is a key factor in pinpointing areas where large megathrust earthquakes may nucleate, and consequently in improving seismic hazard assessment.
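The waveform-similarity screening described above can be sketched with a zero-lag normalised cross-correlation. The synthetic master/candidate signals below are illustrative stand-ins for the study's actual seismograms; only the 0.95 threshold mirrors the stated criterion.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "waveforms": a damped sinusoid as the master event, and a
# candidate repeater that is the same signal plus a little noise.
t = np.linspace(0.0, 10.0, 1000)
master = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t)
candidate = master + 0.05 * rng.standard_normal(t.size)

# Normalised correlation coefficient at zero lag, as used to screen
# candidate repeating-earthquake pairs (the study requires cc >= 0.95).
cc = np.corrcoef(master, candidate)[0, 1]
print(f"cc = {cc:.3f}")
```

In practice the correlation is computed over all lags (and the spectral coherency checked as well); this sketch shows only the zero-lag screening step.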

2. Earthquakes in El Salvador: a descriptive study of health concerns in a rural community and the clinical implications, part I.

PubMed

Woersching, Joanna C; Snyder, Audrey E

2003-01-01

This is the first article in a series that evaluates the health concerns of people living in a Salvadoran rural community after major earthquakes. Part I reviews the background, methods, and results of post-earthquake conditions with regard to healthcare, access to healthcare, housing, food, water and sanitation. Part II reviews the implications of these results and recommendations for improvements within the community. Part III investigates the psychosocial and mental health consequences of the earthquakes and provides suggestions for improved mental health awareness, assessment, and intervention. El Salvador experienced 2 major earthquakes in January and February 2001. This study evaluates the effects of the earthquakes on the health practices in the rural town of San Sebastian. The research was conducted using a convenience-sample survey of subjects affected by the earthquakes. The sample included 594 people within 100 households. The 32-question survey assessed post-earthquake conditions in the areas of health care and access to care, housing, food and water, and sanitation. Communicable diseases affected a number of family members. After the earthquakes, 38% of households reported new injuries, and 79% reported acute exacerbations of chronic illness. Rural inhabitants were 30% more likely to have an uninhabitable home than were urban inhabitants. Concerns included safe housing, water purification, and waste elimination. The findings indicate a need for greater public health awareness and community action to adapt living conditions after a disaster and prevent the spread of communicable disease.

3. Insignificant solar-terrestrial triggering of earthquakes

Love, Jeffrey J.; Thomas, Jeremy N.

2013-01-01

We examine the claim that solar-terrestrial interaction, as measured by sunspots, solar wind velocity, and geomagnetic activity, might play a role in triggering earthquakes. We count the number of earthquakes having magnitudes that exceed chosen thresholds in calendar years, months, and days, and we order these counts by the corresponding rank of annual, monthly, and daily averages of the solar-terrestrial variables. We measure the statistical significance of the difference between the earthquake-number distributions below and above the median of the solar-terrestrial averages by χ2 and Student's t tests. Across a range of earthquake magnitude thresholds, we find no consistent and statistically significant distributional differences. We also introduce time lags between the solar-terrestrial variables and the number of earthquakes, but again no statistically significant distributional difference is found. We cannot reject the null hypothesis of no solar-terrestrial triggering of earthquakes.
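The below/above-median comparison the authors describe can be sketched as follows. The yearly earthquake counts and sunspot numbers are synthetic stand-ins, and a Welch t statistic is computed directly with NumPy as a simplified proxy for the paper's χ2 and Student's t procedures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 50 yearly earthquake counts above a magnitude
# threshold, and 50 yearly mean sunspot numbers (both synthetic).
quakes = rng.poisson(lam=12.0, size=50)
sunspots = rng.uniform(0.0, 200.0, size=50)

# Partition the years by the median of the solar-terrestrial variable.
lo = quakes[sunspots < np.median(sunspots)]
hi = quakes[sunspots >= np.median(sunspots)]

# Welch's t statistic for the difference in mean counts; values near 0
# are consistent with the paper's null result (no solar triggering).
se = np.sqrt(lo.var(ddof=1) / lo.size + hi.var(ddof=1) / hi.size)
t_stat = (lo.mean() - hi.mean()) / se
print(f"t = {t_stat:.3f}")
```

The same split-and-compare logic applies with time lags inserted between the solar-terrestrial series and the counts.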

4. Understanding Earthquakes

ERIC Educational Resources Information Center

Davis, Amanda; Gray, Ron

2018-01-01

December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

5. Earthquake Testing

NASA Technical Reports Server (NTRS)

1979-01-01

During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

6. PAGER-CAT: A composite earthquake catalog for calibrating global fatality models

Allen, T.I.; Marano, K.D.; Earle, P.S.; Wald, D.J.

2009-01-01

highly uncertain, particularly the casualty numbers, which must be regarded as estimates rather than firm numbers for many earthquakes. Consequently, we encourage contributions from the seismology and earthquake engineering communities to further improve this resource via the Wikipedia page and personal communications, for the benefit of the whole community.

7. Attention bias in earthquake-exposed survivors: an event-related potential study.

PubMed

Zhang, Yan; Kong, Fanchang; Han, Li; Najam Ul Hasan, Abbasi; Chen, Hong

2014-12-01

The Chinese Wenchuan earthquake, which happened on the 12th of May, 2008, may have left deep invisible scars in individuals. China has a large number of children and adolescents, who tend to be the most vulnerable because they are at an early stage of human development, and possible post-traumatic psychological distress may have life-long consequences. Trauma survivors without post-traumatic stress disorder (PTSD) have received little attention in previous studies, especially in event-related potential (ERP) studies. We compared the attention bias to threat stimuli between an earthquake-exposed group and a control group in a masked version of the dot-probe task. A target probe presented at the spatial location of earthquake-related words constituted a congruent trial, while one at the location of neutral words constituted an incongruent trial. Thirteen earthquake-exposed middle school students without PTSD and 13 matched controls were included in this investigation. The earthquake-exposed group showed significantly faster RTs to congruent trials than to incongruent trials. The earthquake-exposed group produced significantly shorter C1 and P1 latencies and larger C1, P1 and P2 amplitudes than the control group. In particular, enhanced P1 amplitude to threat stimuli was observed in the earthquake-exposed group. These findings agree with the prediction that earthquake-exposed survivors have an attention bias to threat stimuli. The traumatic event had a much greater effect on earthquake-exposed survivors, even though they showed no PTSD symptoms, than on control individuals. These results provide neurobiological evidence for effective intervention and prevention of post-traumatic mental problems.

8. Fractal dynamics of earthquakes

SciT

Bak, P.; Chen, K.

1995-05-01

Many objects in nature, from mountain landscapes to electrical breakdown and turbulence, have a self-similar fractal spatial structure. It seems obvious that to understand the origin of self-similar structures, one must understand the nature of the dynamical processes that created them: temporal and spatial properties must necessarily be completely interwoven. This is particularly true for earthquakes, which have a variety of fractal aspects. The distribution of energy released during earthquakes is given by the Gutenberg-Richter power law. The distribution of epicenters appears to be fractal with dimension D ~ 1-1.3. The number of aftershocks decays as a function of time according to the Omori power law. There have been several attempts to explain the Gutenberg-Richter law by starting from a fractal distribution of faults or stresses. But this is a hen-and-egg approach: to explain the Gutenberg-Richter law, one assumes the existence of another power law, the fractal distribution. The authors present results of a simple stick-slip model of earthquakes, which evolves to a self-organized critical state. Emphasis is on demonstrating that empirical power laws for earthquakes indicate that the Earth's crust is at the critical state, with no typical time, space, or energy scale. Of course the model is tremendously oversimplified; however, in analogy with equilibrium phenomena, they do not expect criticality to depend on details of the model (universality).
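The Gutenberg-Richter power law mentioned above is usually quantified by its b-value. A minimal sketch on a synthetic catalogue: magnitudes above a completeness threshold Mc follow an exponential distribution, and Aki's maximum-likelihood formula b = log10(e)/(mean(M) − Mc) recovers the slope.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gutenberg-Richter: log10 N(>=M) = a - b*M, so magnitudes above a
# completeness threshold Mc are exponentially distributed with rate
# beta = b * ln(10). Here we draw a synthetic catalogue with b = 1.
Mc, b_true = 4.0, 1.0
beta = b_true * np.log(10.0)
mags = Mc + rng.exponential(scale=1.0 / beta, size=20_000)

# Aki (1965) maximum-likelihood estimate of the b-value.
b_est = np.log10(np.e) / (mags.mean() - Mc)
print(f"b ≈ {b_est:.2f}")
```

With 20,000 synthetic events the estimate lands very close to the true b = 1, illustrating why the G-R slope is so robustly measurable from real catalogues.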

9. Update earthquake risk assessment in Cairo, Egypt

NASA Astrophysics Data System (ADS)

Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

2017-07-01

The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years on, one of the most painful events etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3,000 families who lost their homes). Nowadays, the most frequent and important question that arises is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socioeconomic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the assessment clearly indicates that the losses and damage could be two or three times greater in Cairo than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) are at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

10. Loss Estimations due to Earthquakes and Secondary Technological Hazards

NASA Astrophysics Data System (ADS)

Frolova, N.; Larionov, V.; Bonnin, J.

2009-04-01

Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measure plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. Mathematical models for shaking-intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemical-hazardous facilities are considered, as used in geographical information systems designed for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of scenario-earthquake consequence estimation and individual seismic risk assessment taking into account secondary technological hazards at regional and urban levels. Individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of the natural and/or technological disaster.
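The definition of individual risk as a mathematical expectation reduces to a simple product. All numbers in the sketch below are illustrative assumptions, not values from the paper.

```python
# Individual seismic risk as defined above: the annual probability of
# death for a resident, via the expectation of social losses.
# All numbers are illustrative assumptions, not the paper's data.
annual_event_prob = 0.01      # chance of the scenario earthquake per year
expected_fatalities = 250.0   # mean fatalities if the event occurs
population = 100_000          # inhabitants of the settlement

# E[deaths per year] / population = per-person annual death probability.
individual_risk = annual_event_prob * expected_fatalities / population
print(individual_risk)  # -> 2.5e-05 per inhabitant per year
```

Summing such terms over all hazardous events (earthquake plus each secondary technological accident) gives the combined individual risk for the territory.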

11. The mechanism of earthquake

NASA Astrophysics Data System (ADS)

Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

2018-03-01

strength of crust rocks: The gravitational pressure can initiate the elasticity-plasticity transition in crust rocks. By calculating the depth dependence of the elasticity-plasticity transition and analyzing actual conditions, the behavior of crust rocks can be categorized into three typical zones: elastic, partially plastic, and fully plastic. As the proportion of the plastic portion reaches about 10% in the partially plastic zone, plastic interconnection may occur, and the variation of shear strength in rocks is mainly characterized by plastic behavior. The equivalent coefficient of friction for plastic slip is smaller than that for brittle fracture by an order of magnitude or more; thus the shear strength of rocks under plastic sliding is much less than that under brittle breaking. Moreover, with increasing depth a number of other factors can further reduce the shear yield strength of rocks. On the other hand, since an earthquake is a large-scale damage process, the rock breaking must occur along the weakest path. Therefore, the actual fracture strength of rocks in a shallow earthquake is assuredly lower than the average shear strength of rocks as generally observed. The typical distributions of the average strength and actual fracture strength in crustal rocks as functions of depth are schematically illustrated. (3) The conditions for earthquake occurrence and mechanisms of earthquake: An earthquake will lead to volume expansion, and volume expansion must break through the obstacles. The condition for an earthquake to occur is that the tectonic force exceeds the sum of the fracture strength of the rock, the friction force of the fault boundary, and the resistance from obstacles. Therefore, a shallow earthquake is characterized by plastic sliding of rocks that break through the obstacles. Accordingly, four possible patterns for shallow earthquakes are put forward. Deep-focus earthquakes are believed to result from a wide-range rock flow that breaks the jam. Both shallow

12. Connecting slow earthquakes to huge earthquakes.

PubMed

Obara, Kazushige; Kato, Aitaro

2016-07-15

Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

13. 2010 Chile Earthquake Aftershock Response

NASA Astrophysics Data System (ADS)

Barrientos, Sergio

2010-05-01

1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded by deploying a total of 140 portable seismic stations to record aftershocks. Combined with the Chilean permanent seismic network, 180 stations are now in operation in the area, recording continuously at 100 samples per second. The equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone, which will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit its data as soon as possible in standard (miniSEED) format with accompanying metadata to the IRIS DMC, where the data will be merged into a combined data set and made available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake as well as a model for future aftershock deployments around the world.

14. Adolescent Fertility--Risks and Consequences. George Washington University, Department of Medical and Public Affairs Population Reports, Series J, Number 10, July 1976. Family Planning Programs.

ERIC Educational Resources Information Center

Hunt, William B., II

Throughout the world pregnancy and childbearing are occurring at younger ages than in the past, resulting in adverse health, demographic and social consequences. Postponing first births until age 20 or later would significantly reduce maternal and infant mortality and morbidity, slow population growth, and contribute to improvements in the quality…

15. Urban Earthquakes - Reducing Building Collapse Through Education

NASA Astrophysics Data System (ADS)

Bilham, R.

2004-12-01

Fatalities from earthquakes rose from roughly 6,000 to 9,000 per year in the past decade, yet the ratio of earthquake fatalities to instantaneous population continues to fall. Since 1950 the ratio has declined worldwide by a factor of three, but in some countries it has changed little. For example, in Iran 1 in 3,000 people can expect to die in an earthquake, a rate that has not changed significantly since 1890. Fatalities from earthquakes remain high in those countries that have traditionally suffered frequent large earthquakes (Turkey, Iran, Japan, and China), suggesting that the exposure time of recently increased urban populations in other countries may be too short to have intersected earthquakes with long recurrence intervals. This in turn suggests that disasters of unprecedented size (more than 1 million fatalities) will occur when future large earthquakes strike close to megacities. However, population growth is most rapid in cities of fewer than 1 million people in developing nations, where the financial ability to implement earthquake-resistant construction methods is limited. Given that structural collapse can often be traced to ignorance of the forces at work in an earthquake, the future collapse of buildings presently under construction could be much reduced if contractors, builders, and occupants were educated in the principles of earthquake-resistant assembly. Education of builders who are tempted to cut assembly costs is likely to be more cost-effective than material aid.

16. Twitter earthquake detection: Earthquake monitoring in a social world

Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

2011-01-01

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
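The short-term-average/long-term-average idea above can be sketched on a per-minute tweet-count series. This is a minimal illustration, not the USGS implementation: the window lengths, threshold, and synthetic series are all assumptions.

```python
# Minimal STA/LTA sketch applied to a tweet-frequency time series.
# A trigger fires when the short-term average of recent counts greatly
# exceeds the long-term background average.

def sta_lta_triggers(counts, n_sta=2, n_lta=20, threshold=5.0):
    """Return indices where STA/LTA of per-minute tweet counts exceeds threshold."""
    triggers = []
    for i in range(n_lta, len(counts)):
        sta = sum(counts[i - n_sta:i]) / n_sta   # short window just before i
        lta = sum(counts[i - n_lta:i]) / n_lta   # long background window
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background of ~1 "earthquake" tweet per minute, then a burst
# in the minutes after a widely felt event.
series = [1] * 30 + [40, 80, 60] + [5] * 10
print(sta_lta_triggers(series))  # [31, 32, 33]
```

A production detector would also need de-duplication and a refractory period so one burst yields a single detection rather than several consecutive triggers.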

17. Evidence for Ancient Mesoamerican Earthquakes

NASA Astrophysics Data System (ADS)

Kovach, R. L.; Garcia, B.

2001-12-01

Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage that should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms, and tableros; toppling of columns; and deformation, settling, and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34%g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Maya Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage at the above sites. As a consequence this earthquake may have accelerated the

18. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

NASA Astrophysics Data System (ADS)

Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

2018-01-01

Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and to the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say M_λ ≥ 4, and one small, say M_σ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these interevent counts is obtained. The earthquake potential score (EPS) is then defined from the number of small earthquakes that have occurred since the last large earthquake: the point where this count falls on the cumulative probability distribution of interevent counts gives the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus it is not necessary to decluster aftershocks, and the results remain applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling: the increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of M_σ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of M_λ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016 and the subsequent reduction in seismicity associated with a reduction in fluid injections. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
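The EPS computation described above reduces to evaluating an empirical cumulative distribution. A hedged sketch, using synthetic interevent counts rather than the Oklahoma or Geysers catalogs:

```python
# Sketch of the earthquake potential score (EPS): the fraction of historical
# small-earthquake interevent counts (between pairs of large events) that are
# less than or equal to the count accumulated since the last large event.

def eps(interevent_counts, count_since_last_large):
    """Empirical CDF of interevent counts evaluated at the current count."""
    n_at_or_below = sum(1 for c in interevent_counts
                        if c <= count_since_last_large)
    return n_at_or_below / len(interevent_counts)

# Synthetic numbers of small (say M >= 2) events observed between
# successive large (say M >= 4) events.
history = [12, 30, 45, 60, 80, 110, 150, 200]
print(eps(history, 100))  # 0.625: 5 of 8 historical counts are <= 100
```

An EPS near 1 means the region has accumulated more small events since the last large one than in almost all historical interevent intervals, i.e. it is far along its typical "natural time" cycle.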

19. Global earthquake fatalities and population

Holzer, Thomas L.; Savage, James C.

2013-01-01

Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
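The nonstationary-Poisson claim above (rate proportional to population) implies that an expected century count scales with the century's mean population. The back-of-envelope sketch below uses round illustrative population values and simple ratio scaling, so it does not reproduce the paper's fitted 8.7±3.3; it only shows the direction of the effect.

```python
# Illustrative scaling: if the annual rate of catastrophic earthquakes is
# proportional to world population, the expected number in a century scales
# with that century's mean population.

def expected_events(observed_prev, mean_pop_prev, mean_pop_next):
    """Scale a previous century's event count by the population ratio."""
    return observed_prev * mean_pop_next / mean_pop_prev

# 4 earthquakes with >100,000 deaths observed in the 20th century; assume
# (illustratively) mean world populations of ~2.5 and ~8.5 billion for the
# 20th and 21st centuries.
print(expected_events(4, 2.5e9, 8.5e9))  # 13.6
```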

20. Filling a gap: Public talks about earthquake preparation and the 'Big One'

NASA Astrophysics Data System (ADS)

Reinen, L. A.

2013-12-01

Residents of southern California are aware that they live in a seismically active area, and earthquake drills have trained us to Duck-Cover-Hold On. While many of my acquaintances are familiar with what to do during an earthquake, few have made preparations for living with the aftermath of a large one. The ShakeOut Scenario (Jones et al., USGS Open File Report 2008-1150) describes the physical, social, and economic consequences of a plausible M7.8 earthquake on the southernmost San Andreas Fault. While not detailing an actual event, the ShakeOut Scenario illustrates how individual and community preparation may mitigate the after-effects of a major earthquake in the region. To address the gap between earthquake drills and preparation in my community, for the past several years I have been giving public talks to promote understanding of the science behind the earthquake predictions; why individual, as well as community, preparation is important; and ways in which individuals can prepare their home and work environments. The public presentations occur in an array of venues, including elementary school and college classes, a community forum linked with the annual ShakeOut Drill, and local businesses including the local microbrewery. While based on the same fundamental information, each presentation is tailored to its audience and setting. Assessment of the impact of these talks is primarily anecdotal and includes an increase in the number of venues requesting them, repeat invitations, and comments from audience members (sometimes months or years after a talk). I will present elements of these talks, the background information used, and examples of how they have effected change in the earthquake preparedness of audience members. Discussion and suggestions (particularly about effective means of conducting rigorous long-term assessment) are strongly encouraged.

1. A smartphone application for earthquakes that matter!

NASA Astrophysics Data System (ADS)

Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

2014-05-01

level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many felt earthquakes as possible, regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt. All together, we estimate the number of detected felt earthquakes at around 1,000 per year, compared with the 35,000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service @LastQuake. We will present the identification process of the earthquakes that matter, the smartphone application itself (to be released in May), and its future evolutions.

2. Earthquakes in Arkansas and vicinity 1699-2010

Dart, Richard L.; Ausbrooks, Scott M.

2011-01-01

This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

3. Fault lubrication during earthquakes.

PubMed

Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

2011-03-24

The determination of rock friction at seismic slip rates (about 1 m s⁻¹) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s⁻¹. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

4. Induced earthquake magnitudes are as large as (statistically) expected

Van Der Elst, Nicholas; Page, Morgan T.; Weiser, Deborah A.; Goebel, Thomas; Hosseini, S. Mehran

2016-01-01

A major question for the hazard posed by injection-induced seismicity is how large induced earthquakes can be. Are their maximum magnitudes determined by injection parameters or by tectonics? Deterministic limits on induced earthquake magnitudes have been proposed based on the size of the reservoir or the volume of fluid injected. However, if induced earthquakes occur on tectonic faults oriented favorably with respect to the tectonic stress field, then they may be limited only by the regional tectonics and connectivity of the fault network. In this study, we show that the largest magnitudes observed at fluid injection sites are consistent with the sampling statistics of the Gutenberg-Richter distribution for tectonic earthquakes, assuming no upper magnitude bound. The data pass three specific tests: (1) the largest observed earthquake at each site scales with the log of the total number of induced earthquakes, (2) the order of occurrence of the largest event is random within the induced sequence, and (3) the injected volume controls the total number of earthquakes rather than the total seismic moment. All three tests point to an injection control on earthquake nucleation but a tectonic control on earthquake magnitude. Given that the largest observed earthquakes are exactly as large as expected from the sampling statistics, we should not conclude that these are the largest earthquakes possible. Instead, the results imply that induced earthquake magnitudes should be treated with the same maximum magnitude bound that is currently used to treat seismic hazard from tectonic earthquakes.
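The sampling-statistics argument above, that the largest observed magnitude grows with the log of the total number of induced events, can be checked with a small simulation. This is an illustrative sketch with assumed parameters (b = 1, minimum magnitude 1.0), not the authors' analysis of injection-site catalogs.

```python
# Simulate the Gutenberg-Richter sampling statistics: draw N magnitudes from
# an unbounded exponential GR law and watch the expected maximum grow by
# roughly one magnitude unit per tenfold increase in N.

import math
import random

def gr_magnitude(m_min=1.0, b=1.0):
    """Draw one magnitude via inverse-CDF sampling of P(M > m) = 10**(-b*(m - m_min))."""
    return m_min - (1.0 / b) * math.log10(1.0 - random.random())

def largest_of(n):
    """Largest magnitude in a catalog of n GR-distributed events."""
    return max(gr_magnitude() for _ in range(n))

random.seed(0)
for n in (100, 1000, 10000):
    avg_max = sum(largest_of(n) for _ in range(200)) / 200  # average over trials
    print(n, round(avg_max, 2))
```

With b = 1 the expected maximum is approximately m_min + log10(N) plus a small constant, which is the scaling test (1) in the abstract.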

5. Identification of Deep Earthquakes

DTIC Science & Technology

2010-09-01

discriminants that will reliably separate small, crustal earthquakes (magnitudes less than about 4 and depths less than about 40 to 50 km) from small...characteristics on discrimination plots designed to separate nuclear explosions from crustal earthquakes. Thus, reliably flagging these small, deep events is...Further, reliably identifying subcrustal earthquakes will allow us to eliminate deep events (previously misidentified as crustal earthquakes) from

6. New ideas about the physics of earthquakes

NASA Astrophysics Data System (ADS)

Rundle, John B.; Klein, William

1995-07-01

It may be no exaggeration to claim that this most recent quadrennium has seen more controversy, and thus more progress, in understanding the physics of earthquakes than any in recent memory. The most interesting development has clearly been the emergence of a large community of condensed matter physicists around the world who have begun working on the problem of earthquake physics. These scientists bring to the study of earthquakes an entirely new viewpoint, grounded in the physics of nucleation and critical phenomena in thermal, magnetic, and other systems. Moreover, a surprising technology transfer from geophysics to other fields has been made possible by the realization that models originally proposed to explain self-organization in earthquakes can also be used to explain similar processes in problems as disparate as brain dynamics in neurobiology (Hopfield, 1994) and charge density waves in solids (Brown and Gruner, 1994). An entirely new sub-discipline is emerging, focused around the development and analysis of large-scale numerical simulations of the dynamics of faults. At the same time, intriguing new laboratory and field data, together with insightful physical reasoning, have led to significant advances in our understanding of earthquake source physics. As a consequence, we can anticipate substantial improvement in our ability to understand the nature of earthquake occurrence. Moreover, while much research in the area of earthquake physics is fundamental in character, the results have many potential applications (Cornell et al., 1993) in the areas of earthquake risk and hazard analysis, and seismic zonation.

7. Identified EM Earthquake Precursors

NASA Astrophysics Data System (ADS)

Jones, Kenneth, II; Saxton, Patrick

2014-05-01

Many attempts have been made to determine a sound forecasting method regarding earthquakes and, in turn, warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011 and outside the NE Texas town of Timpson in February 2013. The antennae have mobility and observations were noted for

8. Triggering of repeating earthquakes in central California

Wu, Chunquan; Gomberg, Joan; Ben-Naim, Eli; Johnson, Paul

2014-01-01

Dynamic stresses carried by transient seismic waves have been found capable of triggering earthquakes instantly in various tectonic settings. Delayed triggering may be even more common, but the mechanisms are not well understood. Catalogs of repeating earthquakes, earthquakes that recur repeatedly at the same location, provide ideal data sets to test the effects of transient dynamic perturbations on the timing of earthquake occurrence. Here we employ a catalog of 165 families containing ~2500 total repeating earthquakes to test whether dynamic perturbations from local, regional, and teleseismic earthquakes change recurrence intervals. The distance to the earthquake generating the perturbing waves is a proxy for the relative potential contributions of static and dynamic deformations, because static deformations decay more rapidly with distance. Clear changes followed the nearby 2004 Mw6 Parkfield earthquake, so we study only repeaters prior to its origin time. We apply a Monte Carlo approach to compare the observed number of shortened recurrence intervals following dynamic perturbations with the distribution of this number estimated for randomized perturbation times. We examine the comparison for a series of dynamic stress peak amplitude and distance thresholds. The results suggest a weak correlation between dynamic perturbations in excess of ~20 kPa and shortened recurrence intervals, for both nearby and remote perturbations.
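The Monte Carlo test described above, comparing the observed number of shortened recurrence intervals against the same count for randomized perturbation times, can be sketched on synthetic data. Everything below is invented for illustration (Gaussian intervals, a median-based definition of "shortened"); the study's catalog, stress thresholds, and exact statistic are not reproduced.

```python
# Sketch of a Monte Carlo null test: is the number of "shortened" recurrence
# intervals following perturbations larger than expected if perturbation
# times were random?

import random

def count_shortened(intervals, perturbed_flags):
    """Count perturbed intervals that are shorter than the overall median."""
    median = sorted(intervals)[len(intervals) // 2]
    return sum(1 for iv, p in zip(intervals, perturbed_flags) if p and iv < median)

random.seed(42)
intervals = [random.gauss(100.0, 10.0) for _ in range(200)]
flags = [i % 3 == 0 for i in range(200)]       # synthetic "perturbed" intervals
for i, perturbed in enumerate(flags):
    if perturbed:
        intervals[i] -= 15.0                   # built-in shortening effect

observed = count_shortened(intervals, flags)

# Null distribution: shuffle the perturbation assignment many times.
null = []
for _ in range(1000):
    random.shuffle(flags)
    null.append(count_shortened(intervals, flags))
p_value = sum(1 for c in null if c >= observed) / len(null)
print(p_value < 0.05)  # True: the built-in shortening is detected
```

The same machinery, with a distance or peak-dynamic-stress threshold selecting which intervals count as "perturbed", gives the family of tests the abstract describes.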

9. Investigating Lushan Earthquake Victims' Individual Behavior Response and Rescue Organization.

PubMed

Kang, Peng; Lv, Yipeng; Deng, Qiangyu; Liu, Yuan; Zhang, Yi; Liu, Xu; Zhang, Lulu

2017-12-11

Research concerning the impact of earthquake victims' individual behavior and its association with earthquake-related injuries is lacking. This study examined this relationship along with the effectiveness of earthquake rescue measures. The six most severely destroyed townships during the Lushan earthquake were examined; 28 villages and three earthquake victims' settlement camp areas were selected as research areas. Inclusion criteria comprised living in Lushan county for a long time, living in Lushan county during the 2013 Lushan earthquake, and having one's home destroyed. Earthquake victims with an intellectual disability or communication problems were excluded. The earthquake victims (N = 5,165; male = 2,396) completed a questionnaire (response rate: 94.7%). Among them, 209 were injured (5.61%). Teachers (p < 0.0001, odds ratio (OR) = 3.33) and medical staff (p = 0.001, OR = 4.35) were more vulnerable to the earthquake than were farmers. Individual behavior, such as the first reaction after the earthquake and fear, was directly related to injuries. There is a clear connection between earthquake-related injury and individual behavioral characteristics. It is strongly suggested that victims receive mental health support from medical practitioners and the government to minimize negative effects. The initial reaction after an earthquake also played a vital role in victims' trauma; therefore, earthquake-related experience and education may prevent injuries. Self-aid and mutual help played key roles in emergency medical rescue efforts.

10. Earthquakes in Mississippi and vicinity 1811-2010

Dart, Richard L.; Bograd, Michael B.E.

2011-01-01

This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent states. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

11. Comparing population exposure to multiple Washington earthquake scenarios for prioritizing loss estimation studies

Wood, Nathan J.; Ratliff, Jamie L.; Schelling, John; Weaver, Craig S.

2014-01-01

Scenario-based, loss-estimation studies are useful for gauging potential societal impacts from earthquakes but can be challenging to undertake in areas with multiple scenarios and jurisdictions. We present a geospatial approach using various population data for comparing earthquake scenarios and jurisdictions to help emergency managers prioritize where to focus limited resources on data development and loss-estimation studies. Using 20 earthquake scenarios developed for the State of Washington (USA), we demonstrate how a population-exposure analysis across multiple jurisdictions based on Modified Mercalli Intensity (MMI) classes helps emergency managers understand and communicate where potential loss of life may be concentrated and where impacts may be more related to quality of life. Results indicate that certain well-known scenarios may directly impact the greatest number of people, whereas other, potentially lesser-known, scenarios impact fewer people but consequences could be more severe. The use of economic data to profile each jurisdiction’s workforce in earthquake hazard zones also provides additional insight on at-risk populations. This approach can serve as a first step in understanding societal impacts of earthquakes and helping practitioners to efficiently use their limited risk-reduction resources.
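At its core, the population-exposure comparison described above amounts to tallying population per Modified Mercalli Intensity class for each scenario. A minimal sketch under that assumption (the block structure and labels below are illustrative, not the authors' actual data model):

```python
from collections import defaultdict

def exposure_by_mmi(census_blocks):
    """Tally population per MMI class for one earthquake scenario.

    census_blocks: iterable of (mmi_class, population) pairs, one per
    census block intersected with the scenario's shaking footprint.
    """
    totals = defaultdict(int)
    for mmi_class, population in census_blocks:
        totals[mmi_class] += population
    return dict(totals)

# Illustrative blocks for a single scenario
blocks = [("VIII", 1200), ("VII", 500), ("VIII", 300), ("VI", 2500)]
print(exposure_by_mmi(blocks))  # {'VIII': 1500, 'VII': 500, 'VI': 2500}
```

Running the same tally for each of the 20 scenarios and each jurisdiction yields the comparison tables emergency managers can use to rank where loss-estimation studies are most needed.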

12. Brain scaling in mammalian evolution as a consequence of concerted and mosaic changes in numbers of neurons and average neuronal cell size

PubMed Central

Herculano-Houzel, Suzana; Manger, Paul R.; Kaas, Jon H.

2014-01-01

Enough species have now been subject to systematic quantitative analysis of the relationship between the morphology and cellular composition of their brain that patterns begin to emerge and shed light on the evolutionary path that led to mammalian brain diversity. Based on an analysis of the shared and clade-specific characteristics of 41 modern mammalian species in 6 clades, and in light of the phylogenetic relationships among them, here we propose that ancestral mammal brains were composed and scaled in their cellular composition like modern afrotherian and glire brains: with an addition of neurons that is accompanied by a decrease in neuronal density and very little modification in glial cell density, implying a significant increase in average neuronal cell size in larger brains, and the allocation of approximately 2 neurons in the cerebral cortex and 8 neurons in the cerebellum for every neuron allocated to the rest of brain. We also propose that in some clades the scaling of different brain structures has diverged away from the common ancestral layout through clade-specific (or clade-defining) changes in how average neuronal cell mass relates to numbers of neurons in each structure, and how numbers of neurons are differentially allocated to each structure relative to the number of neurons in the rest of brain. Thus, the evolutionary expansion of mammalian brains has involved both concerted and mosaic patterns of scaling across structures. This is, to our knowledge, the first mechanistic model that explains the generation of brains large and small in mammalian evolution, and it opens up new horizons for seeking the cellular pathways and genes involved in brain evolution. PMID:25157220

13. Seismicity map tools for earthquake studies

NASA Astrophysics Data System (ADS)

Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

2014-05-01

We report on the development of a new online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based, online platform (developed with PHP, JavaScript, and MySQL) and the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data using Google Maps and to plot various seismicity graphs. The tool box has been extended to draw line segments on the map, multiple straight lines horizontally and vertically, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake-cluster shift between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake-magnitude plots and earthquake-magnitude histograms, calculation of the 'b' value, etc. What is novel for the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as the platform allows calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
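The abstract mentions calculating the Gutenberg-Richter 'b' value among the platform's statistics. The estimator it uses is not stated; a common choice is Aki's maximum-likelihood formula, sketched here under that assumption (the small correction for magnitude binning is omitted):

```python
import math

def b_value_mle(magnitudes, completeness_mag):
    """Aki (1965) maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc).

    Only events at or above the completeness magnitude Mc are used;
    the binning correction (Mc - dM/2) is omitted for simplicity.
    """
    mags = [m for m in magnitudes if m >= completeness_mag]
    if not mags:
        raise ValueError("no events at or above the completeness magnitude")
    mean_excess = sum(m - completeness_mag for m in mags) / len(mags)
    return math.log10(math.e) / mean_excess

# Toy catalogue, complete above M 2.5
print(round(b_value_mle([2.5, 3.0, 3.5, 4.0], 2.5), 4))  # 0.5791
```

For a catalogue that actually follows the Gutenberg-Richter law with b near 1, the mean magnitude excess above Mc is about 1/(b ln 10) ≈ 0.43 magnitude units.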

14. Consequence Management Symposium

DTIC Science & Technology

2001-09-01

…logical agents and their effects was deemed essential for "first responders," including emergency medical and hospital practitioners.

15. Earthquakes: Predicting the unpredictable?

Hough, Susan E.

2005-01-01

The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are inherently unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988 but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

16. Earthquakes: hydrogeochemical precursors

Ingebritsen, Steven E.; Manga, Michael

2014-01-01

Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

17. Comparison of hypocentre parameters of earthquakes in the Aegean region

NASA Astrophysics Data System (ADS)

Özel, Nurcan M.; Shapira, Avi; Harris, James

2007-06-01

The Aegean Sea is one of the more seismically active areas in the Euro-Mediterranean region. The seismic activity in the Aegean Sea is monitored by a number of local agencies that contribute their data to the International Seismological Centre (ISC). Consequently, the ISC Bulletin may serve as a reliable reference for assessing the capabilities of local agencies to monitor moderate and low magnitude earthquakes. We have compared bulletins of the Kandilli Observatory and Earthquake Research Institute (KOERI) and the ISC for the period 1976-2003, which comprises the most complete data sets for both KOERI and ISC. The selected study area is the East Aegean Sea and West Turkey, bounded by latitude 35-41°N and longitude 24-29°E. The total number of events known to occur in this area during 1976-2003 is about 41,638. Seventy-two percent of those earthquakes were located by ISC and 75% were located by KOERI. As expected, epicentre-location discrepancies between ISC and KOERI solutions grow larger as we move away from the KOERI seismic network. Of the 22,066 earthquakes located by both ISC and KOERI, only 4% show a difference of 50 km or more. About 140 earthquakes show a discrepancy of more than 100 km. Focal-depth determinations differ mainly in the subduction zone along the Hellenic arc. Less than 2% of the events differ in their focal depth by more than 25 km. Yet, the location solutions of about 30 events differ by more than 100 km. Almost a quarter of the events listed in the ISC Bulletin are missed by KOERI, most of them occurring off the coast of Turkey, in the East Aegean. Based on the frequency-magnitude distributions, the KOERI Bulletin is complete for earthquakes with duration magnitudes Md > 2.7 (both located and assigned magnitudes), whereas the threshold magnitude for events with location and magnitude determinations by ISC is mb > 4.0. KOERI magnitudes seem to be poorly correlated with ISC magnitudes, suggesting relatively high uncertainty in the

18. Redefining Earthquakes and the Earthquake Machine

ERIC Educational Resources Information Center

Hubenthal, Michael; Braile, Larry; Taber, John

2008-01-01

The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

19. Important Earthquake Engineering Resources

Pacific Earthquake Engineering Research Center (PEER) listing of important earthquake engineering resources, including the American Concrete Institute, the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS), and the Consortium of Universities for Research in Earthquake Engineering.

20. 2016 National Earthquake Conference

Thank you to our Presenting Sponsor, the California Earthquake Authority. What's New? What's Next? What's Your Role in Building a National Strategy? The National Earthquake Conference (NEC) brings together state government leaders, social science practitioners, and U.S. State and Territorial Earthquake Managers.

1. Can We Predict Earthquakes?

Johnson, Paul

2018-01-16

The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes, and when.

2. Earthquake and Schools. [Videotape].

ERIC Educational Resources Information Center

Federal Emergency Management Agency, Washington, DC.

Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

3. Children's Ideas about Earthquakes

ERIC Educational Resources Information Center

Simsek, Canan Lacin

2007-01-01

Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes, received in primary schools, is considered…

4. Hypothesis testing and earthquake prediction.

PubMed

Jackson, D D

1996-04-30

Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
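The first of the three tests above, comparing the actual to the predicted number of earthquakes, can be sketched for a forecast that specifies a Poisson-distributed count (the Poisson form is a common assumption in such tests, not something the abstract states):

```python
import math

def poisson_cdf(n, lam):
    """P(X <= n) for a Poisson random variable with mean lam."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(n + 1))

def number_test(observed, expected):
    """'N-test' sketch: tail probabilities of the observed earthquake count
    under a Poisson forecast with the given expected number of events."""
    p_at_most = poisson_cdf(observed, expected)
    p_at_least = 1.0 if observed == 0 else 1.0 - poisson_cdf(observed - 1, expected)
    return p_at_most, p_at_least

# 0 earthquakes observed where 1.0 was forecast:
p_low, p_high = number_test(0, 1.0)
print(round(p_low, 4))  # 0.3679 (= e**-1)
```

If either tail probability is very small, the forecast fails this self-consistency check; tests (ii) and (iii) would additionally score where the earthquakes occurred, not just how many.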

5. Hypothesis testing and earthquake prediction.

PubMed Central

Jackson, D D

1996-01-01

Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663

6. Operational earthquake forecasting can enhance earthquake preparedness

Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

2014-01-01

We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

7. Discussion of New Approaches to Medium-Short-Term Earthquake Forecast in Practice of The Earthquake Prediction in Yunnan

NASA Astrophysics Data System (ADS)

Hong, F.

2017-12-01

Looking back on years of earthquake-prediction practice in the Yunnan area, it is widely considered that fixed-point earthquake precursory anomalies mainly reflect field information. Increases in the amplitude and number of precursory anomalies can help determine the occurrence time of earthquakes; however, it is difficult to establish a spatial relationship between earthquakes and precursory anomalies, so precursory anomalies can hardly be used to predict the locations of earthquakes. Past practice has shown that seismic activity is superior to precursory anomalies in predicting earthquake locations, since increased seismicity was observed before 80% of M = 6.0 earthquakes in the Yunnan area. Mobile geomagnetic anomalies have also turned out to be helpful in predicting earthquake locations in recent years; for instance, the forecast occurrence time and area derived from the one-year-scale geomagnetic anomalies before the M6.5 Ludian earthquake in 2014 were shorter and smaller than those derived from the seismicity enhancement region. Based on this past work, the author believes that the medium-short-term earthquake forecast level, as well as objective understanding of the seismogenic mechanisms, could be substantially improved by densely deployed observation arrays and by capturing the dynamic process of physical property changes in the enhancement region of medium to small earthquakes.

8. Hiding earthquakes from scrupulous monitoring eyes of dense local seismic networks

NASA Astrophysics Data System (ADS)

Bogiatzis, P.; Ishii, M.; Kiser, E.

2012-12-01

Accurate and complete cataloguing of aftershocks is essential for a variety of purposes, including the estimation of the mainshock rupture area, the identification of seismic gaps, and seismic hazard assessment. However, immediately following large earthquakes, the seismograms recorded by local networks are noisy, with energy arriving from hundreds of aftershocks and with different seismic phases interfering with one another. This causes deterioration in the performance of earthquake detection and location using conventional methods such as the S-P approach. This is demonstrated by results of back-projection analysis of teleseismic data showing that a significant number of events went undetected by the Japan Meteorological Agency within the first twenty-four hours after the Mw 9.0 Tohoku-oki, Japan earthquake. The spatial distribution of the hidden events is not arbitrary. Most of these earthquakes are located close to the trench, while some are located at the outer rise. Furthermore, there is a relatively sharp trench-parallel boundary separating the detected and undetected events. We investigate the cause of these hidden earthquakes using forward modeling. The calculation of raypaths for various source locations and takeoff angles with the "shooting" method suggests that this phenomenon is a consequence of the complexities associated with the subducting slab. The laterally varying velocity structure defocuses the seismic energy from shallow earthquakes located near the trench and makes the observation of P and S arrivals difficult at stations situated on mainland Japan. Full waveform simulations confirm these results. Our forward calculations also show that the probability of detection is sensitive to the depth of the event. Shallower events near the trench are more difficult to detect than deeper earthquakes that are located inside the subducting plate, for which the shadow-zone effect diminishes. The modeling effort is expanded to include three

9. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

NASA Astrophysics Data System (ADS)

Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.

2012-11-01

results in a margin stratigraphy of minor MTDs compared to the turbidite-system deposits. In contrast, the MTDs and turbidites are equally intermixed on basin floors along passive margins with a mud-rich continental slope, such as the northern Gulf of Mexico. Great earthquakes also result in characteristic seismo-turbidite lithology. Along the Cascadia margin, the number and character of multiple coarse pulses for correlative individual turbidites generally remain constant both upstream and downstream in different channel systems for 600 km along the margin. This suggests that the earthquake shaking or aftershock signature is normally preserved for the stronger (Mw ≥ 9) Cascadia earthquakes. In contrast, the generally weaker (Mw ≤ 8) California earthquakes result in upstream simple fining-up turbidites in single tributary canyons and channels; however, downstream mainly stacked turbidites result from synchronously triggered multiple turbidity currents that deposit in channels below confluences of the tributaries. Consequently, both downstream channel confluences and the strongest (Mw ≥ 9) great earthquakes contribute to multi-pulsed and stacked turbidites that are typical for seismo-turbidites generated by a single great earthquake. Earthquake triggering and multi-pulsed or stacked turbidites also become an alternative explanation for amalgamated turbidite beds in active tectonic margins, in addition to other classic explanations. The sedimentologic characteristics of turbidites triggered by great earthquakes along the Cascadia and northern California margins provide criteria to help distinguish seismo-turbidites in other active tectonic margins.

10. PAGER - Rapid Assessment of an Earthquake's Impact

Earle, Paul S.; Wald, David J.

2007-01-01

PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system to rapidly assess the number of people and regions exposed to severe shaking by an earthquake, and to inform emergency responders, government agencies, and the media of the scope of the potential disaster. PAGER monitors the U.S. Geological Survey's near real-time U.S. and global earthquake detections and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts.

11. Crowdsourced earthquake early warning

PubMed Central

Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

2015-01-01

Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

12. Crowdsourced earthquake early warning.

PubMed

Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

2015-04-01

Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

13. Crowdsourced earthquake early warning

Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

2015-01-01

Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

14. Gravity drives Great Earthquakes

NASA Astrophysics Data System (ADS)

Lister, Gordon; Forster, Marnie

2010-05-01

The most violent of Great Earthquakes are driven by ruptures on giant megathrusts adjacent to actively forming mountain belts. Current theory suggests that the seismic rupture harvests (and thus releases) elastic energy that has been previously stored in locked segments of the megathrust. The general belief, however, is that this energy was accumulated as the result of relative motion of the adjacent stiff elastic tectonic plates. This mechanism fails to explain many first-order aspects of large earthquakes, however. The energy source for strain accumulation must also include gravitational collapse of orogenic crust and/or the foundering (or roll-back) of an adjacent subducting lithospheric slab. Therefore we have conducted an analysis of the geometry of aftershocks, and report that this allows distinction of two types of failure on giant megathrusts. Mode I failure involves horizontal shortening, and is consistent with the classic view that megathrusts fail in compression, with motion analogous to that expected if accretion takes place against a rigid (or elastic) backstop. Mode II failure involves horizontal extension, and requires the over-riding plate to stretch during an earthquake. This process is likely to continue during the subsequent period of afterslip, and therefore will again be evident in aftershock patterns. Mode I behaviour may well have applied to the southern segment of the Sumatran megathrust, from whence emanated the rupture that drove the 2004 Great Earthquake. Mode II behaviour appears to apply to the northern segment of the same rupture, however. The geometry of aftershocks beneath the Andaman Sea suggests that the crust above the initial rupture failed in an extensional mode. The edge of the Indian plate is foundering, with slab-hinge roll-back in a direction orthogonal to its motion vector. The only possible cause for this extension therefore is westward roll-back of the subducting Indian plate, and the consequent gravity-driven movement

15. Sichuan Earthquake in China

NASA Technical Reports Server (NTRS)

2008-01-01

The Sichuan earthquake in China occurred on May 12, 2008, along faults within the mountains, but near and almost parallel the mountain front, northwest of the city of Chengdu. This major quake caused immediate and severe damage to many villages and cities in the area. Aftershocks pose a continuing danger, but another continuing hazard is the widespread occurrence of landslides that have formed new natural dams and consequently new lakes. These lakes are submerging roads and flooding previously developed lands. But an even greater concern is the possible rapid release of water as the lakes eventually overflow the new dams. The dams are generally composed of disintegrated rock debris that may easily erode, leading to greater release of water, which may then cause faster erosion and an even greater release of water. This possible 'positive feedback' between increasing erosion and increasing water release could result in catastrophic debris flows and/or flooding. The danger is well known to the Chinese earthquake response teams, which have been building spillways over some of the new natural dams.

This ASTER image, acquired on June 1, 2008, shows two of the new large landslide dams and lakes upstream from the town of Chi-Kua-Kan at 32°12'N latitude and 104°50'E longitude. Vegetation is green, water is blue, and soil is grayish brown in this enhanced color view. New landslides appear bright off-white. The northern (top) lake is upstream from the southern lake. Close inspection shows a series of much smaller lakes in an elongated 'S' pattern along the original stream path. Note especially the large landslides that created the dams. Some other landslides in this area, such as the large one in the northeast corner of the image, occur only on the mountain slopes, so do not block streams, and do not form lakes.

16. Living on an Active Earth: Perspectives on Earthquake Science

NASA Astrophysics Data System (ADS)

Lay, Thorne

2004-02-01

The annualized long-term loss due to earthquakes in the United States is now estimated at $4.4 billion per year. A repeat of the 1923 Kanto earthquake, near Tokyo, could cause direct losses of $2-3 trillion. With such grim numbers, which are guaranteed to make you take its work seriously, the NRC Committee on the Science of Earthquakes begins its overview of the emerging multidisciplinary field of earthquake science. An up-to-date and forward-looking survey of scientific investigation of earthquake phenomena and engineering response to associated hazards is presented at a suitable level for a general educated audience. Perspectives from the fields of seismology, geodesy, neo-tectonics, paleo-seismology, rock mechanics, earthquake engineering, and computer modeling of complex dynamic systems are integrated into a balanced definition of earthquake science that has never before been adequately articulated.

17. Prediction of earthquake-triggered landslide event sizes

NASA Astrophysics Data System (ADS)

Braun, Anika; Havenith, Hans-Balder; Schlögel, Romy

2016-04-01

Seismically induced landslides are a major environmental effect of earthquakes, which may significantly contribute to related losses. Moreover, in paleoseismology landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes, thus allowing us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. We present here a review of factors contributing to earthquake-triggered slope failures based on an "event-by-event" classification approach. The objective of this analysis is to enable the short-term prediction of earthquake-triggered landslide event sizes, in terms of the number of landslides and the size of the affected area, right after an earthquake event occurs. Five main factors, 'Intensity', 'Fault', 'Topographic energy', 'Climatic conditions' and 'Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. The relative weight of these factors was extracted from published data for numerous past earthquakes; topographic inputs were checked in Google Earth and through geographic information systems. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970) the combination and relative weight of the factors was calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be cross-checked. One of our main findings is that the 'Fault' factor, which is based on characteristics of the fault, the surface rupture and its location with respect to mountain areas, has the most important

18. Using Smartphones to Detect Earthquakes

NASA Astrophysics Data System (ADS)

Kong, Q.; Allen, R. M.

2012-12-01

We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation of sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record the acceleration in real time. These records can be saved on the local phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer while located on controlled shake tables for a variety of tests. The results show that the accelerometer in the smartphone can reproduce the characteristics of the shaking very well, even when the phone is left freely on the shake table. The nature of these datasets is also quite different from traditional networks due to the fact that smartphones move around with their owners. Therefore, we must distinguish earthquake signals from those of other daily activities. In addition to the shake-table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial-neural-network-based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now at the stage of being ready to develop the basic infrastructure for a smartphone seismic network.

19. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

NASA Astrophysics Data System (ADS)

Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

2015-06-01

The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted with a very good accuracy window (±1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a triggering planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Node. In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e. ±1 day of the target date) for >6.5R events. The successes are counted for each of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with
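As a rough illustration of the date-generation and hit-rate counting described above, here is a sketch that adds Fibonacci and Lucas numbers of days to a seed date. The exact FDL construction (including the "Dual" sequence) is not specified in this abstract, so the offsets and helper names below are assumptions, not the authors' algorithm.

```python
from datetime import date, timedelta

def fibonacci_lucas_numbers(limit):
    """Fibonacci (1, 1, 2, 3, ...) and Lucas (2, 1, 3, 4, ...) numbers up to `limit`."""
    values = set()
    for a, b in [(1, 1), (2, 1)]:
        x, y = a, b
        while x <= limit:
            values.add(x)
            x, y = y, x + y
    return sorted(values)

def fdl_target_dates(seed, horizon_days=365):
    """Candidate earthquake dates: the seed date plus each FDL number of days."""
    return [seed + timedelta(days=n) for n in fibonacci_lucas_numbers(horizon_days)]

def hit_rate(targets, events, window_days=1):
    """Fraction of target dates falling within +/- window_days of any event."""
    hits = sum(any(abs((t - e).days) <= window_days for e in events) for t in targets)
    return hits / len(targets)

targets = fdl_target_dates(date(2000, 1, 1), horizon_days=30)
```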

20. Missing great earthquakes

Hough, Susan E.

2013-01-01

The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

1. Prototype operational earthquake prediction system

Spall, Henry

1986-01-01

An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

2. Napa earthquake: An earthquake in a highly connected world

NASA Astrophysics Data System (ADS)

Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

2014-12-01

The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable objects and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all of these methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic are sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the number of tweets and the number of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool-specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

3. Automatic Earthquake Detection by Active Learning

NASA Astrophysics Data System (ADS)

Bergen, K.; Beroza, G. C.

2017-12-01

In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
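The core query step of pool-based active learning with uncertainty sampling can be sketched as follows. This is an illustration of the general approach described above, not the authors' implementation; the toy detector and its single feature are assumptions.

```python
import numpy as np

def uncertainty_sampling(predict_proba, X_pool, n_queries=1):
    """Return indices of the pool samples whose predicted positive-class
    probability is closest to 0.5, i.e. where the detector is least certain.
    These are the waveforms a human analyst would be asked to label next."""
    proba = predict_proba(X_pool)
    margin = np.abs(proba - 0.5)
    return np.argsort(margin)[:n_queries]

def toy_detector(x):
    """Stand-in detector: probability of 'earthquake' rises with a single
    detection-statistic feature via a logistic link."""
    return 1.0 / (1.0 + np.exp(-x))

pool = np.array([-4.0, -0.2, 3.0, 0.05, 2.5])   # detection statistics per candidate
queries = uncertainty_sampling(toy_detector, pool, n_queries=2)
```

In a full active-learning loop, the queried samples are labeled by the expert, added to the training set, and the detector is retrained before the next query round.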

4. Earthquake Protection Measures for People with Disabilities

NASA Astrophysics Data System (ADS)

Gountromichou, C.; Kourou, A.; Kerpelis, P.

2009-04-01

The problem of seismic safety for people with disabilities not only exists but is also urgent and of primary importance. Working towards disability equality, the Earthquake Planning and Protection Organization of Greece (E.P.P.O.) has developed an educational scheme for people with disabilities to guide them in developing skills to protect themselves and in taking the appropriate safety measures before, during and after an earthquake. The framework of this initiative includes a number of actions that have already been undertaken, including the following: a. Recently, the main guidelines have been published to help people who have physical, cognitive, visual, or auditory disabilities to cope with a destructive earthquake. Of great importance for people with disabilities is to be prepared for the disaster, with several measures that must be taken starting today. In the pre-earthquake period, it is important that these people, in addition to other measures, do the following: - Create a Personal Support Network. The Personal Support Network should be a group of at least three trusted people who can assist the disabled person in preparing for a disastrous event and in recovering after it. - Complete a Personal Assessment. The environment may change after a destructive earthquake. People with disabilities are encouraged to make a list of their personal needs and their resources for meeting them in a disaster environment. b. Lectures and training seminars on earthquake protection are given to students, teachers and educators in Special Schools for disabled people, mainly to inform and familiarize them with earthquakes and safety measures. c. Many earthquake drills have already taken place, for each disability, in order to share good practices and lessons learned, to further disaster reduction, and to identify gaps and challenges. The final aim of these actions is for all people with disabilities to be well informed and motivated towards a culture of earthquake

5. Review of Injuries from Terrorist Bombings and Earthquakes

DTIC Science & Technology

2016-08-31

Distribution is unlimited. August 2016. Review of Injuries from Terrorist Bombings and Earthquakes, DTRA-TR-16-064, Technical Report, Contract HDTRA1-14-D-0003. Terrorist bombings and earthquakes provide valuable insight on the types of injuries that may occur in an improvised nuclear device (IND) scenario

6. Earthquakes in Alaska

Haeussler, Peter J.; Plafker, George

1995-01-01

Earthquake risk is high in much of the southern half of Alaska, but it is not the same everywhere. This map shows the overall geologic setting in Alaska that produces earthquakes. The Pacific plate (darker blue) is sliding northwestward past southeastern Alaska and then dives beneath the North American plate (light blue, green, and brown) in southern Alaska, the Alaska Peninsula, and the Aleutian Islands. Most earthquakes are produced where these two plates come into contact and slide past each other. Major earthquakes also occur throughout much of interior Alaska as a result of collision of a piece of crust with the southern margin.

7. Frog Swarms: Earthquake Precursors or False Alarms?

PubMed Central

Grant, Rachel A.; Conlan, Hilary

2013-01-01

juvenile animals migrating away from their breeding pond after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that the reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre) and were probably coincidental. Statistical analysis of the data indicated that frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised. PMID:26479746

8. Earthquake damage orientation to infer seismic parameters in archaeological sites and historical earthquakes

NASA Astrophysics Data System (ADS)

Martín-González, Fidel

2018-01-01

Studies that provide information concerning seismic parameters and seismic sources of historical and archaeological seismic events are used to better evaluate the seismic hazard of a region. This is of special interest when no surface rupture is recorded or the seismogenic fault cannot be identified. The orientation pattern of the earthquake damage (ED) (e.g., fallen columns, dropped keystones) that affected architectonic elements of cities after earthquakes has traditionally been used in historical and archaeoseismological studies to infer seismic parameters. However, the parameters that different authors claim can be obtained are contradictory (the epicenter location, the orientation of the P waves, the orientation of the compressional strain and the fault kinematics have all been proposed), and some authors even question these relations with the earthquake damage. The earthquakes of Lorca in 2011, Christchurch in 2011 and Emilia Romagna in 2012 present an opportunity to measure systematically a large number and wide variety of earthquake damage in historical buildings (the same structures that are used in historical and archaeological studies). The damage pattern orientation has been compared with modern instrumental data, which is not possible in historical and archaeoseismological studies. From measurements and quantification of the orientation patterns in the studied earthquakes, it is observed that there is a systematic pattern of earthquake damage orientation (EDO) in the proximity of the seismic source (fault trace) (<10 km). The EDO in these earthquakes is normal to the fault trend (±15°). This orientation can be generated by a pulse of motion that, in the near-fault region, has a distinguishable acceleration normal to the fault due to the polarization of the S waves. Therefore, the earthquake damage orientation could be used to estimate the seismogenic fault trend in studies of historical earthquakes where no instrumental data are available.

9. The Rotational and Gravitational Effect of Earthquakes

NASA Technical Reports Server (NTRS)

Gross, Richard

2000-01-01

The static displacement field generated by an earthquake has the effect of rearranging the Earth's mass distribution and will consequently cause the Earth's rotation and gravitational field to change. Although the coseismic effects of earthquakes on the Earth's rotation and gravitational field have been modeled in the past, no unambiguous observations of these effects have yet been made. However, the Gravity Recovery And Climate Experiment (GRACE) satellite, which is scheduled to be launched in 2001, will measure time variations of the Earth's gravitational field to high degree and order with unprecedented accuracy. In this presentation, the modeled coseismic effect of earthquakes upon the Earth's gravitational field to degree and order 100 will be computed and compared to the expected accuracy of the GRACE measurements. In addition, the modeled second degree changes, corresponding to changes in the Earth's rotation, will be compared to length-of-day and polar motion excitation observations.

10. OMG Earthquake! Can Twitter improve earthquake response?

NASA Astrophysics Data System (ADS)

Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

2009-12-01

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
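The spike in keyword-tweet frequency described above can be detected with a simple short-term/long-term ratio test on per-minute counts. This is a hedged sketch; the USGS system's actual detector and thresholds are not described in this abstract, so the parameters below are assumptions.

```python
import numpy as np

def spike_indices(counts_per_minute, lta_window=5, ratio=10.0, min_count=5):
    """Flag minutes whose tweet count is at least `ratio` times the average
    of the preceding `lta_window` minutes (and at least `min_count` tweets),
    analogous to an STA/LTA trigger on seismic amplitudes."""
    counts = np.asarray(counts_per_minute, dtype=float)
    flagged = []
    for i in range(lta_window, counts.size):
        lta = counts[i - lta_window:i].mean() + 1e-3   # avoid divide-by-zero
        if counts[i] >= min_count and counts[i] / lta >= ratio:
            flagged.append(i)
    return flagged

# Quiet background (under 1 "earthquake" tweet per hour), then a burst of
# roughly 150 tweets per minute, as observed after the Morgan Hill event.
per_minute = [0, 0, 1, 0, 0, 148, 152, 90, 40]
alerts = spike_indices(per_minute)
```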

11. Investigating Earthquake-induced LandslidesÂ­a Historical Review

NASA Astrophysics Data System (ADS)

Keefer, D. K.; Geological Survey, Us; Park, Menlo; Usa, Ca

, extensive to relatively complete inventories of landslides have been prepared for a relatively small number of earthquakes. Through the 1960s and 1970s the best landslide inventories typically were complete only for a central affected area, although the first virtually complete inventory for a large earthquake was prepared for the M 7.6 Guatemala earthquake in 1976. Beginning in 1980, virtually complete landslide inventories have been prepared for several additional earthquakes in California, El Salvador, Japan, Italy, and Taiwan. Most of these used aerial photography in combination with ground field studies, although the studies of the most recent of these events, in Taiwan, have also used satellite imagery, and three of the others (including the two smallest) were compiled largely from ground-based field studies without aerial photography. Since 1989, digital mapping and GIS techniques have come into common use for mapping earthquake-induced landslides, and the use of these techniques has greatly enhanced the level of analysis that can be applied to earthquake-induced landslide occurrence. The first synthesis of data on earthquake-induced landslides, completed in 1984, defined the general characteristics of these landslides, derived relations between landslide occurrence on the one hand and geologic and seismic parameters on the other, and identified the types of hazards associated with them. Since then, additional syntheses of worldwide data (1999) and national data from New Zealand (1997), Greece (2000), and Italy (2000) have provided additional data on landslide characteristics and hazards and have extended, revised, and refined these relations.
Recently completed studies have also identified areas with anomalous landslide distributions, have provided data for correlating the occurrence of landslides with a measure of local ground motion, have verified the occasional delayed triggering of landslides as a consequence of seismic shaking, and have identified

12. Using remote sensing to predict earthquake impacts

NASA Astrophysics Data System (ADS)

Fylaktos, Asimakis; Yfantidou, Anastasia

2017-09-01

Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), which led to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique. This technique is suited to the observation of any surface deformation. The database is a cluster of information regarding the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, among others, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of these types of earthquakes. In the future, we can enrich this database with more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves in particular geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

13. Intermediate-depth earthquakes facilitated by eclogitization-related stresses

Nakajima, Junichi; Uchida, Naoki; Shiina, Takahiro; Hasegawa, Akira; Hacker, Bradley R.; Kirby, Stephen H.

2013-01-01

Eclogitization of the basaltic and gabbroic layer in the oceanic crust involves a volume reduction of 10%–15%. One consequence of the negative volume change is the formation of a paired stress field as a result of strain compatibility across the reaction front. Here we use waveform analysis of a tiny seismic cluster in the lower crust of the downgoing Pacific plate and reveal new evidence in favor of this mechanism: tensional earthquakes lying 1 km above compressional earthquakes, and earthquakes with highly similar waveforms lying on well-defined planes with complementary rupture areas. The tensional stress is interpreted to be caused by the dimensional mismatch between crust transformed to eclogite and underlying untransformed crust, and the earthquakes are probably facilitated by reactivation of fossil faults extant in the subducting plate. These observations provide seismic evidence for the role of volume change–related stresses and, possibly, fluid-related embrittlement as viable processes for nucleating earthquakes in downgoing oceanic lithosphere.

14. An empirical model for global earthquake fatality estimation

Jaiswal, Kishor; Wald, David

2010-01-01

We analyzed mortality rates of earthquakes worldwide and developed a country/region-specific empirical model for earthquake fatality estimation within the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is defined as total killed divided by total population exposed at specific shaking intensity level. The total fatalities for a given earthquake are estimated by multiplying the number of people exposed at each shaking intensity level by the fatality rates for that level and then summing them at all relevant shaking intensities. The fatality rate is expressed in terms of a two-parameter lognormal cumulative distribution function of shaking intensity. The parameters are obtained for each country or a region by minimizing the residual error in hindcasting the total shaking-related deaths from earthquakes recorded between 1973 and 2007. A new global regionalization scheme is used to combine the fatality data across different countries with similar vulnerability traits.
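The fatality-rate formulation described above can be sketched as follows. The θ and β values and the exposure numbers here are hypothetical placeholders, not PAGER's calibrated country-specific parameters.

```python
import math

def fatality_rate(intensity, theta, beta):
    """Two-parameter lognormal CDF of shaking intensity S:
    rate(S) = Phi(ln(S / theta) / beta), with Phi the standard
    normal CDF (written via the error function)."""
    z = math.log(intensity / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_fatalities(exposure_by_intensity, theta, beta):
    """Sum over intensity levels of exposed population times fatality rate."""
    return sum(population * fatality_rate(s, theta, beta)
               for s, population in exposure_by_intensity.items())

# Hypothetical exposure (people per shaking-intensity level) and parameters.
exposure = {6.0: 500_000, 7.0: 200_000, 8.0: 50_000, 9.0: 5_000}
total = expected_fatalities(exposure, theta=12.0, beta=0.2)
```

In the real system, θ and β are fit per country or region by minimizing the residual error in hindcasting recorded shaking-related deaths.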

15. The Application of Speaker Recognition Techniques in the Detection of Tsunamigenic Earthquakes

NASA Astrophysics Data System (ADS)

Gorbatov, A.; O'Connell, J.; Paliwal, K.

2015-12-01

Tsunami warning procedures adopted by national tsunami warning centres largely rely on the classical approach of earthquake location, magnitude determination, and the consequent modelling of tsunami waves. Although this approach is based on known physical theories of earthquake and tsunami generation processes, this may be its main shortcoming, due to the need to satisfy minimum seismic data requirements to estimate those physical parameters. At least four seismic stations are necessary to locate the earthquake, and a minimum of approximately 10 minutes of seismic waveform observation is needed to reliably estimate the magnitude of a large earthquake similar to the 2004 Indian Ocean Tsunami Earthquake of M9.2. Consequently the total time to tsunami warning can be more than half an hour. In an attempt to reduce the time to tsunami alert, a new approach is proposed based on the classification of tsunamigenic and non-tsunamigenic earthquakes using speaker recognition techniques. A Tsunamigenic Dataset (TGDS) was compiled to promote the development of machine learning techniques for application to seismic trace analysis and, in particular, tsunamigenic event detection, and to compare them to existing seismological methods. The TGDS contains 227 offshore events (87 tsunamigenic and 140 non-tsunamigenic earthquakes with M≥6) from Jan 2000 to Dec 2011, inclusive. A Support Vector Machine classifier using a radial-basis-function kernel was applied to spectral features derived from 400 s frames of 3-component, 1-Hz broadband seismometer data. Ten-fold cross-validation was used during training to choose classifier parameters. Voting was applied to the classifier predictions from each station to form an overall prediction for an event. The F1 score (harmonic mean of precision and recall) was chosen to rate each classifier, as it provides a compromise between type-I and type-II errors, given the imbalance between the representative number of events in the tsunamigenic and non
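The event-level decision and scoring steps, per-station voting followed by an F1 evaluation, can be sketched as follows (a simplified illustration, not the study's code):

```python
def vote(station_predictions):
    """Combine per-station labels into one event label by majority vote."""
    positive = sum(1 for p in station_predictions if p == "tsunamigenic")
    return "tsunamigenic" if positive > len(station_predictions) / 2 \
        else "non-tsunamigenic"

def f1_score(true_positives, false_positives, false_negatives):
    """Harmonic mean of precision and recall."""
    tp, fp, fn = true_positives, false_positives, false_negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2.0 * precision * recall / (precision + recall)

# Three stations observed the same event; two classifiers say tsunamigenic.
event_label = vote(["tsunamigenic", "tsunamigenic", "non-tsunamigenic"])
```

F1 is a reasonable choice here precisely because the two classes are imbalanced (87 vs 140 events), where raw accuracy would be misleading.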

16. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

NASA Astrophysics Data System (ADS)

Yao, Y. B.; Chen, P.; Zhang, S.; Chen, J. J.; Yan, F.; Peng, W. F.

2012-03-01

The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0-2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions deviating from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugated region. However, the probability of the anomalies' appearance and their extent in the magnetically conjugated region are smaller than for the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.
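A common way to flag such TEC anomalies in the pre-earthquake ionosphere literature is a sliding interquartile-range envelope built from the preceding days at the same local time; whether this paper uses exactly these bounds is an assumption, so the window and multiplier below are illustrative.

```python
import numpy as np

def tec_anomalies(series, window=15, k=1.5):
    """Flag values outside median +/- k*IQR of the preceding `window` samples
    (e.g. TEC at the same local time on each of the previous `window` days)."""
    x = np.asarray(series, dtype=float)
    flags = np.zeros(x.size, dtype=bool)
    for i in range(window, x.size):
        q1, med, q3 = np.percentile(x[i - window:i], [25, 50, 75])
        iqr = q3 - q1
        flags[i] = bool(x[i] > med + k * iqr or x[i] < med - k * iqr)
    return flags

rng = np.random.default_rng(1)
tec = 20.0 + rng.normal(0.0, 0.5, 40)   # quiet background TEC (TECU)
tec[30] += 8.0                          # inject a positive anomaly
flags = tec_anomalies(tec)
```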

17. Polarized Politics and Policy Consequences

DTIC Science & Technology

2007-01-01

consequences with regard to the policymaking process and outcomes. Alongside the study of consequences, more research is needed regarding institutional reforms... a research organization providing objective analysis and effective solutions that address the challenges facing the public and private sectors... Diana Epstein

18. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

PubMed

Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

2016-01-01

A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and of the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model's outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher rates of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing effort in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a decrease in the expected number of casualties.
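The adjustment of baseline casualty probabilities by pooled effect measures can be sketched on the log-odds scale, as in a logistic regression; the odds-ratio values below are hypothetical, not the study's pooled estimates.

```python
import math

def adjust_probability(baseline_p, odds_ratios):
    """Shift a baseline casualty probability on the log-odds scale by the
    log of each risk factor's odds ratio, then map back to a probability."""
    log_odds = math.log(baseline_p / (1.0 - baseline_p))
    log_odds += sum(math.log(o) for o in odds_ratios)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical effect measures, e.g. elderly age and physical disability.
adjusted = adjust_probability(0.01, odds_ratios=[2.0, 1.5])
```

With no risk factors present the baseline probability is returned unchanged, which is the property that lets such factors be layered onto an existing model like HAZUS.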

19. Demand surge following earthquakes

Olsen, Anna H.

2012-01-01

Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

20. Geoelectric precursors to strong earthquakes in China

NASA Astrophysics Data System (ADS)

Yulin, Zhao; Fuye, Qian

1994-05-01

The main results of searching for electrical precursors to strong earthquakes in China over the last 25 yr are presented. These comprise: the continuous twenty-year resistivity record before and after the great Tangshan earthquake of 1976; spatial and temporal variations in resistivity anomalies observed at more than 6 stations within 150 km of the Tangshan earthquake epicenter; the travel-time curve for the front of the resistivity precursor; and a method of intersection for predicting the epicenter location. These results reveal a number of interesting facts: (1) Resistivity measurements with accuracies of 0.5% or better for over 20 yr show that resistivity decreases of several percent, which began approximately 3 yr prior to the Tangshan earthquake, were larger than the background fluctuations and hence statistically significant. An outstanding example of an intermediate-term resistivity precursor is given. (2) The intermediate-term resistivity decrease before the Tangshan earthquake is such a pervasive phenomenon that the mean decrease, in percent, can be contoured on a map of the Beijing-Tianjin-Tangshan region. This shows the maximum decrease centered over the epicenter. (3) The anomalies in resistivity and self-potential, which began 2-0.5 months before the Tangshan main shock, had periods equal to those of the tidal waves M2 and MSf, respectively, so that the associated anomalies can be identified as impending-earthquake precursors; a model related to stress-displacement weakening is proposed.

1. One research from turkey on groundwater- level changes related earthquake

NASA Astrophysics Data System (ADS)

Kirmizitas, H.; Göktepe, G.

2003-04-01

Groundwater levels are recorded by limnigraphs in drilled wells in order to determine groundwater potential accurately and reliably in hydrogeological studies in Turkey. The State Hydraulic Works (DSI) installed the limnigraphs mainly to estimate groundwater potential; to date, no well has been drilled specifically to obtain data on earthquake-related water-level changes. The main purposes of these studies are to assess groundwater potential and to expose the hydrodynamic structure of an aquifer. In this study, abnormal oscillations, water-level rises and water-level drops related to groundwater-level changes were observed on the recorded graphs. These observations showed that some earthquakes have had an effect on water levels. The distance between the epicenters and the water wells ranges up to 2000 km. Water-level changes occur in water-bearing layers that may consist of grained materials such as alluvium, or of consolidated rocks such as limestones. The largest water-level change on the diagrams is about 1.48 m, recorded as an oscillation. The earthquake-related water-level changes observed in this research fall into the following types of movement: (1) rise-drop oscillation changes at the same point; (2) water-level drop for a certain period, or permanently, after an earthquake; (3) water-level rise in a certain period, or permanently, after an earthquake. (For example, during the Gölcük earthquake of August 17, 1999, with a magnitude of 7.8, artesian flow occurred in DSI well no. 49160 in Dernekkiri Village, Adapazari.) Groundwater levels can also change easily because of atmospheric pressure, first of all, as well as precipitation, irrigation or water pumping. In order to relate groundwater-level changes to an earthquake, such changes should be observed accurately, carefully and at the right time; thus, first of all, the real reason for a water-level change must be determined. From 1970 to 2001 many earthquakes occurred in Turkey

2. Nuclear explosions and distant earthquakes: A search for correlations

Healy, J.H.; Marshall, P.A.

1970-01-01

An apparent correlation between nuclear explosions and earthquakes has been reported for the events between September 1961 and September 1966. When data from the events between September 1966 and December 1968 are examined, this correlation disappears. No relationship between the size of the nuclear explosions and the number of distant earthquakes is apparent in the data.

3. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Strong Ground Motion

Borcherdt, Roger D.

1994-01-01

Strong ground motion generated by the Loma Prieta, Calif., earthquake (MS ≈ 7.1) of October 17, 1989, resulted in at least 63 deaths, more than 3,757 injuries, and damage estimated to exceed $5.9 billion. Strong ground motion severely damaged critical lifelines (freeway overpasses, bridges, and pipelines), caused severe damage to poorly constructed buildings, and induced a significant number of ground failures associated with liquefaction and landsliding. It also caused a significant proportion of the damage and loss of life at distances as far as 100 km from the epicenter. Consequently, understanding the characteristics of the strong ground motion associated with the earthquake is fundamental to understanding the earthquake's devastating impact on society. The papers assembled in this chapter address this problem. Damage to vulnerable structures from the earthquake varied substantially with the distance from the causative fault and the type of underlying geologic deposits. Most of the damage and loss of life occurred in areas underlain by 'soft soil'. Quantifying these effects is important for understanding the tragic concentrations of damage in such areas as Santa Cruz and the Marina and Embarcadero Districts of San Francisco, and the failures of the San Francisco-Oakland Bay Bridge and the Interstate Highway 880 overpass. Most importantly, understanding these effects is a necessary prerequisite for improving mitigation measures for larger earthquakes likely to occur much closer to densely urbanized areas in the San Francisco Bay region. The earthquake generated an especially important data set for understanding variations in the severity of strong ground motion. Instrumental strong-motion recordings were obtained at 131 sites located from about 6 to 175 km from the rupture zone. This set of recordings, the largest yet collected for an event of this size, was obtained from sites on various geologic deposits, including a unique set on 'soft soil' deposits

4. The Christchurch earthquake stroke incidence study.

PubMed

Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

2014-03-01

We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients.
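The discharge-destination comparison in this record (31.2% versus 46.9% discharged directly home, p = 0.036) is the kind of result a two-proportion z-test produces. The counts below are integers reconstructed from the reported percentages, not the study's raw data, so the resulting p-value differs slightly from the published one:

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Approximate counts: ~22/71 discharged home after the February 2011
# earthquake versus ~45/96 in the control period (hypothetical integers
# consistent with the reported 31.2% and 46.9%).
z, p = two_proportion_z_test(22, 71, 45, 96)   # p comes out just under 0.05
```

The original authors may have used a chi-square or exact test, which would explain the small difference from p = 0.036.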

5. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

NASA Astrophysics Data System (ADS)

Durukal, E.; Sesetyan, K.; Erdik, M.

2009-04-01

The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would instead be calculated directly on the basis of indexed ground motion levels and damage. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

6. Investigating landslides caused by earthquakes - A historical review

Keefer, D.K.

2002-01-01

Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

7. Regional and Local Glacial-Earthquake Patterns in Greenland

NASA Astrophysics Data System (ADS)

Olsen, K.; Nettles, M.

2016-12-01

Icebergs calved from marine-terminating glaciers currently account for up to half of the 400 Gt of ice lost annually from the Greenland ice sheet (Enderlin et al., 2014). When large capsizing icebergs (~1 Gt of ice) calve, they produce elastic waves that propagate through the solid earth and are observed as teleseismically detectable MSW ≈ 5 glacial earthquakes (e.g., Ekström et al., 2003; Nettles & Ekström, 2010; Tsai & Ekström, 2007; Veitch & Nettles, 2012). The annual number of these events has increased dramatically over the past two decades. We analyze glacial earthquakes from 2011-2013, which expands the glacial-earthquake catalog by 50%. The number of glacial-earthquake solutions now available allows us to investigate regional patterns across Greenland and link earthquake characteristics to changes in ice dynamics at individual glaciers. During the years of our study, Greenland's west coast dominated glacial-earthquake production. Kong Oscar Glacier, Upernavik Isstrøm, and Jakobshavn Isbræ all produced more glacial earthquakes during this time than in preceding years. We link patterns in glacial-earthquake production and cessation to the presence or absence of floating ice tongues at glaciers on both coasts of Greenland. The calving model predicts glacial-earthquake force azimuths oriented perpendicular to the calving front, and comparisons between seismic data and satellite imagery confirm this in most instances. At two glaciers we document force azimuths that have recently changed orientation and confirm that similar changes have occurred in the calving-front geometry. We also document glacial earthquakes at one previously quiescent glacier. Consistent with previous work, we model the glacial-earthquake force-time function as a boxcar with horizontal and vertical force components that vary synchronously. We investigate limitations of this approach and explore improvements that could lead to a more accurate representation of the glacial earthquake source.

8. Preliminary results on earthquake triggered landslides for the Haiti earthquake (January 2010)

NASA Astrophysics Data System (ADS)

van Westen, Cees; Gorum, Tolga

2010-05-01

This study presents the first results of an analysis of the landslides triggered by the Ms 7.0 Haiti earthquake that occurred on January 12, 2010 in the boundary region of the Caribbean Plate and the North American Plate. The fault is a left-lateral strike-slip fault with a clear surface expression. According to the USGS earthquake information, the Enriquillo-Plantain Garden fault system has not produced any major earthquake in the last 100 years, and historical earthquakes are known from 1860, 1770, 1761, 1751, 1684, 1673, and 1618, though none of these has been confirmed in the field as associated with this fault. We used high-resolution satellite imagery available for the pre- and post-earthquake situations, which was made freely available for the response and rescue operations. We made an interpretation of all co-seismic landslides in the epicentral area. We conclude that the earthquake mainly triggered landslides on the northern slope of the fault-related valley and in a number of isolated areas. The earthquake apparently did not trigger many visible landslides within the slum areas on the slopes in the southern part of Port-au-Prince and Carrefour. We also used ASTER DEM information to relate the landslide occurrences to DEM derivatives.

9. Local earthquake tomography of Scotland

NASA Astrophysics Data System (ADS)

Luckett, Richard; Baptie, Brian

2015-03-01

Scotland is a relatively aseismic region, which makes local earthquake tomography difficult, but 40 yr of earthquakes recorded by a good and growing network make it possible. A careful selection is made from the earthquakes located by the British Geological Survey (BGS) over the last four decades to provide a data set maximising arrival-time accuracy and ray-path coverage of Scotland. A large number of 1-D velocity models with different layer geometries are considered and differentiated by employing quarry blasts as ground-truth events. Then, SIMULPS14 is used to produce a robust 3-D tomographic P-wave velocity model for Scotland. In areas of high resolution the model shows good agreement with previously published interpretations of seismic refraction and reflection experiments. However, the model shows relatively little lateral variation in seismic velocity except at shallow depths, where sedimentary basins such as the Midland Valley are apparent. At greater depths, higher velocities in the northwest parts of the model suggest that the thickness of crust increases towards the south and east. This observation is also in agreement with previous studies. Quarry blasts used as ground-truth events and relocated with the preferred 3-D model are shown to be located markedly more accurately than with the existing BGS 1-D velocity model.

10. Earthquake Clustering in Noisy Viscoelastic Systems

NASA Astrophysics Data System (ADS)

Dicaprio, C. J.; Simons, M.; Williams, C. A.; Kenner, S. J.

2006-12-01

Geologic studies show evidence for temporal clustering of earthquakes on certain fault systems. Since post-seismic deformation may result in a variable loading rate on a fault throughout the inter-seismic period, it is reasonable to expect that the rheology of the non-seismogenic lower crust and mantle lithosphere may play a role in controlling earthquake recurrence times. Previously, the role of the rheology of the lithosphere in the seismic cycle had been studied with a one-dimensional spring-dashpot-slider model (Kenner and Simons [2005]). In this study we use the finite element code PyLith to construct a two-dimensional continuum model of a strike-slip fault in an elastic medium overlying one or more linear Maxwell viscoelastic layers, loaded in the far field by a constant-velocity boundary condition. Taking advantage of the linear properties of the model, we use the finite element solution to one earthquake as a spatio-temporal Green's function. Multiple Green's function solutions, scaled by the size of each earthquake, are then summed to form an earthquake sequence. When the shear stress on the fault reaches a predefined yield stress it is allowed to slip, relieving all accumulated shear stress. Random variation in the fault yield stress from one earthquake to the next results in a temporally clustered earthquake sequence. The amount of clustering depends on a non-dimensional number, W, called the Wallace number. For models with one viscoelastic layer, W is equal to the standard deviation of the earthquake stress drop divided by the viscosity times the tectonic loading rate. This definition of W is modified from the original one used in Kenner and Simons [2005] by using the standard deviation of the stress drop instead of the mean stress drop. We also use a new, more appropriate, metric to measure the amount of temporal clustering of the system. W is the ratio of the viscoelastic relaxation rate of the system to the tectonic loading rate of the system. For values of
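The Wallace number defined in this record can be sketched numerically. The viscosity, strain rate, and stress-drop values below are hypothetical round numbers chosen only to illustrate the units (the denominator carries Pa, so W is dimensionless), not values from the study:

```python
import statistics

def wallace_number(stress_drops_pa, viscosity_pa_s, strain_rate_per_s):
    """W = std(earthquake stress drop) / (viscosity * tectonic loading rate).

    With stress drops in Pa, viscosity in Pa·s and a loading (strain) rate
    in 1/s, the denominator has units of Pa, making W non-dimensional.
    """
    sigma = statistics.pstdev(stress_drops_pa)   # std dev, not the mean
    return sigma / (viscosity_pa_s * strain_rate_per_s)

# Hypothetical earthquake sequence: stress drops around 3 MPa, a
# lower-crustal viscosity of 1e19 Pa·s, and a strain rate of 1e-14 1/s.
drops = [2.0e6, 3.0e6, 4.0e6, 3.0e6]   # Pa; population std dev ≈ 0.71 MPa
W = wallace_number(drops, 1.0e19, 1.0e-14)
```

Using the standard deviation of the stress drop, rather than the mean as in the original definition, follows the modification described above.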

11. Earthquakes; July-August, 1978

Person, W.J.

1979-01-01

Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California.

12. Earthquakes, September-October 1986

Person, W.J.

1987-01-01

There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

13. Sun, Moon and Earthquakes

NASA Astrophysics Data System (ADS)

Kolvankar, V. G.

2013-12-01

During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's footprint on earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.98, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.

14. NRIAG's Effort to Mitigate Earthquake Disasters in Egypt Using GPS and Seismic Data

NASA Astrophysics Data System (ADS)

Mahmoud, Salah

It has been estimated that, during historical times, more than 50 million people have lost their lives in earthquakes, either during the ground shaking and its effects, such as soil amplification and/or liquefaction, landslides and tsunamis, or in its immediate aftereffects, such as fires. The distribution of population generally takes no account of earthquake risk, at least on a large scale. An earthquake may be large but not destructive; on the other hand, an earthquake may be destructive but not large. The absence of correlation is due to the fact that a great number of other factors enter into consideration: first of all, the location of the earthquake in relation to populated areas, and also soil conditions and building construction. Soil liquefaction has been identified as the underlying phenomenon for many ground failures, settlements and lateral spreads, which are a major cause of damage to soil structures and building foundations in many events. Egypt has suffered numerous destructive earthquakes, such as the Kalabsha earthquake (1981, Mag 5.4) near Aswan city and the High Dam, the Dahshour earthquake (1992, Mag 5.9) near Cairo city and the Aqaba earthquake (1995, Mag 7.2). As the category of earthquake damage includes all the phenomena related to direct and indirect damage, the Egyptian authorities make a great effort to mitigate earthquake disasters. The seismicity, especially in the zones of high activity, is investigated in detail in order to delineate the active source zones, not only by the Egyptian National Seismic Network (ENSN) but also by the local seismic networks at Aswan, Hurghada, Aqaba, Abu Dabbab and Dabbaa. On the other hand, studies of soil conditions, soil amplification, soil-structure interaction, liquefaction and seismic hazard are carried out, in particular in the urbanized areas and the regions near the source zones. All these parameters are integrated to produce the Egyptian building code, which makes it possible to construct buildings that resist damage and consequently mitigate the earthquake

15. Toward standardization of slow earthquake catalog -Development of database website-

NASA Astrophysics Data System (ADS)

Kano, M.; Aso, N.; Annoura, S.; Arai, R.; Ito, Y.; Kamaya, N.; Maury, J.; Nakamura, M.; Nishimura, T.; Obana, K.; Sugioka, H.; Takagi, R.; Takahashi, T.; Takeo, A.; Yamashita, Y.; Matsuzawa, T.; Ide, S.; Obara, K.

2017-12-01

This work is supported by JSPS KAKENHI Grant Numbers JP16H06472, JP16H06473, JP16H06474, JP16H06477 in Scientific Research on Innovative Areas "Science of Slow Earthquakes", and JP15K17743 in Grant-in-Aid for Young Scientists (B).

16. Earthquakes and emergence

NASA Astrophysics Data System (ADS)

Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

17. Real-Time Earthquake Intensity Estimation Using Streaming Data Analysis of Social and Physical Sensors

NASA Astrophysics Data System (ADS)

Kropivnitskaya, Yelena; Tiampo, Kristy F.; Qin, Jinhui; Bauer, Michael A.

2017-06-01

Earthquake intensity is one of the key components of the decision-making process for disaster response and emergency services. Accurate and rapid intensity calculations can help to reduce total loss and the number of casualties after an earthquake. Modern intensity assessment procedures handle a variety of information sources, which can be divided into two main categories. The first type of data is that derived from physical sensors, such as seismographs and accelerometers, while the second type consists of data obtained from social sensors, such as witness observations of the consequences of the earthquake itself. Estimation approaches using additional data sources or that combine sources from both data types tend to increase intensity uncertainty due to human factors and inadequate procedures for temporal and spatial estimation, resulting in precision errors in both time and space. Here we present a processing approach for the real-time analysis of streams of data from both source types. The physical sensor data is acquired from the U.S. Geological Survey (USGS) seismic network in California and the social sensor data is based on Twitter user observations. First, empirical relationships between tweet rate and observed Modified Mercalli Intensity (MMI) are developed using data from the M6.0 South Napa, CA earthquake that occurred on August 24, 2014. Second, the streams of both data types are analyzed together in simulated real-time to produce one intensity map. The second implementation is based on IBM InfoSphere Streams, a cloud platform for real-time analytics of big data. To handle large processing workloads for data from various sources, it is deployed and run on a cloud-based cluster of virtual machines. We compare the quality and evolution of intensity maps from different data sources over 10-min time intervals immediately following the earthquake. Results from the joint analysis show that it provides more complete coverage, with better accuracy and higher
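An empirical tweet-rate-to-intensity relationship of the kind described in this record is often modeled as linear in the logarithm of the rate. The sketch below fits such a relationship by ordinary least squares; the data points and coefficients are synthetic illustrations, not values derived from the South Napa earthquake:

```python
import math

def fit_log_linear(rates, mmi_values):
    """Least-squares fit of MMI = a + b * log10(tweet rate)."""
    xs = [math.log10(r) for r in rates]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(mmi_values) / n
    # closed-form simple linear regression on (log10 rate, MMI)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, mmi_values)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Synthetic observations: tweet rates (tweets/min) and observed MMI,
# generated from MMI = 2.0 + 1.5 * log10(rate) purely for illustration.
rates = [10, 100, 1000]
mmi = [3.5, 5.0, 6.5]
a, b = fit_log_linear(rates, mmi)   # recovers a = 2.0, b = 1.5
```

Once calibrated, such a relationship lets a streaming pipeline convert an observed tweet rate into an MMI estimate for map cells that lack physical sensors.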

18. 2016 update on induced earthquakes in the United States

Petersen, Mark D.

2016-01-01

During the past decade, people living in numerous locations across the central U.S. experienced many more small to moderate sized earthquakes than ever before. This earthquake activity began increasing about 2009 and peaked during 2015 and into early 2016. For example, prior to 2009 Oklahoma typically experienced 1 or 2 small earthquakes per year with magnitude greater than 3.0, but by 2015 this number rose to over 900 earthquakes per year of that size and over 30 earthquakes greater than 4.0. These earthquakes can cause damage. In 2011 a magnitude 5.6 earthquake struck near the town of Prague, Oklahoma on a preexisting fault and caused severe damage to several houses and school buildings. During the past 6 years, more than 1500 reports of damaging shaking levels were received in areas of induced seismicity. This rapid increase and the potential for damaging ground shaking from induced earthquakes alarmed about 8 million people living nearby and officials responsible for public safety. They wanted to understand why earthquakes were increasing and the potential threats to society and buildings located nearby.

19. Populating the Advanced National Seismic System Comprehensive Earthquake Catalog

NASA Astrophysics Data System (ADS)

Earle, P. S.; Perry, M. R.; Andrews, J. R.; Withers, M. M.; Hellweg, M.; Kim, W. Y.; Shiro, B.; West, M. E.; Storchak, D. A.; Pankow, K. L.; Huerfano Moreno, V. A.; Gee, L. S.; Wolfe, C. J.

2016-12-01

The U.S. Geological Survey maintains a repository of earthquake information produced by networks in the Advanced National Seismic System with additional data from the ISC-GEM catalog and many non-U.S. networks through their contributions to the National Earthquake Information Center PDE bulletin. This Comprehensive Catalog (ComCat) provides a unified earthquake product while preserving attribution and contributor information. ComCat contains hypocenter and magnitude information with supporting phase arrival-time and amplitude measurements (when available). Higher-level products such as focal mechanisms, earthquake slip models, "Did You Feel It?" reports, ShakeMaps, PAGER impact estimates, earthquake summary posters, and tectonic summaries are also included. ComCat is updated as new events are processed and the catalog can be accessed at http://earthquake.usgs.gov/earthquakes/search/. Throughout the past few years, a concentrated effort has been underway to expand ComCat by integrating global and regional historic catalogs. The number of earthquakes in ComCat has more than doubled in the past year and it presently contains over 1.6 million earthquake hypocenters. We will provide an overview of catalog contents and a detailed description of numerous tools and semi-automated quality-control procedures developed to uncover errors including systematic magnitude biases, missing time periods, duplicate postings for the same events, and incorrectly associated events.
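ComCat can also be queried programmatically through the USGS FDSN event web service that backs the search page mentioned above. A minimal sketch, assuming the standard `fdsnws/event/1/query` endpoint with its `format`, `starttime`, `endtime` and `minmagnitude` parameters:

```python
from urllib.parse import urlencode

# Base endpoint of the USGS FDSN event service behind ComCat search.
COMCAT_URL = "https://earthquake.usgs.gov/fdsnws/event/1/query"

def build_comcat_query(start, end, min_magnitude):
    """Build a ComCat query URL for events in [start, end] above a magnitude."""
    params = {
        "format": "geojson",        # also: csv, text, quakeml
        "starttime": start,
        "endtime": end,
        "minmagnitude": min_magnitude,
    }
    return COMCAT_URL + "?" + urlencode(params)

# Example: all M >= 6 events during 2015; fetch the URL with urllib or
# requests to receive a GeoJSON feature collection of hypocenters.
url = build_comcat_query("2015-01-01", "2015-12-31", 6)
```

The service supports many more filters (region bounds, depth, event type); consult the USGS API documentation before relying on a specific parameter.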

20. Earthquake Ground Motion Selection

DOT National Transportation Integrated Search

2012-05-01

Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

1. Investigating Landslides Caused by Earthquakes A Historical Review

NASA Astrophysics Data System (ADS)

Keefer, David K.

Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing ``retrospective'' analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

2. Pre-Earthquake Unipolar Electromagnetic Pulses

NASA Astrophysics Data System (ADS)

Scoville, J.; Freund, F.

2013-12-01

Transient ultralow frequency (ULF) electromagnetic (EM) emissions have been reported to occur before earthquakes [1,2]. They suggest powerful transient electric currents flowing deep in the crust [3,4]. Prior to the M=5.4 Alum Rock earthquake of Oct. 21, 2007 in California, a QuakeFinder triaxial search-coil magnetometer located about 2 km from the epicenter recorded unusual unipolar pulses with the approximate shape of a half-cycle of a sine wave, reaching amplitudes up to 30 nT. The number of these unipolar pulses increased as the day of the earthquake approached. These pulses clearly originated around the hypocenter. The same pulses have since been recorded prior to several medium to moderate earthquakes in Peru, where they have been used to triangulate the location of the impending earthquakes [5]. To understand the mechanism of the unipolar pulses, we first have to address the question of how single current pulses can be generated deep in the Earth's crust. Key to this question appears to be the break-up of peroxy defects in the rocks at the hypocenter as a result of the increase in tectonic stresses prior to an earthquake. We investigate the mechanism of the unipolar pulses by coupling the drift-diffusion model of semiconductor theory to Maxwell's equations, thereby producing a model that describes the rock volume generating the pulses in terms of electromagnetism and semiconductor physics. The system of equations is then solved numerically to explore the electromagnetic radiation associated with drift-diffusion currents of electron-hole pairs. [1] Sharma, A. K., P. A. V., and R. N. Haridas (2011), Investigation of ULF magnetic anomaly before moderate earthquakes, Exploration Geophysics 43, 36-46. [2] Hayakawa, M., Y. Hobara, K. Ohta, and K. Hattori (2011), The ultra-low-frequency magnetic disturbances associated with earthquakes, Earthquake Science, 24, 523-534. [3] Bortnik, J., T. E. Bleier, C. Dunson, and F. Freund (2010), Estimating the seismotelluric current

3. Becoming an Officer of Consequence

DTIC Science & Technology

2007-01-01

ndupress.ndu.edu, issue 44, 1st quarter 2007 / JFQ. Much of the literature about military history...commanders become officers of consequence because their commanders value their judgment and seek their counsel when making difficult choices...

4. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

PubMed

Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

2016-05-10

We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.

5. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

PubMed Central

Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

2016-01-01

We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

6. Earthquake education in California

MacCabe, M. P.

1980-01-01

In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern.

7. Fractals and Forecasting in Earthquakes and Finance

NASA Astrophysics Data System (ADS)

Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.

2011-12-01

It is now recognized that Benoit Mandelbrot's fractals play a critical role in describing a vast range of physical and social phenomena. Here we focus on two systems, earthquakes and finance. Since 1942, earthquakes have been characterized by the Gutenberg-Richter magnitude-frequency relation, which in more recent times is often written as a moment-frequency power law. A similar relation can be shown to hold for financial markets. Moreover, a recent New York Times article, titled "A Richter Scale for the Markets" [1], summarized the emerging viewpoint that stock market crashes can be described with similar ideas as large and great earthquakes. The idea that stock market crashes can be related in any way to earthquake phenomena has its roots in Mandelbrot's 1963 work on speculative prices in commodities markets such as cotton [2]. He pointed out that Gaussian statistics did not account for the excessive number of booms and busts that characterize such markets. Here we show that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model, involving the presence of a classical limit of stability, or spinodal. These metastable systems are characterized by fractal statistics near the spinodal. For earthquakes, the independent ("order") parameter is the slip deficit along a fault, whereas for the financial markets, it is the financial leverage in place. For financial markets, asset values play the role of a free energy. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In the case of financial models, the probabilities are closely related to implied volatility, an important component of Black-Scholes models for stock valuations. [2] B. Mandelbrot, The variation of certain speculative prices, J. Business, 36, 294 (1963)

8. Injection-induced earthquakes

Ellsworth, William L.

2013-01-01

Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

9. Charles Darwin's earthquake reports

NASA Astrophysics Data System (ADS)

Galiev, Shamil

2010-05-01

2009 marked the 200th anniversary of Darwin's birth and 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke; and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after the event. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the plate tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of the latest publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

10. Simulating Earthquake Early Warning Systems in the Classroom as a New Approach to Teaching Earthquakes

NASA Astrophysics Data System (ADS)

D'Alessio, M. A.

2010-12-01

A discussion of P- and S-waves seems a ubiquitous part of studying earthquakes in the classroom. Textbooks from middle school through university level typically define the differences between the waves and illustrate the sense of motion. While many students successfully memorize the differences between wave types (often utilizing the first letter as a memory aide), textbooks rarely give tangible examples of how the two waves would "feel" to a person sitting on the ground. One reason for introducing the wave types is to explain how to calculate earthquake epicenters using seismograms and travel time charts -- very abstract representations of earthquakes. Even when the skill is mastered using paper-and-pencil activities or one of the excellent online interactive versions, locating an epicenter simply does not excite many of our students because it evokes little emotional impact, even in students located in earthquake-prone areas. Despite these limitations, huge numbers of students are mandated to complete the task. At the K-12 level, California requires that all students be able to locate earthquake epicenters in Grade 6; in New York, the skill is a required part of the Regent's Examination. Recent innovations in earthquake early warning systems around the globe give us the opportunity to address the same content standard, but with substantially more emotional impact on students. I outline a lesson about earthquakes focused on earthquake early warning systems. The introductory activities include video clips of actual earthquakes and emphasize the differences between the way P- and S-waves feel when they arrive (P arrives first, but is weaker). I include an introduction to the principle behind earthquake early warning (including a summary of possible uses of a few seconds warning about strong shaking) and show examples from Japan. Students go outdoors to simulate P-waves, S-waves, and occupants of two different cities who are talking to one another on cell phones

11. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

NASA Astrophysics Data System (ADS)

Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

2014-05-01

In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on a rapid estimate of the P-wave magnitude, which generally carries large uncertainties and a known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. Subsequent updates of the magnitude even decreased to M6.3-6.6. The magnitude estimate finally stabilized at M8.1 after about two minutes, which consequently led to underestimated tsunami heights. Using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would have been theoretically possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique can help reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

12. Nowcasting Earthquakes and Tsunamis

NASA Astrophysics Data System (ADS)

Rundle, J. B.; Turcotte, D. L.

2017-12-01

The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
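The counting idea in this record can be sketched in a few lines. This is an illustrative reading of the method, not the authors' implementation: it assumes the nowcast value (often called an earthquake potential score) is the empirical cumulative fraction of historical small-event counts between successive large events in the large region, evaluated at the current count in the small region; the function name and inputs are hypothetical.

```python
def nowcast_eps(past_counts, current_count):
    """Earthquake potential score (sketch): the fraction of historical
    small-event interevent counts -- gathered between successive large
    earthquakes in the surrounding large region -- that do not exceed the
    current small-event count observed in the small target region.
    A value near 1 suggests the region is late in its earthquake cycle."""
    if not past_counts:
        raise ValueError("need at least one historical interevent count")
    return sum(1 for c in past_counts if c <= current_count) / len(past_counts)

# Illustrative use: four historical interevent counts, 25 small events so far.
score = nowcast_eps([10, 20, 30, 40], 25)  # -> 0.5
```

Because the score is a rank within the large-region statistics, it needs no fitted model parameters, which is the advantage the abstract emphasizes.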

13. An evaluation of Health of the Nation Outcome Scales data to inform psychiatric morbidity following the Canterbury earthquakes.

PubMed

Beaglehole, Ben; Frampton, Chris M; Boden, Joseph M; Mulder, Roger T; Bell, Caroline J

2017-11-01

Following the onset of the Canterbury, New Zealand earthquakes, there were widespread concerns that mental health services were under severe strain as a result of adverse consequences on mental health. We therefore examined Health of the Nation Outcome Scales data to see whether this could inform our understanding of the impact of the Canterbury earthquakes on patients attending local specialist mental health services. Health of the Nation Outcome Scales admission data were analysed for Canterbury mental health services prior to and following the Canterbury earthquakes. These findings were compared to Health of the Nation Outcome Scales admission data from seven other large District Health Boards to delineate local from national trends. Percentage changes in admission numbers were also calculated before and after the earthquakes for Canterbury and the seven other large district health boards. Admission Health of the Nation Outcome Scales scores in Canterbury increased after the earthquakes for adult inpatient and community services, old age inpatient and community services, and Child and Adolescent inpatient services compared to the seven other large district health boards. Admission Health of the Nation Outcome Scales scores for Child and Adolescent community services did not change significantly, while admission Health of the Nation Outcome Scales scores for Alcohol and Drug services in Canterbury fell compared to other large district health boards. Subscale analysis showed that the majority of Health of the Nation Outcome Scales subscales contributed to the overall increases found. Percentage changes in admission numbers for the Canterbury District Health Board and the seven other large district health boards before and after the earthquakes were largely comparable with the exception of admissions to inpatient services for the group aged 4-17 years which showed a large increase. The Canterbury earthquakes were followed by an increase in Health of the Nation

14. Computing Earthquake Probabilities on Global Scales

NASA Astrophysics Data System (ADS)

Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

2016-03-01

Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
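The count-to-probability step described above can be sketched as follows. This is a minimal illustration, assuming a two-parameter Weibull cumulative law in the small-event count n; the scale `tau` (in counts) and shape `beta` are illustrative placeholders, since the abstract does not give the fitted values:

```python
import math

def weibull_event_probability(n_small, tau, beta):
    """Convert the number of small events since the last large event into a
    probability that the next large event has 'arrived' by this count,
    using a Weibull cumulative law: P = 1 - exp(-(n/tau)**beta).
    tau is the scale (in event counts) and beta the shape parameter;
    both would be fit to the regional catalog in practice."""
    return 1.0 - math.exp(-((n_small / tau) ** beta))

# Probability grows monotonically with the count since the last large event:
p_early = weibull_event_probability(5, tau=10.0, beta=1.5)
p_late = weibull_event_probability(20, tau=10.0, beta=1.5)
```

For beta = 1 the law reduces to an exponential (memoryless) count distribution; beta > 1 makes a large event increasingly "overdue" as small events accumulate.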

15. Extending earthquakes' reach through cascading.

PubMed

Marsan, David; Lengliné, Olivier

2008-02-22

Earthquakes, whatever their size, can trigger other earthquakes. Mainshocks cause aftershocks to occur, which in turn activate their own local aftershock sequences, resulting in a cascade of triggering that extends the reach of the initial mainshock. A long-standing difficulty is to determine which earthquakes are connected, either directly or indirectly. Here we show that this causal structure can be found probabilistically, with no a priori model or parameterization. Large regional earthquakes are found to have a short direct influence in comparison to the overall aftershock sequence duration. Relative to these large mainshocks, small earthquakes collectively have a greater effect on triggering. Hence, cascade triggering is a key component in earthquake interactions.

16. Comparison of aftershock sequences between 1975 Haicheng earthquake and 1976 Tangshan earthquake

NASA Astrophysics Data System (ADS)

Liu, B.

2017-12-01

The 1975 ML 7.3 Haicheng earthquake and the 1976 ML 7.8 Tangshan earthquake occurred in the same tectonic unit. There are significant differences in the spatial-temporal distribution, number of aftershocks, and duration of the aftershock sequences that followed these two main shocks. Aftershocks can be triggered by the change in regional seismicity produced by the main shock, which is caused by the Coulomb stress perturbation. Based on the rate- and state-dependent friction law, we quantitatively estimated the possible aftershock duration using seismicity data and compared the results from different approaches. The results indicate that the aftershock duration of the Tangshan main shock is several times that of the Haicheng main shock. This can be explained by the significant relationship between aftershock duration and earthquake nucleation history, normal stress, and shear stress loading rate on the fault. The obvious difference in nucleation history between these two main shocks lies in the foreshocks: the 1975 Haicheng earthquake had a clear and long foreshock sequence, while the 1976 Tangshan earthquake did not. Abundant foreshocks may indicate a long and active nucleation process that changed (weakened) the rocks in the source region, so such events should have shorter aftershock sequences because stress in weak rocks decays faster.

17. Earthquake triggering in southeast Africa following the 2012 Indian Ocean earthquake

NASA Astrophysics Data System (ADS)

Neves, Miguel; Custódio, Susana; Peng, Zhigang; Ayorinde, Adebayo

2018-02-01

In this paper we present evidence of earthquake dynamic triggering in southeast Africa. We analysed seismic waveforms recorded at 53 broad-band and short-period stations in order to identify possible increases in the rate of microearthquakes and tremor due to the passage of teleseismic waves generated by the Mw8.6 2012 Indian Ocean earthquake. We found evidence of triggered local earthquakes and no evidence of triggered tremor in the region. We assessed the statistical significance of the increase in the number of local earthquakes using β-statistics. Statistically significant dynamic triggering of local earthquakes was observed at 7 out of the 53 analysed stations. Two of these stations are located in the northeast coast of Madagascar and the other five stations are located in the Kaapvaal Craton, southern Africa. We found no evidence of dynamically triggered seismic activity in stations located near the structures of the East African Rift System. Hydrothermal activity exists close to the stations that recorded dynamic triggering, however, it also exists near the East African Rift System structures where no triggering was observed. Our results suggest that factors other than solely tectonic regime and geothermalism are needed to explain the mechanisms that underlie earthquake triggering.
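The β-statistic used in this record is a standard measure of seismicity-rate change (commonly attributed to Matthews and Reasenberg): it compares the observed event count in a window after the teleseismic wave arrival with the count expected under a uniform (Poisson) rate estimated from the whole observation period. A sketch, with illustrative inputs only:

```python
import math

def beta_statistic(n_after, n_total, t_after, t_total):
    """Beta-statistic for dynamic triggering (sketch).
    n_after: events observed in the window of length t_after after the
             teleseismic arrival; n_total: events in the full period t_total.
    Under a uniform rate, n_after is binomial with p = t_after / t_total;
    beta is the observed count standardized by that mean and variance.
    Values of roughly beta > 2 are often taken as statistically significant."""
    p = t_after / t_total
    expected = n_total * p
    variance = n_total * p * (1.0 - p)
    return (n_after - expected) / math.sqrt(variance)

# 30 events in the post-arrival fifth of a period that held 100 events total:
b = beta_statistic(n_after=30, n_total=100, t_after=1.0, t_total=5.0)  # -> 2.5
```

Station-by-station values of this statistic are what allow a claim like "7 out of 53 stations show statistically significant triggering" to be made quantitatively.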

18. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions

PubMed Central

Burro, Roberto; Hall, Rob

2017-01-01

A major earthquake has a potentially highly traumatic impact on children's psychological functioning. However, while many studies on children describe negative consequences in terms of mental health and psychiatric disorders, little is known about how the developmental processes of emotions can be affected following exposure to disasters. Objectives: We explored whether and how exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. Method: The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. Data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children's understanding of emotions and regulation abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. Results: We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Conclusions: Our data extend the generalizability of theoretical models of children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and

19. Thermal Radiation Anomalies Associated with Major Earthquakes

NASA Technical Reports Server (NTRS)

Ouzounov, Dimitar; Pulinets, Sergey; Kafatos, Menas C.; Taylor, Patrick

2017-01-01

Recent developments of remote sensing methods for Earth satellite data analysis contribute to our understanding of earthquake-related thermal anomalies. It was realized that the thermal heat fluxes over areas of earthquake preparation are a result of air ionization by radon (and other gases) and consequent water vapor condensation on the newly formed ions. Latent heat (LH) is released as a result of this process and leads to the formation of local thermal radiation anomalies (TRA) known as OLR (outgoing longwave radiation; Ouzounov et al., 2007). We compare the LH energy, obtained by integrating the surface latent heat flux (SLHF) over area and time, with the released energies associated with these events. Extended studies of the TRA using data from the most recent major earthquakes allowed the main morphological features to be established. It was also established that the TRA are part of a more complex chain of short-term pre-earthquake processes, which is explained within the framework of lithosphere-atmosphere coupling.

20. Impact of traumatic loss on post-traumatic spectrum symptoms in high school students after the L'Aquila 2009 earthquake in Italy.

PubMed

Dell'OSso, L; Carmassi, C; Massimetti, G; Conversano, C; Daneluzzo, E; Riccardi, I; Stratta, P; Rossi, A

2011-11-01

On April 6th 2009, the town of L'Aquila, Italy, was struck by an earthquake (6.3 on the Richter scale) that destroyed large parts of the town and caused the death of 309 people. Significant losses in the framework of earthquakes have been reported as a major risk factor for PTSD development. The aim of this study was to investigate post-traumatic spectrum symptoms in a sample of adolescents exposed to the L'Aquila 2009 earthquake 21 months earlier, with particular attention to the impact of loss. 475 students (203 women and 272 men), attending the last year of high school in L'Aquila, were assessed by the Trauma and Loss Spectrum-Self Report (TALS-SR) and the Impact of Event Scale (IES). The presence of full and partial PTSD was also assessed. 72 students (15.2%) reported the loss of a close friend or relative in the framework of the earthquake. Full PTSD was reported by 146 (30.7%) students and partial PTSD by 149 (31.4%) students. There was a significant difference in PTSD between bereaved and non-bereaved subjects, with significantly higher post-traumatic symptom levels reported by bereaved subjects. The lack of information on the relationship with the deceased and the number of losses experienced, besides the use of self-report instruments, are the limitations of this study. Our results show high rates of post-traumatic spectrum symptoms in adolescents who survived the L'Aquila earthquake. Having experienced the loss of a close friend or a relative in the framework of the earthquake seems to be related to higher PTSD rates and more severe symptomatology. These results highlight the need to carefully explore adolescents exposed to a significant loss as a consequence of an earthquake.

1. Global Instrumental Seismic Catalog: earthquake relocations for 1900-present

NASA Astrophysics Data System (ADS)

Villasenor, A.; Engdahl, E.; Storchak, D. A.; Bondar, I.

2010-12-01

We present the current status of our efforts to produce a set of homogeneous earthquake locations and improved focal depths towards the compilation of a Global Catalog of instrumentally recorded earthquakes that will be complete down to the lowest magnitude threshold possible on a global scale and for the time period considered. This project is currently being carried out under the auspices of GEM (Global Earthquake Model). The resulting earthquake catalog will be a fundamental dataset not only for earthquake risk modeling and assessment on a global scale, but also for a large number of studies such as global and regional seismotectonics; the rupture zones and return time of large, damaging earthquakes; the spatial-temporal pattern of moment release along seismic zones and faults etc. Our current goal is to re-locate all earthquakes with available station arrival data using the following magnitude thresholds: M5.5 for 1964-present, M6.25 for 1918-1963, M7.5 (complemented with significant events in continental regions) for 1900-1917. Phase arrival time data for earthquakes after 1963 are available in digital form from the International Seismological Centre (ISC). For earthquakes in the time period 1918-1963, phase data is obtained by scanning the printed International Seismological Summary (ISS) bulletins and applying optical character recognition routines. For earlier earthquakes we will collect phase data from individual station bulletins. We will illustrate some of the most significant results of this relocation effort, including aftershock distributions for large earthquakes, systematic differences in epicenter and depth with respect to previous location, examples of grossly mislocated events, etc.

2. Acute Myocardial Infarction and Stress Cardiomyopathy following the Christchurch Earthquakes

PubMed Central

Chan, Christina; Elliott, John; Troughton, Richard; Frampton, Christopher; Smyth, David; Crozier, Ian; Bridgman, Paul

2013-01-01

Background Christchurch, New Zealand, was struck by 2 major earthquakes: at 4:36 am on 4 September 2010, magnitude 7.1, and at 12:51 pm on 22 February 2011, magnitude 6.3. Both events caused widespread destruction. Christchurch Hospital was the region's only acute care hospital. It remained functional following both earthquakes. We were able to examine the effects of the 2 earthquakes on acute cardiac presentations. Methods Patients admitted under Cardiology in Christchurch Hospital 3 weeks prior to and 5 weeks following both earthquakes were analysed, with corresponding control periods in September 2009 and February 2010. Patients were categorised based on diagnosis: ST elevation myocardial infarction, non-ST elevation myocardial infarction, stress cardiomyopathy, unstable angina, stable angina, non-cardiac chest pain, arrhythmia and others. Results There was a significant increase in overall admissions (p<0.003), ST elevation myocardial infarction (p<0.016), and non-cardiac chest pain (p<0.022) in the first 2 weeks following the early morning September earthquake. This pattern was not seen after the early afternoon February earthquake. Instead, there was a very large number of stress cardiomyopathy admissions, with 21 cases (95% CI 2.6–6.4) in 4 days. There had been 6 stress cardiomyopathy cases after the first earthquake (95% CI 0.44–2.62). Statistical analysis showed this to be a significant difference between the earthquakes (p<0.05). Conclusion The early morning September earthquake triggered a large increase in ST elevation myocardial infarction and a few stress cardiomyopathy cases. The early afternoon February earthquake caused significantly more stress cardiomyopathy. Two major earthquakes occurring at different times of day differed in their effect on acute cardiac events. PMID:23844213

3. Acute myocardial infarction and stress cardiomyopathy following the Christchurch earthquakes.

PubMed

Chan, Christina; Elliott, John; Troughton, Richard; Frampton, Christopher; Smyth, David; Crozier, Ian; Bridgman, Paul

2013-01-01

Christchurch, New Zealand, was struck by 2 major earthquakes: at 4:36 am on 4 September 2010, magnitude 7.1, and at 12:51 pm on 22 February 2011, magnitude 6.3. Both events caused widespread destruction. Christchurch Hospital was the region's only acute care hospital. It remained functional following both earthquakes. We were able to examine the effects of the 2 earthquakes on acute cardiac presentations. Patients admitted under Cardiology in Christchurch Hospital 3 weeks prior to and 5 weeks following both earthquakes were analysed, with corresponding control periods in September 2009 and February 2010. Patients were categorised based on diagnosis: ST elevation myocardial infarction, non-ST elevation myocardial infarction, stress cardiomyopathy, unstable angina, stable angina, non-cardiac chest pain, arrhythmia and others. There was a significant increase in overall admissions (p<0.003), ST elevation myocardial infarction (p<0.016), and non-cardiac chest pain (p<0.022) in the first 2 weeks following the early morning September earthquake. This pattern was not seen after the early afternoon February earthquake. Instead, there was a very large number of stress cardiomyopathy admissions, with 21 cases (95% CI 2.6-6.4) in 4 days. There had been 6 stress cardiomyopathy cases after the first earthquake (95% CI 0.44-2.62). Statistical analysis showed this to be a significant difference between the earthquakes (p<0.05). The early morning September earthquake triggered a large increase in ST elevation myocardial infarction and a few stress cardiomyopathy cases. The early afternoon February earthquake caused significantly more stress cardiomyopathy. Two major earthquakes occurring at different times of day differed in their effect on acute cardiac events.
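The contrast between 21 stress cardiomyopathy cases after the February earthquake and 6 after the September one can be checked with a standard conditional test: given the 27 total cases, equal underlying rates would make each case equally likely to follow either event (p = 0.5). The sketch below uses SciPy for this generic check; it is an illustration, not necessarily the authors' analysis.

```python
from scipy.stats import binomtest

# 21 of the 27 total stress cardiomyopathy cases followed the February
# earthquake. Under equal rates for the two events, each case would be
# a fair coin flip between them, so we test 21/27 against p = 0.5.
# (Generic conditional test, not the paper's published method.)
res = binomtest(21, n=27, p=0.5, alternative="two-sided")
print(res.pvalue)  # well below the conventional 0.05 threshold
```

Conditioning on the total case count sidesteps having to model the underlying admission rates directly, which is why this form of comparison is a common first check for two Poisson counts.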

4. Oklahoma’s recent earthquakes and saltwater disposal

PubMed Central

Walsh, F. Rall; Zoback, Mark D.

2015-01-01

Over the past 5 years, parts of Oklahoma have experienced marked increases in the number of small- to moderate-sized earthquakes. In three study areas that encompass the vast majority of the recent seismicity, we show that the increases in seismicity follow 5- to 10-fold increases in the rates of saltwater disposal. Adjacent areas where there has been relatively little saltwater disposal have had comparatively few recent earthquakes. In the areas of seismic activity, the saltwater disposal principally comes from “produced” water, saline pore water that is coproduced with oil and then injected into deeper sedimentary formations. These formations appear to be in hydraulic communication with potentially active faults in crystalline basement, where nearly all the earthquakes are occurring. Although most of the recent earthquakes have posed little danger to the public, the possibility of triggering damaging earthquakes on potentially active basement faults cannot be discounted. PMID:26601200

5. Strike-slip earthquakes can also be detected in the ionosphere

NASA Astrophysics Data System (ADS)

Astafyeva, Elvira; Rolland, Lucie M.; Sladen, Anthony

2014-11-01

It is generally assumed that co-seismic ionospheric disturbances are generated by large vertical static displacements of the ground during an earthquake. Consequently, it is expected that co-seismic ionospheric disturbances are only observable after earthquakes with a significant dip-slip component, and that earthquakes dominated by strike-slip motion, i.e. with very little vertical co-seismic displacement, do not generate ionospheric perturbations. In this work, we use total electron content (TEC) measurements from ground-based GNSS receivers to study the ionospheric response to six of the largest recent strike-slip earthquakes: the Mw7.8 Kunlun earthquake of 14 November 2001, the Mw8.1 Macquarie earthquake of 23 December 2004, the Sumatra earthquake doublet, Mw8.6 and Mw8.2, of 11 April 2012, the Mw7.7 Balochistan earthquake of 24 September 2013 and the Mw7.7 Scotia Sea earthquake of 17 November 2013. We show that large strike-slip earthquakes generate large ionospheric perturbations with amplitudes comparable to those induced by dip-slip earthquakes of equivalent magnitude. We suggest that, in the absence of significant vertical static co-seismic displacements of the ground, other seismological parameters (primarily the magnitude of co-seismic horizontal displacements, the seismic fault dimensions and the seismic slip) may contribute to the generation of large-amplitude ionospheric perturbations.

6. Comparing methods for Earthquake Location

NASA Astrophysics Data System (ADS)

Turkaya, Semih; Bodin, Thomas; Sylvander, Matthieu; Parroucau, Pierre; Manchuel, Kevin

2017-04-01

There are plenty of methods available for locating small-magnitude point-source earthquakes. However, it is known that these different approaches produce different results. For each approach, results also depend on a number of parameters, which can be separated into two main branches: (1) parameters related to observations (their number and distribution, for example) and (2) parameters related to the inversion process (velocity model, weighting parameters, initial location, etc.). Currently, the results obtained from most location methods do not systematically include quantitative uncertainties. The effect of the selected parameters on location uncertainties is also poorly known. Understanding the importance of these different parameters and their effect on uncertainties is clearly required to better constrain fault geometry and seismotectonic processes and, ultimately, to improve seismic hazard assessment. In this work, carried out in the framework of the SINAPS@ research program (http://www.institut-seism.fr/projets/sinaps/), we analyse the effect of different parameters on earthquake location (e.g. type of phase, maximum hypocentral separation, etc.). We compare several available codes (Hypo71, HypoDD, NonLinLoc, etc.) and determine their strengths and weaknesses in different cases by means of synthetic tests. The work, performed for the moment on synthetic data, is planned to be applied, in a second step, to data collected by the Midi-Pyrénées Observatory (OMP).

7. Prevention of strong earthquakes: Goal or utopia?

NASA Astrophysics Data System (ADS)

Mukhamediev, Sh. A.

2010-11-01

In the present paper, we consider ideas suggesting various kinds of industrial impact on the close-to-failure block of the Earth’s crust in order to break a pending strong earthquake (PSE) into a number of smaller quakes or aseismic slips. Among the published proposals on the prevention of a forthcoming strong earthquake, methods based on water injection and vibro influence merit greater attention as they are based on field observations and the results of laboratory tests. In spite of this, the cited proofs are, for various reasons, insufficient to acknowledge the proposed techniques as highly substantiated; in addition, the physical essence of these methods has still not been fully understood. First, the key concept of the methods, namely, the release of the accumulated stresses (or excessive elastic energy) in the source region of a forthcoming strong earthquake, is open to objection. If we treat an earthquake as a phenomenon of a loss in stability, then, the heterogeneities of the physicomechanical properties and stresses along the existing fault or its future trajectory, rather than the absolute values of stresses, play the most important role. In the present paper, this statement is illustrated by the classical examples of stable and unstable fractures and by the examples of the calculated stress fields, which were realized in the source regions of the tsunamigenic earthquakes of December 26, 2004 near the Sumatra Island and of September 29, 2009 near the Samoa Island. Here, just before the earthquakes, there were no excessive stresses in the source regions. Quite the opposite, the maximum shear stresses τmax were close to their minimum value, compared to τmax in the adjacent territory. In the present paper, we provide quantitative examples that falsify the theory of the prevention of PSE in its current form. It is shown that the measures for the prevention of PSE, even when successful for an already existing fault, can trigger or accelerate a catastrophic

8. Earthquake impact scale

Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

2011-01-01

With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and the response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
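The dual green/yellow/orange/red thresholds described above lend themselves to a simple lookup. The sketch below encodes the stated fatality thresholds (1, 100, 1,000) and loss thresholds ($1M, $100M, $1B); the function name and interface are illustrative assumptions, not part of the PAGER system itself.

```python
def eis_alert(fatalities=None, dollar_loss=None):
    """Map estimated impact to an EIS-style alert colour.

    Illustrative sketch of the dual-criterion thresholds in the
    abstract: fatality thresholds 1/100/1,000 and loss thresholds
    $1M/$100M/$1B for yellow/orange/red. Not the PAGER implementation.
    """
    def level(value, thresholds):
        if value is None:
            return 0
        # Count how many thresholds the estimate reaches or exceeds.
        return sum(value >= t for t in thresholds)

    fatality_level = level(fatalities, (1, 100, 1000))
    loss_level = level(dollar_loss, (1e6, 1e8, 1e9))
    colours = ("green", "yellow", "orange", "red")
    # The alert is driven by whichever criterion is more severe.
    return colours[max(fatality_level, loss_level)]

print(eis_alert(fatalities=150))           # orange
print(eis_alert(dollar_loss=2e9))          # red
```

Taking the maximum of the two criteria reflects the abstract's rationale: casualty-driven alerts dominate where collapse rates are high, loss-driven alerts where resistant construction keeps fatalities low.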

9. Rupture, waves and earthquakes.

PubMed

Uenishi, Koji

2017-01-01

Normally, an earthquake is considered a phenomenon of wave energy radiation caused by the rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, is not yet fully understood. Instead, much former work in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

10. Rupture, waves and earthquakes

PubMed Central

UENISHI, Koji

2017-01-01

Normally, an earthquake is considered a phenomenon of wave energy radiation caused by the rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, is not yet fully understood. Instead, much former work in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but “extraordinary” phenomena that are difficult to explain with this conventional description have been found, for instance on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable. PMID:28077808

11. Earth's rotation variations and earthquakes 2010-2011

NASA Astrophysics Data System (ADS)

Ostřihanský, L.

2012-01-01

An earthquake occurred 19 years earlier, within one day of the 27 December 1985 earthquake, indicating that not only the sidereal 13.66-day variations but also the 19-year Metonic cycle sets the period of earthquake occurrence. Histograms show the regular change of earthquake positions on the branches of the length-of-day (LOD) graph; the shape of the histograms and the number of earthquakes on LOD branches originating from the mid-ocean ridge can also indicate which side of the ridge moves more quickly.

12. Rescaled earthquake recurrence time statistics: application to microrepeaters

NASA Astrophysics Data System (ADS)

Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru

2009-01-01

Slip on major faults primarily occurs during `characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the shortness of the characteristic-earthquake sequences available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur at the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times of several microrepeater sequences from Parkfield, CA, as well as NE Japan. We find that, once the respective sequence can be considered sufficiently stationary, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact with our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
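The rescaling idea, superimposing short sequences onto a common scale so that they pool into one larger sample, can be sketched as follows. Here each synthetic recurrence-time sequence is divided by its own mean before pooling, and a Weibull distribution is then fitted with SciPy; the exact rescaling convention (mean only) and the synthetic data are illustrative assumptions, simplified from the paper's use of means and standard deviations.

```python
import numpy as np
from scipy import stats

def pool_rescaled(sequences):
    """Divide each recurrence-time sequence by its own mean so that all
    sequences share a common scale (mean 1), then pool them into one
    sample. Simplified rescaling convention chosen for illustration."""
    return np.concatenate([np.asarray(s) / np.mean(s) for s in sequences])

rng = np.random.default_rng(0)
# Three synthetic 'microrepeater' sequences with very different mean
# recurrence times but the same underlying Weibull shape (c = 2).
seqs = [rng.weibull(2.0, 40) * m for m in (5.0, 12.0, 30.0)]
pooled = pool_rescaled(seqs)

# Fit a two-parameter Weibull (location fixed at zero) to the pooled
# sample; 120 points instead of 40 per individual sequence.
shape, loc, scale = stats.weibull_min.fit(pooled, floc=0)
print(round(shape, 2))  # recovered shape, close to the true value 2
```

The benefit is exactly the one the abstract describes: no single sequence is long enough to constrain the distribution well, but the rescaled superposition is.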

13. A prospective earthquake forecast experiment in the western Pacific

NASA Astrophysics Data System (ADS)

Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan

2012-09-01

Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
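A minimal sketch of the likelihood-based evaluation mentioned above: under the common CSEP assumption of an independent Poisson distribution in each space bin, a gridded rate forecast is scored by its joint log-likelihood given the observed target-earthquake counts. The variable names and the toy grid are illustrative, not the experiment's actual data.

```python
import numpy as np
from scipy import stats

def poisson_joint_loglik(forecast_rates, observed_counts):
    """Joint log-likelihood of a gridded rate forecast, assuming an
    independent Poisson count in each space bin (the standard
    assumption behind CSEP-style likelihood tests)."""
    rates = np.asarray(forecast_rates, dtype=float)
    counts = np.asarray(observed_counts)
    return stats.poisson.logpmf(counts, rates).sum()

# Toy 4-bin region: a model that concentrates rate where the events
# actually occur scores higher than a uniform one with the same total.
obs = np.array([0, 2, 0, 1])
smooth = np.full(4, 0.75)                  # uniform, 3 events expected
focused = np.array([0.1, 2.0, 0.1, 0.8])   # concentrated, same total
print(poisson_joint_loglik(smooth, obs) < poisson_joint_loglik(focused, obs))
```

Model comparison then reduces to differences of these log-likelihoods per event, which is essentially what the t-test and Wilcoxon comparisons in the abstract operate on.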

14. An efficient repeating signal detector to investigate earthquake swarms

NASA Astrophysics Data System (ADS)

Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

2016-08-01

Repetitive earthquake swarms have been recognized as key signatures in fluid injection induced seismicity, precursors to volcanic eruptions, and slow slip events preceding megathrust earthquakes. We investigate earthquake swarms by developing a Repeating Signal Detector (RSD), a computationally efficient algorithm utilizing agglomerative clustering to identify similar waveforms buried in years of seismic recordings using a single seismometer. Instead of relying on existing earthquake catalogs of larger earthquakes, RSD identifies characteristic repetitive waveforms by rapidly identifying signals of interest above a low signal-to-noise ratio and then grouping based on spectral and time domain characteristics, resulting in dramatically shorter processing time than more exhaustive autocorrelation approaches. We investigate seismicity in four regions using RSD: (1) volcanic seismicity at Mammoth Mountain, California, (2) subduction-related seismicity in Oaxaca, Mexico, (3) induced seismicity in Central Alberta, Canada, and (4) induced seismicity in Harrison County, Ohio. In each case, RSD detects a similar or larger number of earthquakes than existing catalogs created using more time intensive methods. In Harrison County, RSD identifies 18 seismic sequences that correlate temporally and spatially to separate hydraulic fracturing operations, 15 of which were previously unreported. RSD utilizes a single seismometer for earthquake detection which enables seismicity to be quickly identified in poorly instrumented regions at the expense of relying on another method to locate the new detections. Due to the smaller computation overhead and success at distances up to ~50 km, RSD is well suited for real-time detection of low-magnitude earthquake swarms with permanent regional networks.
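The grouping step of an RSD-style detector can be sketched with off-the-shelf tools: candidate signal windows are reduced to normalised amplitude spectra and then merged by agglomerative clustering, so that repeats of the same source fall into one cluster without any pairwise waveform autocorrelation. The feature choice, the distance threshold and the synthetic waveforms below are illustrative assumptions, not the published algorithm.

```python
import numpy as np
from scipy.cluster.hierarchy import fclusterdata

def group_repeaters(windows, n_bins=16, threshold=0.5):
    """Group candidate windows by spectral similarity using
    agglomerative clustering, in the spirit of RSD. Features are
    unit-norm low-frequency amplitude spectra (an assumption made
    for this sketch)."""
    feats = []
    for w in windows:
        spec = np.abs(np.fft.rfft(w))[:n_bins]
        feats.append(spec / np.linalg.norm(spec))
    # Average-linkage clustering cut at a fixed Euclidean distance.
    return fclusterdata(np.array(feats), t=threshold,
                        criterion="distance", method="average")

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 256)
template = np.sin(2 * np.pi * 12 * t) * np.exp(-4 * t)  # repeating source
wins = [template + 0.05 * rng.standard_normal(256) for _ in range(4)]
wins += [rng.standard_normal(256) for _ in range(2)]     # unrelated noise
labels = group_repeaters(wins)  # first four windows share one label
```

Clustering on compact spectral features is what keeps the cost far below exhaustive cross-correlation, which is the efficiency argument the abstract makes.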

15. The Long-term Impacts of Earthquakes on Economic Growth

NASA Astrophysics Data System (ADS)

Lackner, S.

2016-12-01

The social science literature has so far not reached a consensus on whether and how earthquakes impact economic growth in the long run. Several hypotheses have been suggested, and some even argue for a positive impact. A general weakness in the literature, however, is the predominant use of inadequate measures for the exogenous natural hazard of an earthquake. The most common problems are the lack of individual event size (e.g. an earthquake dummy or a count of events), the use of magnitude instead of a measure of surface shaking, and endogeneity issues when traditional qualitative intensity scales or actual impact data are used. Here we use peak ground acceleration (PGA) as the ground-motion intensity measure and investigate the impacts of earthquake shaking on long-run economic growth. We construct a data set from USGS ShakeMaps that can be considered the universe of globally relevant earthquake ground shaking from 1973 to 2014. This data set is then combined with World Bank GDP data to conduct a regression analysis. Furthermore, the impacts of PGA on different industries and on other economic variables such as employment and education are also investigated. This will, on the one hand, help identify the mechanism by which earthquakes impact long-run growth and, on the other, reveal potential impacts on welfare indicators that are not captured by GDP. This is the first application of global earthquake shaking data to investigate long-term earthquake impacts.

16. Measuring the effectiveness of earthquake forecasting in insurance strategies

NASA Astrophysics Data System (ADS)

Mignan, A.; Muir-Wood, R.

2009-04-01

Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine whether the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case, premiums are collected based on modelled technical risk costs, and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follows from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long-term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

17. Are you prepared for the next big earthquake in Alaska?

2006-01-01

Scientists have long recognized that Alaska has more earthquakes than any other region of the United States and is, in fact, one of the most seismically active areas of the world. The second-largest earthquake ever recorded shook the heart of southern Alaska on March 27th, 1964. The largest strike-slip earthquake in North America in almost 150 years occurred on the Denali Fault in central Alaska on November 3rd, 2002. “Great” earthquakes (larger than magnitude 8) have rocked the state on average once every 13 years since 1900. It is only a matter of time before another major earthquake impacts a large number of Alaskans. Alaska has changed significantly since the damaging 1964 earthquake, and the population has more than doubled. Many new buildings are designed to withstand intense shaking, some older buildings have been reinforced, and development has been discouraged in some particularly hazardous areas. Despite these precautions, future earthquakes may still cause damage to buildings, displace items within buildings, and disrupt the basic utilities that we take for granted. We must take every reasonable action to prepare for damaging earthquakes in order to lower these risks.

18. Earthquake warning system for Japan Railways’ bullet train; implications for disaster prevention in California

Nakamura, Y.; Tucker, B. E.

1988-01-01

Today, Japanese society is well aware of the prediction of the Tokai earthquake. It is estimated by the Tokyo municipal government that this predicted earthquake could kill 30,000 people. (This estimate is viewed by many as conservative; other Japanese government agencies have made estimates, but they have not been published.) The reduction in the number of deaths from 120,000 to 30,000 between the Kanto earthquake and the predicted Tokai earthquake is due in large part to the reduction in the proportion of wooden construction (houses).

19. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

NASA Astrophysics Data System (ADS)

Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

2017-04-01

Recently our understanding of tectonic faulting has been shaken by the discoveries of seismic tremor, low-frequency earthquakes, slow slip events, and other modes of fault slip. These phenomena represent modes of failure that were thought to be non-existent and theoretically impossible only a few years ago. Slow earthquakes are seismic phenomena in which the rupture of geological faults in the earth's crust occurs gradually without creating strong tremors. Despite the growing number of observations of slow earthquakes, their origin remains unresolved. Studies show that the duration of slow earthquakes ranges from a few seconds to a few hundred seconds. Whereas the regular earthquakes with which most people are familiar release a burst of built-up stress in seconds, slow earthquakes release energy in ways that do little damage. This study focuses on the characteristics of the Mw5.6 earthquake that occurred in the Sofia seismic zone on May 22nd, 2012. The Sofia area is the most populated, industrial and cultural region of Bulgaria and faces considerable earthquake risk. The Sofia seismic zone is located in south-western Bulgaria, an area with pronounced tectonic activity and proven crustal movement. In the 19th century the city of Sofia (situated in the centre of the Sofia seismic zone) experienced two strong earthquakes with epicentral intensity of 10 MSK. During the 20th century the strongest event in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK64). The 2012 quake occurred in an area characterized by a long quiescence (of 95 years) for moderate events. Moreover, a reduced number of small earthquakes have also been registered in the recent past. The Mw5.6 earthquake was largely felt on the territory of Bulgaria and in neighbouring countries. No casualties or severe injuries have been reported. Mostly moderate damage was observed in the cities of Pernik and Sofia and their surroundings. These observations could be assumed indicative for a

20. Linking giant earthquakes with the subduction of oceanic fracture zones

NASA Astrophysics Data System (ADS)

Landgrebe, T. C.; Müller, R. D.; EarthByte Group

2011-12-01

Giant subduction earthquakes are known to occur in areas not previously identified as prone to high seismic risk. This highlights the need to better identify subduction zone segments potentially dominated by relatively long (up to 1000 years and more) recurrence times of giant earthquakes. Global digital data sets represent a promising source of information for a multi-dimensional earthquake hazard analysis. We combine the NGDC global Significant Earthquakes database with a global strain rate map, gridded ages of the ocean floor, and a recently produced digital data set of oceanic fracture zones, major aseismic ridges and volcanic chains to investigate the association of earthquakes, as a function of magnitude, with the age of the downgoing slab and convergence rates. We use a so-called Top-N recommendation method, a technology originally developed to search, sort, classify, and filter very large and often statistically skewed data sets on the internet, to analyse the association of subduction earthquakes, sorted by magnitude, with key parameters. The Top-N analysis is used to progressively assess how strongly particular "tectonic niche" locations (e.g. locations along subduction zones intersected by aseismic ridges or volcanic chains) are associated with sets of earthquakes in sorted order in a given magnitude range. As the total number N of sorted earthquakes is increased, by progressively including smaller-magnitude events, the so-called recall is computed, defined as the number of Top-N earthquakes associated with particular target areas divided by N. The resulting statistical measure represents an intuitive description of the effectiveness of a given set of parameters in accounting for the location of significant earthquakes on record. We use this method to show that the occurrence of great (magnitude ≥ 8) earthquakes on overriding plate segments is strongly biased towards intersections of oceanic fracture zones with subduction zones. These intersection regions are
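The recall measure defined above (the number of Top-N earthquakes associated with the target areas, divided by N) is straightforward to compute once events are ranked by magnitude. The data layout below, magnitude/event-id pairs and a set of ids intersecting the niche, is an illustrative assumption for this sketch.

```python
def top_n_recall(earthquakes, in_target, n):
    """Top-N recall: rank events by magnitude (largest first), keep the
    top n, and report the fraction falling in the target 'tectonic
    niche' areas. `earthquakes` is a list of (magnitude, event_id)
    pairs; `in_target` is a set of event ids (illustrative layout)."""
    ranked = sorted(earthquakes, key=lambda e: e[0], reverse=True)[:n]
    hits = sum(1 for mag, eid in ranked if eid in in_target)
    return hits / n

# Toy catalogue: niche-associated events marked in the target set.
quakes = [(9.1, "a"), (8.8, "b"), (8.6, "c"), (8.3, "d"), (7.9, "e")]
niche = {"a", "c", "d"}
print(top_n_recall(quakes, niche, 2))  # 0.5
print(top_n_recall(quakes, niche, 4))  # 0.75
```

Sweeping n from the largest events downwards, as the abstract describes, traces how quickly recall decays as smaller-magnitude events dilute the association with the niche locations.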

1. Earthquakes, September-October 1984

Person, W.J.

1985-01-01

In the United States, Wyoming experienced a couple of moderate earthquakes, and off the coast of northern California, a strong earthquake shook much of the northern coast of California and parts of the Oregon coast.

2. Earthquakes, July-August 1991

Person, W.J.

1992-01-01

There was one major earthquake during this reporting period-a magnitude 7.1 shock off the coast of Northern California on August 17. Earthquake-related deaths were reported from Indonesia, Romania, Peru, and Iraq.

3. Return to work for severely injured survivors of the Christchurch earthquake: influences in the first 2 years.

PubMed

Nunnerley, Joanne; Dunn, Jennifer; McPherson, Kathryn; Hooper, Gary; Woodfield, Tim

2016-01-01

This study looked at the influences on return to work (RTW) in the first 2 years for people severely injured in the 22 February 2011 Christchurch earthquake. We used a constructivist grounded theory approach, with semi-structured interviews to collect data from 14 people injured in the earthquake. Analysis elicited three themes that appeared to influence the process of RTW following the Christchurch earthquake: living the earthquake experience (the individual's experience of the earthquake and how their injury framed their expectations); rebuilding normality (the desire of the participants to return to life as it was); and dealing with the secondary effects of the earthquake (earthquake-specific effects that acted as both barriers to and facilitators of returning to work). The consequences of the earthquake impacted the experience, process and outcome of RTW for those injured in the Christchurch earthquake. Work and RTW appeared to be key tools for enhancing recovery after serious injury following the earthquake. The altered physical, social and economic environment must be considered when working on the RTW of individuals with earthquake injuries. Providing tangible emotional and social support so that injured earthquake survivors feel safe in their workplace may facilitate RTW. Engaging early with employers may assist the RTW of injured earthquake survivors.

4. A prospective earthquake forecast experiment for Japan

NASA Astrophysics Data System (ADS)

Yokoi, Sayoko; Nanjo, Kazuyoshi; Tsuruoka, Hiroshi; Hirata, Naoshi

2013-04-01

One major focus of the current Japanese earthquake prediction research program (2009-2013) is to move toward creating testable earthquake forecast models. For this purpose we started an experiment of forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their model performance. On 1 November 2009, we started the first earthquake forecast testing experiment for the Japan area. We use the unified catalogue compiled by the Japan Meteorological Agency (JMA) as the authorized catalogue. The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called All Japan, Mainland, and Kanto. A total of 91 models were submitted to CSEP-Japan and are evaluated with the CSEP official suite of tests of forecast performance. In this presentation, we show the results of the experiment for the 3-month testing class over 5 rounds. The HIST-ETAS7pa, MARFS and RI10K models, corresponding to the All Japan, Mainland and Kanto regions respectively, showed the best scores based on the total log-likelihood. It is also clarified that time dependency of model parameters is not an effective factor for passing the CSEP consistency tests for the 3-month testing class in any region. In particular, the spatial distribution in the All Japan region was too difficult to pass the consistency test because of multiple events in a single bin. The number of target events per round in the Mainland region tended to be smaller than the models' expectations during all rounds, which resulted in rejections in the consistency test because of overestimation. In the Kanto region, the pass ratio of consistency tests for each model was more than 80%, which was associated with well balanced forecasting of event
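The total log-likelihood score mentioned above compares observed event counts with a model's expected counts per space-time bin, assuming each bin is an independent Poisson variable. A minimal sketch (the four bin rates and counts are invented for illustration; the actual CSEP test suite adds further consistency checks):

```python
import math

def poisson_log_likelihood(forecast, observed):
    """Joint log-likelihood of observed bin counts under a gridded
    forecast of expected counts, assuming independent Poisson bins."""
    ll = 0.0
    for lam, n in zip(forecast, observed):
        # log of the Poisson pmf: -lam + n*log(lam) - log(n!)
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

# Invented 4-bin forecast (expected counts) and observed counts
forecast = [0.5, 2.0, 1.0, 0.1]
observed = [1, 2, 0, 0]
score = poisson_log_likelihood(forecast, observed)
print(round(score, 3))  # -3.6
```

A higher (less negative) joint log-likelihood means the observed catalogue is less surprising under the forecast, which is the basis for ranking competing models.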

5. Damaging earthquakes: A scientific laboratory

Hays, Walter W.; ,

1996-01-01

This paper reviews the principal lessons learned from multidisciplinary postearthquake investigations of damaging earthquakes throughout the world during the past 15 years. The unique laboratory provided by a damaging earthquake in culturally different but tectonically similar regions of the world has increased fundamental understanding of earthquake processes, added perishable scientific, technical, and socioeconomic data to the knowledge base, and led to changes in public policies and professional practices for earthquake loss reduction.

6. Assessing Explosives Safety Risks, Deviations, And Consequences

DTIC Science & Technology

2009-07-31

Technical Paper 23, 31 July 2009, DDESB: Assessing Explosives Safety Risks, Deviations, And Consequences ...and approaches to assist warfighters in executing their mission, conserving resources, and maximizing operational effectiveness. When mission risk

7. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

2009-01-01

Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
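At its core, PAGER's exposure estimate is a sum of gridded population over cells grouped by estimated shaking intensity. A toy sketch with invented numbers (the real system uses global population grids and ShakeMap intensity fields):

```python
from collections import defaultdict

# Invented gridded population and ShakeMap-style intensity (MMI) values;
# exposure per intensity level is the population sum over matching cells.
population = [1200, 50, 8000, 300, 40000, 900]  # people per grid cell
intensity = [4, 4, 6, 5, 7, 5]                  # MMI per grid cell

exposure = defaultdict(int)
for pop, mmi in zip(population, intensity):
    exposure[mmi] += pop

# People exposed to MMI 7 (very strong shaking) in this toy grid:
print(exposure[7])  # 40000
```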

8. Sensing the earthquake

NASA Astrophysics Data System (ADS)

Bichisao, Marta; Stallone, Angela

2017-04-01

Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

9. Earthquakes, September-October 1978

Person, W.J.

1979-01-01

The months of September and October were somewhat quiet seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

10. Earthquakes, March-April, 1993

Person, Waverly J.

1993-01-01

Worldwide, only one major earthquake (7.0-7.9) occurred during this reporting period: a magnitude 7.2 shock that struck the Santa Cruz Islands region in the South Pacific on March 6. Earthquake-related deaths occurred in the Fiji Islands, China, and Peru.

11. Earthquakes, September-October 1991

Person, W.J.

1992-01-01

There were two major earthquakes (7.0-7.9) during this reporting period. The first was in the Solomon Islands on October 14 and the second was in India on October 19. Earthquake-related deaths were reported in Guatemala and India. There were no significant earthquakes in the United States during the period covered in this report.

12. Earthquakes, September-October 1993

Person, W.J.

1993-01-01

The fatalities in the United States were caused by two earthquakes in southern Oregon on September 21. These earthquakes, both with magnitude 6.0 and separated in time by about 2 hrs, led to the deaths of two people. One of these deaths was apparently due to a heart attack induced by the earthquake.

13. Earthquakes, March-April 1991

Person, W.J.

1992-01-01

Two major earthquakes (7.0-7.9) occurred during this reporting period: a magnitude 7.6 in Costa Rica on April 22 and a magnitude 7.0 in the USSR on April 29. Destructive earthquakes hit northern Peru on April 4 and 5. There were no destructive earthquakes in the United States during this period.

14. Turkish Children's Ideas about Earthquakes

ERIC Educational Resources Information Center

Simsek, Canan Lacin

2007-01-01

Earthquakes, a type of natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training about earthquakes in primary schools is considered…

15. Earthquakes, May-June 1991

Person, W.J.

1992-01-01

In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage.

16. [Earthquakes in El Salvador].

PubMed

de Ville de Goyet, C

2001-02-01

The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events taking place in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons that have been learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualties management, communicable diseases, water supply, managing donations and international assistance, damages to the health-facilities infrastructure, mental health, and PAHO's role in disasters.

17. Statistical physics approach to earthquake occurrence and forecasting

NASA Astrophysics Data System (ADS)

de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

2016-04-01

There is striking evidence that the dynamics of the Earth's crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth's crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description of the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different
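The branching ("birth and immigration") picture described above also underlies the negative-binomial overdispersion of earthquake counts discussed in record 1: independent immigrants plus triggered cascades make event counts more variable than a plain Poisson process. A minimal simulation sketch (the rates mu and nu are invented placeholders; real ETAS-type models add magnitude, space and time kernels):

```python
import math
import random

random.seed(0)

def sample_poisson(lam):
    # Knuth's multiplication method; adequate for small means
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def window_count(mu, nu):
    """Events in one window: Poisson(mu) independent immigrants,
    each triggering a cascade with Poisson(nu) offspring per event
    (nu < 1 keeps the branching process subcritical)."""
    pending = sample_poisson(mu)  # immigrants not yet processed
    total = 0
    while pending:
        pending -= 1
        total += 1
        pending += sample_poisson(nu)  # this event's aftershocks
    return total

counts = [window_count(mu=5.0, nu=0.5) for _ in range(2000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# Clustering makes counts overdispersed: variance > mean,
# unlike a plain Poisson process where variance = mean.
print(mean, var)
```

With these parameters the expected count per window is mu/(1-nu) = 10, while the variance is several times larger, which is the clustering signature the NBD's second parameter captures.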

18. Geomorphic legacy of medieval Himalayan earthquakes in the Pokhara Valley

NASA Astrophysics Data System (ADS)

Schwanghart, Wolfgang; Bernhardt, Anne; Stolle, Amelie; Hoelzmann, Philipp; Adhikari, Basanta R.; Andermann, Christoff; Tofelde, Stefanie; Merchel, Silke; Rugel, Georg; Fort, Monique; Korup, Oliver

2016-04-01

The Himalayas and their foreland belong to the world's most earthquake-prone regions. With millions of people at risk from severe ground shaking and associated damages, reliable data on the spatial and temporal occurrence of past major earthquakes is urgently needed to inform seismic risk analysis. Beyond the instrumental record such information has been largely based on historical accounts and trench studies. Written records provide evidence for damages and fatalities, yet are difficult to interpret when derived from the far-field. Trench studies, in turn, offer information on rupture histories, lengths and displacements along faults but involve high chronological uncertainties and fail to record earthquakes that do not rupture the surface. Thus, additional and independent information is required for developing reliable earthquake histories. Here, we present exceptionally well-dated evidence of catastrophic valley infill in the Pokhara Valley, Nepal. Bayesian calibration of radiocarbon dates from peat beds, plant macrofossils, and humic silts in fine-grained tributary sediments yields a robust age distribution that matches the timing of nearby M>8 earthquakes in ~1100, 1255, and 1344 AD. The upstream dip of tributary valley fills and X-ray fluorescence spectrometry of their provenance rule out local sediment sources. Instead, geomorphic and sedimentary evidence is consistent with catastrophic fluvial aggradation and debris flows that had plugged several tributaries with tens of meters of calcareous sediment from the Annapurna Massif >60 km away. The landscape-changing consequences of past large Himalayan earthquakes have so far been elusive. Catastrophic aggradation in the wake of two historically documented medieval earthquakes and one inferred from trench studies underscores that Himalayan valley fills should be considered as potential archives of past earthquakes. Such valley fills are pervasive in the Lesser Himalaya though high erosion rates reduce

19. PAGER--Rapid assessment of an earthquake's impact

Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

2010-01-01

PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

20. What is the earthquake fracture energy?

NASA Astrophysics Data System (ADS)

Di Toro, G.; Nielsen, S. B.; Passelegue, F. X.; Spagnuolo, E.; Bistacchi, A.; Fondriest, M.; Murphy, S.; Aretusini, S.; Demurtas, M.

2016-12-01

The energy budget of an earthquake is one of the main open questions in earthquake physics. During seismic rupture propagation, the elastic strain energy stored in the rock volume that bounds the fault is converted into (1) gravitational work (relative movement of the wall rocks bounding the fault), (2) in- and off-fault damage of the fault zone rocks (due to rupture propagation and frictional sliding), (3) frictional heating and, of course, (4) seismic radiated energy. The difficulty in determining the budget arises from the measurement of some parameters (e.g., the temperature increase in the slipping zone, which constrains the frictional heat), from the poorly constrained size of the energy sinks (e.g., how large is the rock volume involved in off-fault damage?) and from the continuous exchange of energy between different sinks (for instance, fragmentation and grain size reduction may result from both the passage of the rupture front and frictional heating). Field geology studies, microstructural investigations, experiments and modelling may yield some hints. Here we discuss (1) the discrepancies arising from the comparison of the fracture energy measured in experiments reproducing seismic slip with that estimated from seismic inversion for natural earthquakes and (2) the off-fault damage induced by the diffusion of frictional heat during simulated seismic slip in the laboratory. Our analysis suggests, for instance, that the so-called earthquake fracture energy (1) is mainly frictional heat for small slips and (2), with increasing slip, is controlled by the geometrical complexity and other plastic processes occurring in the damage zone. As a consequence, because faults are rapidly and efficiently lubricated upon fast slip initiation, the dominant dissipation mechanism in large earthquakes may not be friction but the off-fault damage due to fault segmentation and stress concentrations in a growing region around the fracture tip.

1. PERFORMANCE OF AN EARTHQUAKE EXCITED ROOF DIAPHRAGM.

Celebi, M.; Brady, G.; Safak, E.; Converse, A.; ,

1986-01-01

The objective of this paper is to study the earthquake performance of the roof diaphragm of the West Valley College gymnasium in Saratoga, California through a complete set of acceleration records obtained during the 24 April 1984 Morgan Hill Earthquake (M = 6.1). The roof diaphragm of the 112 ft × 144 ft rectangular, symmetric gymnasium consists of 3/8 in. plywood over tongue-and-groove sheathing attached to steel trusses supported by reinforced concrete columns and walls. Three sensors placed in the direction of each of the axes of the diaphragm facilitate the evaluation of in-plane deformation of the diaphragm. Other sensors placed at ground level measure vertical and horizontal motion of the building floor, and consequently allow the calculation of the relative motion of the diaphragm with respect to the ground level.

2. The Global Earthquake Model - Past, Present, Future

NASA Astrophysics Data System (ADS)

Smolka, Anselm; Schneider, John; Stein, Ross

2014-05-01

The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange. Sharing data, risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through consortium-driven global projects, open-source IT development, and collaborations with more than 10 regions, leading experts are developing unique global datasets, best practices, open tools and models for seismic hazard and risk assessment. The year 2013 saw the completion of ten global data sets or components addressing various aspects of earthquake hazard and risk, as well as two GEM-related, but independently managed, regional projects, SHARE and EMME. Notably, the International Seismological Centre (ISC) led the development of a new ISC-GEM global instrumental earthquake catalogue, which was made publicly available in early 2013. It has set a new standard for global earthquake catalogues and has found widespread acceptance and application in the global earthquake community. By the end of 2014, GEM's OpenQuake computational platform will provide the OpenQuake hazard/risk assessment software and integrate all GEM data and information products. The public release of OpenQuake is planned for the end of 2014, and will comprise the following datasets and models: • ISC-GEM Instrumental Earthquake Catalogue (released January 2013) • Global Earthquake History Catalogue [1000-1903] • Global Geodetic Strain Rate Database and Model • Global Active Fault Database • Tectonic Regionalisation Model • Global Exposure Database • Buildings and Population Database • Earthquake Consequences Database • Physical Vulnerabilities Database • Socio-Economic Vulnerability and Resilience Indicators • Seismic

3. Simulation of rockfalls triggered by earthquakes

Kobayashi, Y.; Harp, E.L.; Kagawa, T.

1990-01-01

A computer program to simulate the downslope movement of boulders in rolling or bouncing modes has been developed and applied to actual rockfalls triggered by the Mammoth Lakes, California, earthquake sequence in 1980 and the Central Idaho earthquake in 1983. In order to reproduce a movement mode where bouncing predominated, we introduced an artificial unevenness to the slope surface by adding a small random number to the interpolated value of the mid-points between the adjacent surveyed points. Three hundred simulations were computed for each site by changing the random number series, which determined distances and bouncing intervals. The movement of the boulders was, in general, rather erratic depending on the random numbers employed, and the results should be regarded as stochastic rather than deterministic. The closest agreement between calculated and actual movements was obtained at the site with the most detailed and accurate topographic measurements. © 1990 Springer-Verlag.
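The stochastic character described above can be illustrated with a toy Monte Carlo run (all physical parameters here are invented placeholders, not the paper's model): each run perturbs the bounces with small random numbers, so 300 runs with different random series yield a spread of runout distances rather than a single answer.

```python
import random

def rockfall_runout(slope_len, seed):
    """One stochastic rockfall run: a bouncing boulder loses speed at
    each impact, while a small random 'unevenness' term perturbs every
    hop (all parameter values are arbitrary, for illustration only)."""
    rng = random.Random(seed)
    speed = 10.0   # launch speed, arbitrary units
    dist = 0.0     # distance travelled downslope
    while speed > 0.5 and dist < slope_len:
        hop = speed * 0.8 + rng.uniform(-0.5, 0.5)  # random unevenness
        dist += max(hop, 0.0)
        speed *= rng.uniform(0.6, 0.9)              # energy lost on impact
    return dist

# As in the paper, repeat the simulation with different random series:
runouts = [rockfall_runout(200.0, seed=i) for i in range(300)]
print(min(runouts), max(runouts))  # a spread, not a single answer
```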

4. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

NASA Astrophysics Data System (ADS)

Wyss, Max

2013-04-01

The standard probabilistic earthquake hazard and risk assessment has been argued to be incorrect for scientific reasons, and here I argue that it is also ineffective for psychological reasons. Instead of being calmed, or being given hazard underestimates for strongly active areas by the GSHAP approach, people should be told quantitatively the consequences of the reasonably worst case and be motivated to prepare for it, whether or not it may hit the present or the next generation. In a worst case scenario for L'Aquila, the number of expected fatalities and injured should have been calculated for an event in the range of M6.5 to M7, as I did for a civil defense exercise in Umbria, Italy. With the prospect that approximately 500 people may die in an earthquake in the immediate or distant future, some residents might have built themselves an earthquake closet (similar to a simple tornado shelter) in a corner of their apartment, into which they might have dashed to safety at the onset of the P-wave before the destructive S-wave arrived. I conclude that in earthquake-prone areas quantitative loss estimates for a reasonable worst case earthquake should replace probabilistic hazard and risk estimates. This is a service which experts owe the community. Insurance companies and academics may still find use for probabilistic estimates of losses, especially in areas of low seismic hazard, where the worst case scenario approach is less appropriate.

5. Theatre as Therapy, Therapy as Theatre Transforming the Memories and Trauma of the 21 September 1999 Earthquake in Taiwan

ERIC Educational Resources Information Center

Chang, Ivy I-Chu

2005-01-01

On 21 September 1999, a 7.3 magnitude earthquake in Taiwan destroyed more than 100,000 houses, causing 2,294 deaths and 8,737 injuries. In the aftermath of the earthquake, a great number of social workers and cultural workers were thrust into Nantou County and Taichung County of central Taiwan, the epicentre of the earthquake, to assist the…

6. Analysis of Landslides Triggered by October 2005, Kashmir Earthquake

PubMed Central

Mahmood, Irfan; Qureshi, Shahid Nadeem; Tariq, Shahina; Atique, Luqman; Iqbal, Muhammad Farooq

2015-01-01

Introduction: The October 2005 Kashmir earthquake main event was triggered along the Balakot-Bagh Fault, which runs from Bagh to Balakot, and caused extensive damage in and around these areas. Major landslides were activated during and after the earthquake, inflicting great damage in the area, both in terms of infrastructure and casualties. These landslides were mainly attributed to the minimum threshold of the earthquake, the geology of the area, climatologic and geomorphologic conditions, mudflows, widening of the roads without stability assessment, and heavy rainfall after the earthquake. These landslides were mainly rock and debris falls. The Hattian Bala rock avalanche was the largest landslide associated with the earthquake; it completely destroyed a village and blocked the valley, creating a lake. Discussion: The present study shows that the fault rupture and fault geometry have a direct influence on the distribution of landslides and that a high-frequency band of landslides was triggered along the rupture zone. There was an increase in the number of landslides due to the 2005 earthquake and its aftershocks, and most of the landslides occurred along faults, rivers and roads. It is observed that the stability of a landslide mass is greatly influenced by the amplitude, frequency and duration of earthquake-induced ground motion. Most of the slope failures along the roads resulted from the alteration of these slopes during widening of the roads, and seepage during the rainy season immediately after the earthquake. Conclusion: Landslides occurred mostly along weakly cemented and indurated rocks, colluvial sand and cemented soils. It is also worth noting that fissures and ground cracks induced by the main shock and aftershocks are still present, and they pose a major potential threat for future landslides in the event of another earthquake or under extreme weather conditions. PMID:26366324

7. Analysis of Landslides Triggered by October 2005, Kashmir Earthquake.

PubMed

Mahmood, Irfan; Qureshi, Shahid Nadeem; Tariq, Shahina; Atique, Luqman; Iqbal, Muhammad Farooq

2015-08-26

The October 2005 Kashmir earthquake main event was triggered along the Balakot-Bagh Fault, which runs from Bagh to Balakot, and caused extensive damage in and around these areas. Major landslides were activated during and after the earthquake, inflicting great damage in the area, both in terms of infrastructure and casualties. These landslides were mainly attributed to the minimum threshold of the earthquake, the geology of the area, climatologic and geomorphologic conditions, mudflows, widening of the roads without stability assessment, and heavy rainfall after the earthquake. These landslides were mainly rock and debris falls. The Hattian Bala rock avalanche was the largest landslide associated with the earthquake; it completely destroyed a village and blocked the valley, creating a lake. The present study shows that the fault rupture and fault geometry have a direct influence on the distribution of landslides and that a high-frequency band of landslides was triggered along the rupture zone. There was an increase in the number of landslides due to the 2005 earthquake and its aftershocks, and most of the landslides occurred along faults, rivers and roads. It is observed that the stability of a landslide mass is greatly influenced by the amplitude, frequency and duration of earthquake-induced ground motion. Most of the slope failures along the roads resulted from the alteration of these slopes during widening of the roads, and seepage during the rainy season immediately after the earthquake. Landslides occurred mostly along weakly cemented and indurated rocks, colluvial sand and cemented soils. It is also worth noting that fissures and ground cracks induced by the main shock and aftershocks are still present, and they pose a major potential threat for future landslides in the event of another earthquake or under extreme weather conditions.

8. Earthquake Early Warning: User Education and Designing Effective Messages

NASA Astrophysics Data System (ADS)

Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

2014-12-01

The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

9. Earthquakes; March-April 1975

Person, W.J.

1975-01-01

There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela, and approximately 35 earthquake-related injuries were reported around the world. In the United States, a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971.

10. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

NASA Astrophysics Data System (ADS)

Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

2004-12-01

The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
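The classroom comparison above rests on a basic property: for events scattered randomly in time (a Poisson process), interevent times follow an exponential distribution. A quick numerical sketch (the window length and event count are arbitrary choices for illustration):

```python
import random

random.seed(42)

# A stationary random-in-time ("Poisson") process: event times uniform
# over the window. Interevent times should then be exponential, whose
# signature is a standard deviation roughly equal to the mean.
window, n_events = 1000.0, 2000
times = sorted(random.uniform(0.0, window) for _ in range(n_events))
gaps = [b - a for a, b in zip(times, times[1:])]

mean = sum(gaps) / len(gaps)
std = (sum((g - mean) ** 2 for g in gaps) / len(gaps)) ** 0.5
print(round(std / mean, 2))  # ratio near 1 for an exponential law
```

Clustered (non-Poissonian) sequences, by contrast, show many short gaps and a few very long ones, pushing this ratio well above 1.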

11. Earthquake Prediction is Coming

ERIC Educational Resources Information Center

MOSAIC, 1977

1977-01-01

Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

12. Earthquake damage to schools

McCullough, Heather

1994-01-01

These unusual slides show earthquake damage to school and university buildings around the world. They graphically illustrate the potential danger to our schools, and to the welfare of our children, that results from major earthquakes. The slides range from Algeria, where a collapsed school roof is held up only by students' desks; to Anchorage, Alaska, where an elementary school structure has split in half; to California and other areas, where school buildings have sustained damage to walls, roofs, and chimneys. Interestingly, all the United States earthquakes depicted in this set of slides occurred either on a holiday or before or after school hours, except the 1935 tremor in Helena, Montana, which occurred at 11:35 am. It undoubtedly would have caused casualties had the schools not been closed days earlier by Helena city officials because of a damaging foreshock. Students in Algeria, the People's Republic of China, Armenia, and other stricken countries were not so fortunate. This set of slides represents 17 destructive earthquakes that occurred in 9 countries, and covers more than a century--from 1886 to 1988. Two of the tremors, both of which occurred in the United States, were magnitude 8+ on the Richter Scale, and four were magnitude 7-7.9. The events represented by the slides (see table below) claimed more than a quarter of a million lives.

13. Road Damage Following Earthquake

NASA Technical Reports Server (NTRS)

1989-01-01

Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

14. Baja Earthquake Perspective View

2010-04-05

The topography surrounding the Laguna Salada Fault in the Mexican state of Baja California is shown in this combined radar image and topographic view, with data from NASA's Shuttle Radar Topography Mission, where a magnitude 7.2 earthquake struck on April 4, 2010.

15. The EM Earthquake Precursor

NASA Astrophysics Data System (ADS)

Jones, K. B., II; Saxton, P. T.

2013-12-01

Many attempts have been made to determine a sound forecasting method regarding earthquakes and, in turn, warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February 2013 detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

16. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

NASA Astrophysics Data System (ADS)

McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

2017-12-01

The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record through a wide range of processes that depend on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify the sediments they remobilized. In the 2011 Mw9 Japan earthquake, they document the spatial extent of remobilized sediment from water depths of 626 m on the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry, and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs, and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and in the trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents, and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics. Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

17. WGCEP Historical California Earthquake Catalog

Felzer, Karen R.; Cao, Tianqing

2008-01-01

This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing for all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Magnitudes for some pre-1932 earthquakes are based on instrumental readings made before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932, are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1997) have been used to help determine magnitudes.

18. Emergency surgical care delivery in post-earthquake Haiti: Partners in Health and Zanmi Lasante experience.

PubMed

McIntyre, Thomas; Hughes, Christopher D; Pauyo, Thierry; Sullivan, Stephen R; Rogers, Selwyn O; Raymonville, Maxi; Meara, John G

2011-04-01

The earthquake that struck Haiti on 12 January 2010 caused significant devastation to both the country and the existing healthcare infrastructure in both urban and rural areas. Most hospital and health care facilities in Port-au-Prince and the surrounding areas were significantly damaged or destroyed. Consequently, large groups of Haitians fled Port-au-Prince for rural areas to seek emergency medical and surgical care. In partnership with the Haitian Ministry of Health, Partners in Health (PIH) and Zanmi Lasante (ZL) have developed and maintained a network of regional and district hospitals in rural Haiti for over twenty-five years. This PIH/ZL system was ideally situated to accommodate the increased need for emergent surgical care in the immediate quake aftermath. The goal of the present study was to provide a cross-sectional assessment of surgical need and care delivery across PIH/ZL facilities after the earthquake in Haiti. We conducted a retrospective review of hospital case logs and operative records over the course of three weeks immediately following the earthquake. Roughly 3,000 patients were seen at PIH/ZL sites by a combination of Haitian and international surgical teams. During that period, 513 emergency surgical cases were logged. Other than wound debridement, the most commonly performed procedure was fixation of long bone fractures, which constituted approximately one third of all surgical procedures. There was a significant demand for emergent surgical care after the earthquake in Haiti. The PIH/ZL hospital system played a critical role in addressing this acutely increased burden of surgical disease, and it allowed large numbers of Haitians to receive needed surgical services. Our experiences reinforce that access to essential surgical care is a critical pillar of public health.

19. The HayWired Earthquake Scenario—Earthquake Hazards

Detweiler, Shane T.; Wein, Anne M.

2017-04-24

The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines earthquake hazards to help provide the crucial scientific information that the region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

20. Earthquake Loss Scenarios: Warnings about the Extent of Disasters

NASA Astrophysics Data System (ADS)

Wyss, M.; Tolis, S.; Rosset, P.

2016-12-01

It is imperative that losses expected due to future earthquakes be estimated. Officials and the public need to be aware of what disaster is likely in store for them in order to reduce fatalities and efficiently help the injured. Scenarios for earthquake parameters can be constructed to reasonable accuracy in highly active earthquake belts, based on knowledge of seismotectonics and history. Because of the inherent uncertainties of loss estimates, however, it would be desirable that more than one group calculate an estimate for the same area. By discussing these estimates, one may find a consensus on the range of potential disasters and persuade officials and residents of the reality of the earthquake threat. Modeling a scenario and estimating earthquake losses requires sufficiently accurate data sets on the number of people present, the built environment, and, if possible, the transmission of seismic waves. As examples, we use loss estimates for possible repeats of historic earthquakes in Greece that occurred between -464 and 700. We model future large Greek earthquakes as having M6.8 and rupture lengths of 60 km. In four locations where historic earthquakes with serious losses have occurred, we estimate that 1,000 to 1,500 people might perish, with roughly four times as many injured. Defining the area of influence of these earthquakes as that with shaking intensities greater than or equal to V, we estimate that 1.0 to 2.2 million people in about 2,000 settlements may be affected. We calibrate the QLARM tool for calculating intensities and losses in Greece using the M6 1999 Athens earthquake and matching the isoseismal information for six earthquakes that occurred in Greece during the last 140 years. Comparing fatality numbers that would occur theoretically today with the numbers reported, and correcting for the increase in population, we estimate that the improvement of the building stock has reduced the mortality and injury rate in Greek

1. A global outer-rise/outer-trench-slope (OR/OTS) earthquake study

NASA Astrophysics Data System (ADS)

Wartman, J. M.; Kita, S.; Kirby, S. H.; Choy, G. L.

2009-12-01

Using improved seismic, bathymetric, satellite gravity and other geophysical data, we investigated the seismicity patterns and focal mechanisms of earthquakes in oceanic lithosphere off the trenches of the world that are large enough to be well recorded at teleseismic distances. A number of prominent trends are apparent, some of which have been previously recognized based on more limited data [1], and some of which are largely new [2-5]: (1) The largest events and the highest seismicity rates tend to occur where Mesozoic incoming plates are subducting at high rates (e.g., those in the western Pacific and the Banda segment of Indonesia). The largest events are predominantly shallow normal faulting (SNF) earthquakes. Less common are reverse-faulting (RF) events that tend to be deeper and to be present along with SNF events where nearby seamounts, seamount chains and other volcanic features are subducting [Seno and Yamanaka, 1996]. Blooms of SNF OR/OTS events usually occur just after and seaward of great interplate thrust (IPT) earthquakes but are far less common after smaller IPT events. (2) Plates subducting at slow rates (<20 mm/a) often show sparse OR/OTS seismicity. It is unclear if such low activity is a long-term feature of these systems or is a consequence of the long return times of great IPT earthquakes (e.g., the sparse OR/OTS seismicity before the 26 December 2004 M9.2 Sumatra earthquake and many subsequent OR/OTS events). (3) OR/OTS shocks are generally sparse or absent where incoming plates are very young (<20 Ma) (e.g., Cascadia, southern Mexico, Nankai, and South Shetlands). (4) Subducting plates of intermediate age (20 to about 65 Ma) display a diversity of focal mechanisms and seismicity patterns. In the Philippines, NE Indonesia, and Melanesia, bands of reverse faulting events occur at or near the trench and SNF earthquakes are restricted to OR/OTS sites further from the trench. (5) Clustering of OR/OTS events of all types commonly occurs where

2. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

2013-01-01

The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

3. Mass wasting triggered by the 5 March 1987 Ecuador earthquakes

Schuster, R.L.; Nieto, A.S.; O'Rourke, T. D.; Crespo, E.; Plaza-Nieto, G.

1996-01-01

On 5 March 1987, two earthquakes (Ms=6.1 and Ms=6.9) occurred about 25 km north of Reventador Volcano, along the eastern slopes of the Andes Mountains in northeastern Ecuador. Although the shaking damaged structures in towns and villages near the epicentral area, the economic and social losses directly due to earthquake shaking were small compared to the effects of catastrophic earthquake-triggered mass wasting and flooding. About 600 mm of rain fell in the region in the month preceding the earthquakes; thus, the surficial soils had high moisture contents. Slope failures commonly started as thin slides, which rapidly turned into fluid debris avalanches and debris flows. The surficial soils and thick vegetation covering them flowed down the slopes into minor tributaries and then were carried into major rivers. Rock and earth slides, debris avalanches, debris and mud flows, and resulting floods destroyed about 40 km of the Trans-Ecuadorian oil pipeline and the only highway from Quito to Ecuador's northeastern rain forests and oil fields. Estimates of total volume of earthquake-induced mass wastage ranged from 75-110 million m3. Economic losses were about US$1 billion. Nearly all of the approximately 1000 deaths from the earthquakes were a consequence of mass wasting and/or flooding.

4. On the neural modeling of some dynamic parameters of earthquakes and fire safety in high-rise construction

NASA Astrophysics Data System (ADS)

Haritonova, Larisa

2018-03-01

The recent change in the ratio of man-made to natural catastrophes is presented in the paper. Recommendations are proposed to increase firefighting efficiency in high-rise buildings. The article analyzes the methodology of modeling seismic effects. It demonstrates the promise of applying neural modeling and artificial neural networks to the analysis of such dynamic parameters of earthquake foci as the dislocation value (or the average rupture slip). The following two input signals were used: the power class and the number of earthquakes. A regression analysis was carried out for the predicted results and the target outputs. The regression equations for the outputs and targets are presented, along with the correlation coefficients for training, validation, testing, and the total (All) for the 2-5-5-1 network structure used for the average rupture slip. Applying the results obtained in the article to the seismic design of newly constructed buildings and structures, together with the given recommendations, will provide additional protection from fire and earthquake risks and reduce their negative economic and environmental consequences.
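As a sketch of the architecture described (a 2-5-5-1 feed-forward network with the power class and earthquake count as inputs and average rupture slip as output), the forward pass can be written as follows. The weights, activation choices, and input values are placeholders, not the trained network from the paper:

```python
import math
import random

random.seed(0)

# Forward pass of a 2-5-5-1 feed-forward network: two inputs (power
# class and number of earthquakes), two hidden layers of five tanh
# units, and one linear output (average rupture slip). Weights are
# random placeholders; the paper obtains them by training, and real
# inputs would normally be normalized first.
sizes = [2, 5, 5, 1]
layers = [
    [[random.uniform(-1, 1) for _ in range(n_in + 1)]  # +1 for the bias
     for _ in range(n_out)]
    for n_in, n_out in zip(sizes, sizes[1:])
]

def layer_forward(x, weights, linear=False):
    out = []
    for w in weights:
        s = w[-1] + sum(wi * xi for wi, xi in zip(w, x))  # w[-1] is the bias
        out.append(s if linear else math.tanh(s))
    return out

def forward(x):
    for i, w in enumerate(layers):
        x = layer_forward(x, w, linear=(i == len(layers) - 1))
    return x[0]

# Hypothetical inputs: power class 12, 35 earthquakes.
slip = forward([12.0, 35.0])
print(slip)
```

Training such a network against target slips, and then regressing predictions on targets, would yield the correlation coefficients the abstract reports.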

5. Earthquake forecasting studies using radon time series data in Taiwan

NASA Astrophysics Data System (ADS)

Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

2017-04-01

For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous radon time series for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtering of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP), creating a website that can effectively display and help us manage the real-time database.
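The study's pipeline runs in R; as an analogous sketch in Python, one elementary precursory-signal screen is to flag radon readings that deviate by more than two standard deviations from a trailing background window (the window length, threshold, and synthetic data are assumptions, not the study's actual method):

```python
# Flag readings deviating more than k standard deviations from a
# trailing background window of the preceding `window` samples.
def flag_anomalies(series, window=24, k=2.0):
    flags = []
    for i in range(window, len(series)):
        bg = series[i - window:i]
        mean = sum(bg) / window
        std = (sum((x - mean) ** 2 for x in bg) / window) ** 0.5
        if std > 0 and abs(series[i] - mean) > k * std:
            flags.append(i)
    return flags

# Synthetic hourly radon counts with mild periodic variation and one
# injected spike at hour 40.
data = [100.0 + (i % 5) for i in range(48)]
data[40] = 160.0
print(flag_anomalies(data))
```

In practice the filtering step the abstract mentions would first remove environmental effects (rainfall, barometric pressure, temperature) before any such thresholding.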

6. Compiling an earthquake catalogue for the Arabian Plate, Western Asia

NASA Astrophysics Data System (ADS)

Deif, Ahmed; Al-Shijbi, Yousuf; El-Hussain, Issa; Ezzelarab, Mohamed; Mohamed, Adel M. E.

2017-10-01

The Arabian Plate is surrounded by regions of relatively high seismicity. Accounting for this seismicity is of great importance for seismic hazard and risk assessments, seismic zoning, and land use. In this study, a homogenous earthquake catalogue of moment magnitude (Mw) for the Arabian Plate is provided. The comprehensive and homogenous earthquake catalogue provided in the current study spatially involves the entire Arabian Peninsula and neighboring areas, covering all earthquake sources that can generate substantial hazard for the Arabian Plate mainland. The catalogue extends in time from 19 to 2015 with a total number of 13,156 events, of which 497 are historical events. Four polygons covering the entire Arabian Plate were delineated, and different data sources, including special studies and local, regional, and international catalogues, were used to prepare the earthquake catalogue. Moment magnitudes (Mw) provided by original sources were given the highest priority among magnitude types and were introduced into the catalogues with their references. Earthquakes with magnitudes other than Mw were converted to this scale by applying empirical relationships derived in the current or in previous studies. The four polygon catalogues were combined into two comprehensive earthquake catalogues constituting the historical and instrumental periods. Duplicate events were identified and discarded from the current catalogue. The present earthquake catalogue was declustered so as to contain only independent events, and its completeness with time over different magnitude spans was investigated.
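The homogenization step can be illustrated with the widely used global Ms- and mb-to-Mw relations of Scordilis (2006); the catalogue described above derives its own empirical relationships, so these coefficients are stand-ins for illustration only:

```python
# Illustrative magnitude homogenization using global relations of
# Scordilis (2006) for surface-wave (Ms) and body-wave (mb) magnitudes.
# The study uses its own region-specific relationships; these
# coefficients are stand-ins.
def to_mw(mag, scale):
    if scale == "Ms":
        if 3.0 <= mag <= 6.1:
            return 0.67 * mag + 2.07
        if 6.1 < mag <= 8.2:
            return 0.99 * mag + 0.08
    elif scale == "mb":
        if 3.5 <= mag <= 6.2:
            return 0.85 * mag + 1.03
    raise ValueError(f"no conversion defined for {scale} = {mag}")

print(round(to_mw(5.0, "Ms"), 2))
print(round(to_mw(5.0, "mb"), 2))
```

Keeping native Mw values where available and converting only the remainder, as the abstract describes, avoids stacking conversion uncertainty on events that already have a direct moment estimate.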

7. U.S. Geological Survey (USGS) Earthquake Web Applications

NASA Astrophysics Data System (ADS)

Fee, J.; Martinez, E.

2015-12-01

USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real-time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month, including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real-time feeds provide access to all contributed data and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, from either near-real-time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data are accessed through the web service, they can also be downloaded by users. The applications are maintained as open-source projects on GitHub and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
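The web service mentioned above is the public fdsnws event service; a query URL for it can be assembled as follows (the date range and magnitude threshold are arbitrary examples):

```python
from urllib.parse import urlencode

# Assemble a query against the public ANSS/USGS fdsnws event service;
# the date range and magnitude threshold here are arbitrary examples.
base = "https://earthquake.usgs.gov/fdsnws/event/1/query"
params = {
    "format": "geojson",
    "starttime": "2015-01-01",
    "endtime": "2015-01-31",
    "minmagnitude": 5,
}
url = base + "?" + urlencode(params)
print(url)
```

Fetching that URL returns a GeoJSON feature collection of matching events, the same data the Latest Earthquakes map consumes.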

8. Determining on-fault earthquake magnitude distributions from integer programming

NASA Astrophysics Data System (ADS)

Geist, Eric L.; Parsons, Tom

2018-02-01

Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if it is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip ≥3 mm/yr are considered, resulting in >10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.
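A toy brute-force version of this optimization illustrates the structure of the problem: choose a binary fault assignment for each synthetic earthquake so the implied slip rates best match target rates. All numbers here are invented for illustration; the study uses a mixed-integer solver because its real decision vector has over a million binary variables:

```python
from itertools import product

# Toy instance: four synthetic earthquakes, two faults.
slips = [1.0, 1.0, 0.5, 0.5]   # per-event slip (m), scaled from magnitude
targets = [0.5, 0.25]          # target slip rates for faults 0 and 1
T = 4.0                        # synthetic catalog duration (arbitrary units)

# Enumerate every binary assignment (one fault index per earthquake) and
# keep the one minimizing the total absolute slip-rate misfit.
best = None
for assign in product(range(2), repeat=len(slips)):
    rates = [0.0, 0.0]
    for s, f in zip(slips, assign):
        rates[f] += s / T
    misfit = sum(abs(r - t) for r, t in zip(rates, targets))
    if best is None or misfit < best[0]:
        best = (misfit, assign)

print(best)
```

The study's bounding constraints on slip-rate uncertainty and the rupture-length feasibility rule would enter as constraints on which assignments are admissible, rather than the unconstrained enumeration shown here.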

9. Determining on-fault earthquake magnitude distributions from integer programming

Geist, Eric L.; Parsons, Thomas E.

2018-01-01

Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if it is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip ≥3 mm/yr are considered, resulting in >10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.

10. Weather Satellite Thermal IR Responses Prior to Earthquakes

NASA Technical Reports Server (NTRS)

OConnor, Daniel P.

2005-01-01

A number of observers claim to have seen thermal anomalies prior to earthquakes, but subsequent analysis by others has failed to produce similar findings. What exactly are these anomalies? Might they be useful for earthquake prediction? It is the purpose of this study to determine if thermal anomalies can be found in association with known earthquakes by systematically co-registering weather satellite images at the sub-pixel level and then determining if statistically significant responses occurred prior to the earthquake event. A new set of automatic co-registration procedures was developed for this task to accommodate all properties particular to weather satellite observations taken at night, and it relies on the general condition that the ground cools after sunset. Using these procedures, we can produce a set of temperature-sensitive satellite images for each of five selected earthquakes (Algeria 2003; Bhuj, India 2001; Izmit, Turkey 2001; Kunlun Shan, Tibet 2001; Turkmenistan 2000) and thus more effectively investigate heating trends close to the epicenters a few hours prior to the earthquake events. This study will lay tracks for further work in earthquake prediction and provoke the question of the exact nature of the thermal anomalies.

11. Historical earthquake research in Austria

NASA Astrophysics Data System (ADS)

Hammerl, Christa

2017-12-01

Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the past in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

12. Limits on great earthquake size at subduction zones

NASA Astrophysics Data System (ADS)

McCaffrey, R.

2012-12-01

Subduction zones are where the world's greatest earthquakes occur, due to the large fault area available for slip. Yet some subduction zones are thought to be immune from these massive events, with quake size limited by some physical process or property. Accordingly, the size of the 2011 Tohoku-oki Mw 9.0 earthquake caught some in the earthquake research community by surprise. Expectations of these massive quakes have in the past been driven by reliance on our short, incomplete history of earthquakes and causal relationships derived from it. The logic applied is that if a great earthquake has not happened in the past, that we know of, one cannot happen in the future. Using the ~100-year global seismological history, extended in some cases with geologic observations, relationships between maximum earthquake size and other properties of subduction zones have been suggested, leading to the notion that some subduction zones, like the Japan Trench, would never produce a magnitude ~9 event. Empirical correlations of earthquake behavior with other subduction parameters can give false-positive results when the data are incomplete or incorrect, when sample sizes are small, and when numerous attributes are examined. Given the multi-century return times of the greatest earthquakes, our ignorance of those return times, and our relatively limited temporal observation span (in most places), I suggest that we cannot yet rule out great earthquakes at any subduction zone. Alternatively, using the length of a subduction zone that is available for slip as the predominant factor in determining maximum earthquake size, we cannot rule out that any subduction zone a few hundred kilometers or more in length may be capable of producing a magnitude 9 or larger earthquake. Based on this method, the expected maximum size for the Japan Trench was 9.0 (McCaffrey, Geology, p. 263, 2008). The same approach indicates that a M > 9 off Java, with twice the population density of Honshu and much lower

13. Has El Salvador Fault Zone produced M ≥ 7.0 earthquakes? The 1719 El Salvador earthquake

NASA Astrophysics Data System (ADS)

Canora, C.; Martínez-Díaz, J.; Álvarez-Gómez, J.; Villamor, P.; Ínsua-Arévalo, J.; Alonso-Henar, J.; Capote, R.

2013-05-01

Historically, large earthquakes, Mw ≥ 7.0, in the El Salvador area have been attributed to activity in the Cocos-Caribbean subduction zone. This is correct for most of the earthquakes of magnitude greater than 6.5. However, recent paleoseismic evidence points to the existence of large earthquakes associated with rupture of the El Salvador Fault Zone, an E-W oriented strike-slip fault system that extends for 150 km through central El Salvador. To calibrate our results from paleoseismic studies, we have analyzed the historical seismicity of the area. In particular, we suggest that the 1719 earthquake can be associated with paleoseismic activity evidenced in the El Salvador Fault Zone. A reinterpreted isoseismal map for this event suggests that the reported damage could have been a consequence of rupture of the El Salvador Fault Zone, rather than rupture of the subduction zone. The isoseismal pattern is not different from those of other upper-crustal earthquakes in similar tectonovolcanic environments. We thus challenge the traditional assumption that only the subduction zone is capable of generating earthquakes of magnitude greater than 7.0 in this region. This result has broad implications for future risk management in the region. The potential occurrence of strong ground motion, significantly higher and closer to Salvadorian populations than assumed to date, must be considered in seismic hazard assessment studies in this area.

14. Genomic mutation consequence calculator.

PubMed

Major, John E

2007-11-15

The genomic mutation consequence calculator (GMCC) is a tool that reliably and quickly calculates the consequence of arbitrary genomic mutations. GMCC also reports supporting annotations for the specified genomic region. The particular strength of the GMCC is that it works in genomic space, not simply in spliced-transcript space as some similar tools do. Within gene features, GMCC can report the effects on splice sites, UTRs and coding regions in all isoforms affected by the mutation. A considerable number of genomic annotations are also reported, including genomic conservation score, known SNPs, COSMIC mutations, disease associations and others. The manual interface also offers link-outs to various external databases and resources. In batch mode, GMCC returns a CSV file which can easily be parsed by the end user. GMCC is intended to support the many tumor resequencing efforts, but can be useful to any study investigating genomic mutations.

15. Earthquake likelihood model testing

Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

2007-01-01

INTRODUCTION The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
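The likelihood method described here scores a binned forecast against observed counts. A minimal sketch, assuming independent Poisson counts per bin (the standard assumption for such tests); the rates and counts below are illustrative, not from any real RELM forecast:

```python
import math

def log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed bin counts under independent
    Poisson distributions with the forecast rates as means."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        # log Poisson pmf: n*log(lam) - lam - log(n!)
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll

rates = [0.5, 1.2, 0.1, 2.0]   # forecast earthquake rates per bin
counts = [1, 1, 0, 3]          # earthquakes actually observed
print(round(log_likelihood(rates, counts), 3))
```

Two competing models can then be compared by differencing their joint log-likelihoods over the same observed catalog, which is the essence of the pairwise relative-consistency test.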

16. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

NASA Astrophysics Data System (ADS)

Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

2012-12-01

Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, called CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area of Japan including the sea area, the Japanese mainland, and the Kanto district. We evaluate the performance of the models with the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been implemented in total. These results provide new knowledge concerning statistical forecasting models. We have started a study on constructing a 3-dimensional earthquake forecasting model for the Kanto district in Japan, based on CSEP experiments, under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity in the area ranges from shallow depths down to 80 km due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop forecasting models based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0, as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HISTETAS models (Ogata, 2011) to test whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). We use CSEP

17. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes

PubMed Central

Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

2016-01-01

Background A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities’ preparedness and response capabilities and to mitigate future consequences. Methods An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model’s algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. Results The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. Conclusion The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to an occurrence of an earthquake could lead to a possible decrease in the expected number of casualties. PMID:26959647
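The study folds human-related risk factors into the loss model via logistic regression equations. A hedged sketch of that idea: combine a baseline casualty probability with factor effect measures on the log-odds scale. The baseline probability and odds ratios below are invented for illustration, not the study's values:

```python
import math

def adjusted_probability(p_baseline, odds_ratios):
    """Combine a baseline casualty probability with human-related risk
    factors expressed as odds ratios, working on the log-odds scale."""
    logit = math.log(p_baseline / (1 - p_baseline))
    for oratio in odds_ratios:
        logit += math.log(oratio)       # multiply odds by each factor
    return 1 / (1 + math.exp(-logit))

# hypothetical odds ratios, e.g. older age and low socioeconomic status
print(round(adjusted_probability(0.02, [1.8, 1.3]), 4))
```

Working on the log-odds scale keeps the adjusted value a valid probability regardless of how many factors are stacked, which is why logistic equations are a natural fit for this kind of integration.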

18. The HayWired Earthquake Scenario

Detweiler, Shane T.; Wein, Anne M.

2017-04-24

interconnectedness of infrastructure, society, and our economy. How would this earthquake scenario, striking close to Silicon Valley, impact our interconnected world in ways and at a scale we have not experienced in any previous domestic earthquake? The area of present-day Contra Costa, Alameda, and Santa Clara Counties contended with a magnitude-6.8 earthquake in 1868 on the Hayward Fault. Although the area was sparsely populated then, about 30 people were killed and extensive property damage resulted. The question of what an earthquake like that would do today has been examined before and is now revisited in the HayWired scenario. Scientists have documented a series of prehistoric earthquakes on the Hayward Fault and are confident that the threat of a future earthquake, like that modeled in the HayWired scenario, is real and could happen at any time. The team assembled to build this scenario has brought innovative new approaches to examining the natural hazards, impacts, and consequences of such an event. Such an earthquake would also be accompanied by widespread liquefaction and landslides, which are treated in greater detail than ever before. The team also considers how the now-prototype ShakeAlert earthquake early warning system could provide useful public alerts and automatic actions. Scientific Investigations Report 2017–5013 and accompanying data releases are the products of an effort led by the USGS, but this body of work was created through the combined efforts of a large team including partners who have come together to form the HayWired Coalition (see chapter A). Use of the HayWired scenario has already begun. More than a full year of intensive partner engagement, beginning in April 2017, is being directed toward producing the most in-depth look ever at the impacts and consequences of a large earthquake on the Hayward Fault.
With the HayWired scenario, our hope is to encourage and support the active ongoing engagement of the entire community of the San Francisco Bay region by

19. Geophysical Anomalies and Earthquake Prediction

NASA Astrophysics Data System (ADS)

Jackson, D. D.

2008-12-01

Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require
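The "power law size distributions" referred to here are conventionally written as the Gutenberg-Richter relation, log10 N(≥m) = a - b·m. A small sketch with illustrative a and b values (not tied to any particular region):

```python
def gr_rate(m, a=4.0, b=1.0):
    """Annual number of earthquakes with magnitude >= m under the
    Gutenberg-Richter relation log10 N = a - b*m (illustrative a, b)."""
    return 10 ** (a - b * m)

# with b = 1, each unit increase in magnitude cuts the count tenfold,
# which is why tiny events are frequent almost everywhere
print(gr_rate(5.0), gr_rate(6.0))
```

This tenfold scaling is what makes nucleation-based prediction so hard: small, unobservable ruptures are ubiquitous, and only occasionally does one grow into an important earthquake.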

20. Earthquake technology fights crime

Lahr, John C.; Ward, Peter L.; Stauffer, Peter H.; Hendley, James W.

1996-01-01

Scientists with the U.S. Geological Survey have adapted their methods for quickly finding the exact source of an earthquake to the problem of locating gunshots. On the basis of this work, a private company is now testing an automated gunshot-locating system in a San Francisco Bay area community. This system allows police to rapidly pinpoint and respond to illegal gunfire, helping to reduce crime in our neighborhoods.

1. The Earthquake Early Warning System in Japan (Invited)

NASA Astrophysics Data System (ADS)

Mori, J. J.; Yamada, M.

2010-12-01

In Japan, the earthquake early warning system (Kinkyu Jishin Sokuhou in Japanese) maintained by the Japan Meteorological Agency (JMA) has been in operation and sending public information since October 1, 2007. Messages have been broadcast on television and radio to warn the public of strong shaking. The threshold for broadcasting a message is an estimated intensity of JMA 5-lower, which is approximately equivalent to MM VII to VIII. From October 2007 through August 2010, messages were sent 9 times for earthquakes of magnitude 5.2 to 7.0. There have been a few instances of significantly over-estimating or under-estimating the predicted shaking, but in general the performance of the system has been quite good. The quality of the detection system depends on the dense network of high-quality seismometers that covers the Japanese Islands. Consequently, the system works very well for events on or close to the 4 main islands, but there is more uncertainty for events near the smaller and more distant islands, where the density of instrumentation is much lower. The Early Warning System is also tied to an extensive education program so that the public can react appropriately in the short amount of time given by the warning. There appears to be good public support in Japan, where people have become accustomed to a high level of fast information on a daily basis. There has also been development of a number of specific safety applications in schools and industry that work off the backbone information provided by the national system.

2. Electromagnetic earthquake triggering phenomena: State-of-the-art research and future developments

NASA Astrophysics Data System (ADS)

Zeigarnik, Vladimir; Novikov, Victor

2014-05-01

Unique pulsed-power systems based on solid-propellant magneto-hydrodynamic (MHD) generators, developed in Russia in the 1970s with an output of 10-500 MW and an operation duration of 10 to 15 s, were applied to active electromagnetic monitoring of the Earth's crust to explore its deep structure, to electrical prospecting for oil and gas, and to geophysical studies for earthquake prediction, owing to their high specific power, portability, and capability of operating under harsh climatic conditions. The most interesting and promising results were obtained during geophysical experiments at test sites in the Pamir and Northern Tien Shan mountains, when, after injection of 1.5-2.5 kA of electric current into the Earth's crust through a 4-km-long emitting dipole, variations in regional seismicity were observed (an increase in the number of weak earthquakes within a week). Laboratory experiments performed by different teams of the Institute of Physics of the Earth, the Joint Institute for High Temperatures, and the Research Station of the Russian Academy of Sciences, observing the acoustic-emission behavior of stressed rock samples processed with electric pulses, demonstrated similar patterns: a burst of acoustic emission (formation of cracks) after application of a current pulse to the sample. Based on the field and laboratory studies, it was proposed that a new kind of earthquake triggering, electromagnetic initiation of weak seismic events, had been observed, which may be used for safe man-made electromagnetic release of accumulated tectonic stresses and, consequently, for earthquake hazard mitigation. For verification of this hypothesis, additional field experiments were carried out at the Bishkek geodynamic proving ground using the pulsed ERGU-600 facility, which provides 600 A of electric current in the emitting dipole. An analysis of the spatio-temporal redistribution of weak regional seismicity after ERGU-600 pulses, as well as a response

3. Large Earthquakes Disrupt Groundwater System by Breaching Aquitards

NASA Astrophysics Data System (ADS)

Wang, C. Y.; Manga, M.; Liao, X.; Wang, L. P.

2016-12-01

Changes to groundwater systems caused by large earthquakes are widely recognized. Some changes have been attributed to increases in the vertical permeability, but basic questions remain: How do increases in the vertical permeability occur? How frequently do they occur? How fast does the vertical permeability recover after the earthquake? Is there a quantitative measure for detecting the occurrence of aquitard breaching? Here we attempt to answer these questions by examining data accumulated over the past 15 years. Analyses of increased stream discharges and their geochemistry after large earthquakes show evidence that the excess water originates from groundwater released from high elevations by a large increase in the vertical permeability. Water-level data from a dense network of clustered wells in a sedimentary basin near the epicenter of the 1999 M7.6 Chi-Chi earthquake in western Taiwan show that, while most confined aquifers remained confined after the earthquake, about 10% of the clustered wells show evidence of coseismic breaching of aquitards and a great increase in the vertical permeability. Water levels in wells without evidence of coseismic breaching of aquitards show similar tidal responses before and after the earthquake; wells with evidence of coseismic breaching, on the other hand, show distinctly different tidal responses before and after the earthquake, and their aquifers became hydraulically connected for many months thereafter. Breaching of aquitards by large earthquakes has significant implications for a number of societal issues, such as the safety of water resources, the security of underground waste repositories, and the production of oil and gas. The method demonstrated here may be used for detecting the occurrence of aquitard breaching by large earthquakes in other seismically active areas.

4. Heart attacks and the Newcastle earthquake.

PubMed

Dobson, A J; Alexander, H M; Malcolm, J A; Steele, P L; Miles, T A

To test the hypothesis that stress generated by the Newcastle earthquake led to increased risk of heart attack and coronary death. A natural experiment. People living in the Newcastle and Lake Macquarie local government areas of New South Wales, Australia. At 10.27 a.m. on 28 December 1989 Newcastle was struck by an earthquake measuring 5.6 on the Richter scale. Myocardial infarction and coronary death defined by the criteria of the WHO MONICA Project and hospital admissions for coronary disease before and after the earthquake and in corresponding periods in previous years. Well established, concurrent data collection systems were used. There were six fatal myocardial infarctions and coronary deaths among people aged under 70 years after the earthquake in the period 28-31 December 1989. Compared with the average number of deaths at this time of year this was unusually high (P = 0.016). Relative risks for this four-day period were: fatal myocardial infarction and coronary death, 1.67 (95% confidence interval [CI]: 0.72, 3.17); non-fatal definite myocardial infarction, 1.05 (95% CI: 0.05, 2.22); non-fatal possible myocardial infarction, 1.34 (95% CI: 0.67, 1.91); hospital admissions for myocardial infarction or other ischaemic heart disease, 1.27 (95% CI: 0.83, 1.66). There was no evidence of increased risk during the following four months. The magnitude of increased risk of death was slightly less than that previously reported after earthquakes in Greece. The data provide weak evidence that acute emotional and physical stress may trigger myocardial infarction and coronary death.
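Relative risks with confidence intervals of the kind quoted above can be computed, in form, with the standard log-scale approximation. The counts below are invented for illustration; they are not the Newcastle data:

```python
import math

def relative_risk(a, n1, b, n2):
    """Relative risk of an event in an exposed group (a events among n1
    people) versus a reference group (b events among n2), with an
    approximate 95% confidence interval computed on the log scale."""
    rr = (a / n1) / (b / n2)
    # standard error of log(RR), Katz log-scale approximation
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

rr, lo, hi = relative_risk(6, 1000, 4, 1000)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

With such small event counts the interval spans 1, which mirrors why the study describes its evidence as weak despite an elevated point estimate.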

5. Housing Damage Following Earthquake

NASA Technical Reports Server (NTRS)

1989-01-01

An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

6. Earthquake Potential in Myanmar

NASA Astrophysics Data System (ADS)

Aung, Hla Hla

The Myanmar region is generally believed to be an area of high earthquake potential, even though its seismic activity has been low compared with surrounding regions such as Indonesia, China, and Pakistan. Geoscientists and seismologists have predicted earthquakes to occur in the area north of the Sumatra-Andaman Islands, i.e. the southwest and west parts of Myanmar. Myanmar's tectonic setting relative to East and SE Asia is rather peculiar and unique, with different plate tectonic models, but it is similar to the setting of the western part of North America. Myanmar crustal blocks are caught between the two lithospheric plates of India and Indochina, experiencing oblique subduction with major dextral strike-slip faulting along the Sagaing fault. Seismic tomography and the thermal structure of the India plate along the Sunda subduction zone vary from south to north. Strong strain partitioning operates in the central Andaman basin, where crustal fragmentation and northward dispersion of the Burma plate by a back-arc spreading mechanism have been ongoing since the Neogene. The northward motion of the Burma plate relative to SE Asia would cause it to dock against the major continent farther north, and this may have caused the accumulation of strain that will in turn be released as earthquakes in the future.

7. Earthquake Source Mechanics

NASA Astrophysics Data System (ADS)

The past 2 decades have seen substantial progress in our understanding of the nature of the earthquake faulting process, but increasingly, the subject has become an interdisciplinary one. Thus, although the observation of radiated seismic waves remains the primary tool for studying earthquakes (and has been increasingly focused on extracting the physical processes occurring in the “source”), geological studies have also begun to play a more important role in understanding the faulting process. Additionally, defining the physical underpinning for these phenomena has come to be an important subject in experimental and theoretical rock mechanics. In recognition of this, a Maurice Ewing Symposium was held at Arden House, Harriman, N.Y. (the former home of the great American statesman Averell Harriman), May 20-23, 1985. The purpose of the meeting was to bring together the international community of experimentalists, theoreticians, and observationalists who are engaged in the study of various aspects of earthquake source mechanics. The conference was attended by more than 60 scientists from nine countries (France, Italy, Japan, Poland, China, the United Kingdom, United States, Soviet Union, and the Federal Republic of Germany).

8. Sand Volcano Following Earthquake

NASA Technical Reports Server (NTRS)

1989-01-01

Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

9. Complex earthquake rupture and local tsunamis

Geist, E.L.

2002-01-01

In contrast to far-field tsunami amplitudes, which are fairly well predicted by the seismic moment of subduction zone earthquakes, there exists significant variation in the scaling of local tsunami amplitude with respect to seismic moment. From a global catalog of tsunami runup observations, this variability is greatest for the most frequently occurring tsunamigenic subduction zone earthquakes, in the magnitude range 7 < Mw < 8.5. Variability in local tsunami runup scaling can be ascribed to tsunami source parameters that are independent of seismic moment: variations in the water depth in the source region, the combination of higher slip and lower shear modulus at shallow depth, and rupture complexity in the form of heterogeneous slip distribution patterns. The focus of this study is on the effect that rupture complexity has on the local tsunami wave field. A wide range of slip distribution patterns are generated using a stochastic, self-affine source model that is consistent with the falloff of far-field seismic displacement spectra at high frequencies. The synthetic slip distributions generated by the stochastic source model are discretized, and the vertical displacement fields from point-source elastic dislocation expressions are superimposed to compute the coseismic vertical displacement field. For shallow subduction zone earthquakes, it is demonstrated that self-affine irregularities of the slip distribution result in significant variations in local tsunami amplitude. The effects of rupture complexity are less pronounced for earthquakes at greater depth or along faults with steep dip angles. For a test region along the Pacific coast of central Mexico, peak nearshore tsunami amplitude is calculated for a large number (N = 100) of synthetic slip distribution patterns, all with identical seismic moment (Mw = 8.1). Analysis of the results indicates that for earthquakes of a fixed location, geometry, and seismic moment, peak nearshore tsunami amplitude can vary by a
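Stochastic self-affine source models of the kind described here build slip distributions whose spectra fall off as a power of wavenumber. A minimal 1-D spectral-synthesis sketch of such a profile; the Hurst exponent and the exact spectral form are illustrative assumptions, not the paper's parameterization:

```python
import cmath
import math
import random

def self_affine_slip(n, hurst=0.5, seed=1):
    """Generate a 1-D slip profile via spectral synthesis: power-law
    Fourier amplitudes ~ k^-(hurst + 0.5) with random phases."""
    random.seed(seed)
    coeffs = [0j] * n
    for k in range(1, n // 2):
        amp = k ** -(hurst + 0.5)
        phase = random.uniform(0, 2 * math.pi)
        coeffs[k] = amp * cmath.exp(1j * phase)
        coeffs[n - k] = coeffs[k].conjugate()   # keep the signal real
    # inverse DFT (naive O(n^2), fine for a sketch)
    slip = []
    for x in range(n):
        s = sum(coeffs[k] * cmath.exp(2j * math.pi * k * x / n)
                for k in range(n))
        slip.append(s.real)
    m = min(slip)
    return [v - m for v in slip]            # shift so slip >= 0

profile = self_affine_slip(64)
print(len(profile), min(profile) >= 0)
```

Each random seed yields a different heterogeneous slip pattern with the same spectral falloff, mimicking how the study draws many (N = 100) realizations at fixed seismic moment.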

10. Global Review of Induced and Triggered Earthquakes

NASA Astrophysics Data System (ADS)

Foulger, G. R.; Wilson, M.; Gluyas, J.; Julian, B. R.; Davies, R. J.

2016-12-01

Natural processes associated with very small incremental stress changes can modulate the spatial and temporal occurrence of earthquakes. These processes include tectonic stress changes, the migration of fluids in the crust, Earth tides, surface ice and snow loading, heavy rain, atmospheric pressure, sediment unloading and groundwater loss. It is thus unsurprising that large anthropogenic projects which may induce stress changes of a similar size also modulate seismicity. As human development accelerates and industrial projects become larger in scale and more numerous, the number of such cases is increasing. That mining and water-reservoir impoundment can induce earthquakes has been accepted for several decades. Now, concern is growing about earthquakes induced by activities such as hydraulic fracturing for shale-gas extraction and waste-water disposal via injection into boreholes. As hydrocarbon reservoirs enter their tertiary phases of production, seismicity may also increase there. The full extent of human activities thought to induce earthquakes is, however, much wider than generally appreciated. We have assembled as near complete a catalog as possible of cases of earthquakes postulated to have been induced by human activity. Our database contains a total of 705 cases and is probably the largest compilation made to date. We include all cases where reasonable arguments have been made for anthropogenic induction, even where these have been challenged in later publications. Our database presents the results of our search but leaves judgment about the merits of individual cases to the user. We divide anthropogenic earthquake-induction processes into: a) Surface operations, b) Extraction of mass from the subsurface, c) Introduction of mass into the subsurface, and d) Explosions. Each of these categories is divided into sub-categories. In some cases, categorization of a particular case is tentative because more than one anthropogenic activity may have preceded or been

11. Distant, delayed and ancient earthquake-induced landslides

NASA Astrophysics Data System (ADS)

Havenith, Hans-Balder; Torgoev, Almaz; Braun, Anika; Schlögel, Romy; Micu, Mihai

2016-04-01

On the basis of a new classification of seismically induced landslides we outline particular effects related to the delayed and distant triggering of landslides. These cannot be predicted by state-of-the-art methods. First, for about a dozen events the 'predicted' extension of the affected area is clearly underestimated. The most problematic cases are those for which far-distant triggering of landslides has been reported, such as for the 1988 Saguenay earthquake. In Central Asia, such cases have been reported for areas marked by a thick cover of loess. One possible contributing effect could be a low-frequency resonance of the thick soils induced by distant earthquakes, especially those in the Pamir - Hindu Kush seismic region. Such deep-focus and high-magnitude (>>7) earthquakes are also found in Europe, first of all in the Vrancea region (Romania). For this area and others in Central Asia we computed landslide event sizes related to scenario earthquakes with M>7.5. The second particular and challenging type of triggering is the one delayed with respect to the main earthquake event: case histories have been reported for the Racha earthquake in 1991, when several larger landslides only started moving 2 or 3 days after the main shock. Similar observations were also made after other earthquake events in the U.S., such as after the 1906 San Francisco, the 1949 Tacoma, the 1959 Hebgen Lake and the 1983 Borah Peak earthquakes. Here, we will present a series of detailed examples of (partly monitored) mass movements in Central Asia that mainly developed after earthquakes, some even several weeks after the main shock: e.g. the Tektonik and Kainama landslides triggered in 1992 and 2004, respectively. We believe that the development of these massive failures is a consequence of the opening of tension cracks during the seismic shaking and their filling with water during the precipitation that followed the earthquakes. The third particular aspect analysed here is the use of large

12. Machine Learning Seismic Wave Discrimination: Application to Earthquake Early Warning

NASA Astrophysics Data System (ADS)

Li, Zefeng; Meier, Men-Andrin; Hauksson, Egill; Zhan, Zhongwen; Andrews, Jennifer

2018-05-01

Performance of earthquake early warning systems suffers from false alerts caused by local impulsive noise from natural or anthropogenic sources. To mitigate this problem, we train a generative adversarial network (GAN) to learn the characteristics of first-arrival earthquake P waves, using 300,000 waveforms recorded in southern California and Japan. We apply the GAN critic as an automatic feature extractor and train a Random Forest classifier with about 700,000 earthquake and noise waveforms. We show that the discriminator can recognize 99.2% of the earthquake P waves and 98.4% of the noise signals. This state-of-the-art performance is expected to reduce significantly the number of false triggers from local impulsive noise. Our study demonstrates that GANs can discover a compact and effective representation of seismic waves, which has the potential for wide applications in seismology.

13. Earthquakes, September-October 1980

Person, W.J.

1981-01-01

There were two major (magnitudes 7.0-7.9) earthquakes during this reporting period; a magnitude (M) 7.3 in Algeria where many people were killed or injured and extensive damage occurred, and an M=7.2 in the Loyalty Islands region of the South Pacific. Japan was struck by a damaging earthquake on September 24, killing two people and causing injuries. There were no damaging earthquakes in the United States.

14. Earthquakes, November-December 1991

Person, W.J.

1992-01-01

There were three major earthquakes (7.0-7.9) during the last two months of the year: a magnitude 7.0 on November 19 in Colombia, a magnitude 7.4 in the Kuril Islands on December 22, and a magnitude 7.1 in the South Sandwich Islands on December 27. Earthquake-related deaths were reported in Colombia, Yemen, and Iran. There were no significant earthquakes in the United States during this reporting period.

15. Classification of Earthquake-triggered Landslide Events - Review of Classical and Particular Cases

NASA Astrophysics Data System (ADS)

Braun, A.; Havenith, H. B.; Schlögel, R.

2016-12-01

Seismically induced landslides often contribute to a significant degree to the losses related to earthquakes. The identification of the possible extent of landslide-affected areas can help to target emergency measures when an earthquake occurs or improve the resilience of inhabited areas and critical infrastructure in zones of high seismic hazard. Moreover, landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes in paleoseismic studies, allowing us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. Inspired by classical reviews of earthquake-induced landslides, e.g. by Keefer or Jibson, we present here a review of factors contributing to earthquake-triggered slope failures based on an `event-by-event' classification approach. The objective of this analysis is to enable the short-term prediction of earthquake-triggered landslide event sizes in terms of numbers and size of the affected area immediately after an earthquake occurs. Five main factors, `Intensity', `Fault', `Topographic energy', `Climatic conditions' and `Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970) the combination and relative weight of the factors was calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be crosschecked. We present cases where our prediction model performs well and discuss particular cases

16. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: Altai-Sayan Region

NASA Astrophysics Data System (ADS)

Kossobokov, V. G.; Nekrasova, A.

2017-12-01

We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for an empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use parameters A, B, and C of the USLE to estimate, first, the expected maximum credible magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, then map the corresponding expected ground-shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After rigorous testing against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on census of population, buildings inventory, etc.). This USLE-based methodology of seismic hazard and risk assessment is applied to the territory of the Altai-Sayan region of Russia. The study was supported by the Russian Science Foundation, Grant No. 15-17-30020.
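The USLE relationship can be evaluated directly once the regional coefficients are known. The sketch below is a minimal illustration; the coefficient values A, B, C are hypothetical placeholders, not the values estimated for any particular region:

```python
import math

def usle_expected_count(M, L, A, B, C):
    """Expected annual number of earthquakes of magnitude M within an
    area of linear dimension L (km), per the USLE:
    log10 N(M, L) = A + B*(5 - M) + C*log10(L)."""
    return 10 ** (A + B * (5 - M) + C * math.log10(L))

# Hypothetical coefficients, for illustration only
A, B, C = -1.0, 0.9, 1.2

# Expected annual number of magnitude-5.0 events in a 100 km region
print(usle_expected_count(5.0, 100.0, A, B, C))
```

Note how B plays the role of the Gutenberg-Richter b-value (count falls off exponentially with magnitude) while C captures the fractal scaling of source locations with region size.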

17. Earthquake hazard and risk assessment based on Unified Scaling Law for Earthquakes: Greater Caucasus and Crimea

NASA Astrophysics Data System (ADS)

Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

2018-05-01

We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE stands for an empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use parameters A, B, and C of USLE to estimate, first, the expected maximum magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After a rigorous verification against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on census of population, buildings inventory). The methodology of seismic hazard and risk assessment is illustrated by application to the territory of Greater Caucasus and Crimea.

18. Earthquakes, July-August 1992

Person, W.J.

1992-01-01

There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

19. Earthquake Forecasting System in Italy

NASA Astrophysics Data System (ADS)

Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

2017-12-01

In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at quantitatively evaluating earthquake prediction and forecast models through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequences (ETAS) model and the Short-Term Earthquake Probabilities (STEP) model. Here, we report the results of OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence in the Central Apennines (Italy).

20. ViscoSim Earthquake Simulator

Pollitz, Fred

2012-01-01

Synthetic seismicity simulations have been explored by the Southern California Earthquake Center (SCEC) Earthquake Simulators Group in order to guide long‐term forecasting efforts related to the Unified California Earthquake Rupture Forecast (Tullis et al., 2012a). In this study I describe the viscoelastic earthquake simulator (ViscoSim) of Pollitz (2009). Recapitulating to a large extent material previously presented by Pollitz (2009, 2011), I describe its implementation of synthetic ruptures and how it differs from other simulators being used by the group.

1. Earthquakes in the United States

Stover, C.

1977-01-01

To supplement data in the report Preliminary Determination of Epicenters (PDE), the National Earthquake Information Service (NEIS) also publishes a quarterly circular, Earthquakes in the United States. This provides information on the felt area of U.S. earthquakes and their intensity. The main purpose is to describe the larger effects of these earthquakes so that they can be used in seismic risk studies, site evaluations for nuclear power plants, and in answering inquiries by the general public.

2. Earthquake clustering in modern seismicity and its relationship with strong historical earthquakes around Beijing, China

NASA Astrophysics Data System (ADS)

Wang, Jian; Main, Ian G.; Musson, Roger M. W.

2017-11-01

Beijing, China's capital city, is located in a typical intraplate seismic belt, with relatively high-quality instrumental catalogue data available since 1970. The Chinese historical earthquake catalogue contains six strong historical earthquakes of Ms ≥ 6 around Beijing, the earliest in 294 AD. This poses a significant potential hazard to one of the most densely populated and economically active parts of China. In some intraplate areas, persistent clusters of events associated with historical events can occur over centuries, for example, the ongoing sequence in the New Madrid zone of the eastern US. Here we will examine the evidence for such persistent clusters around Beijing. We introduce a metric known as the `seismic density index' that quantifies the degree of clustering of seismic energy release. For a given map location, this multi-dimensional index depends on the number of events, their magnitudes, and the distances to the locations of the surrounding population of earthquakes. We apply the index to modern instrumental catalogue data between 1970 and 2014, and identify six clear candidate zones. We then compare these locations to earthquake epicentre and seismic intensity data for the six largest historical earthquakes. Each candidate zone contains one of the six historical events, and the location of peak intensity is within 5 km or so of the reported epicentre in five of these cases. In one case, the great Ms 8 earthquake of 1679, the peak is closer to the area of strongest shaking (Intensity XI or more) than the reported epicentre. The present-day event rates are similar to those predicted by the modified Omori law but there is no evidence of ongoing decay in event rates. Accordingly, the index is more likely to be picking out the location of persistent weaknesses in the lithosphere. Our results imply zones of high seismic density index could be used in principle to indicate the location of unrecorded historical or palaeoseismic events, in China and
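The abstract describes the seismic density index only qualitatively (counts, magnitudes, and distances), so the following is merely one plausible illustration of such a clustering measure, not the authors' actual formula: a seismic-energy proxy per event, down-weighted by distance to the map location. Function name, weighting, and constants are all assumptions:

```python
import math

def density_index(site, events, d0=1.0):
    """Illustrative clustering measure at map location `site`:
    sum over events (x, y, M) of an energy proxy 10**(1.5*M),
    divided by distance to the site (d0 regularizes zero distance)."""
    return sum(
        10 ** (1.5 * m) / (math.dist(site, (x, y)) + d0)
        for x, y, m in events
    )

# A tight cluster of events near the site scores far higher than the
# same events spread tens of kilometres away.
cluster = [(0.1, 0.0, 5.0), (0.0, 0.2, 5.5), (0.2, 0.1, 6.0)]
spread = [(50.0, 0.0, 5.0), (0.0, 60.0, 5.5), (70.0, 10.0, 6.0)]
print(density_index((0.0, 0.0), cluster) > density_index((0.0, 0.0), spread))  # prints True
```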

3. Remote sensing and earthquake risk: A (re)insurance perspective

NASA Astrophysics Data System (ADS)

Smolka, Anselm; Siebert, Andreas

2013-04-01

The insurance sector is faced with two issues regarding earthquake risk: the estimation of rarely occurring losses from large events and the assessment of the average annual net loss. For this purpose, knowledge is needed of actual event losses, of the distribution of exposed values, and of their vulnerability to earthquakes. To what extent can remote sensing help the insurance industry fulfil these tasks, and what are its limitations? In consequence of more regular and high-resolution satellite coverage, we have seen earth observation and remote sensing methods develop over the past years to a stage where they appear to offer great potential for addressing some shortcomings of the data underlying risk assessment. These include lack of statistical representativeness and lack of topicality. Here, remote sensing can help in the following areas: • Inventories of exposed objects (pre- and post-disaster) • Projection of small-scale ground-based vulnerability classification surveys to a full inventory • Post-event loss assessment But especially from an insurance point of view, challenges remain. The strength of airborne remote sensing techniques lies in outlining heavily damaged areas where damage is caused by easily discernible structural failure, i.e. total or partial building collapse. Examples are the Haiti earthquake (with minimal insured loss) and the tsunami-stricken areas in the Tohoku district of Japan. What counts for insurers, however, is the sum of monetary losses. The Chile, the Christchurch and the Tohoku earthquakes each caused insured losses in the two-digit billion dollar range. By far the greatest proportion of these insured losses were due to non-structural damage to buildings, machinery and equipment. Even with the Tohoku event, no more than 30% of the total material damage was caused by the tsunami according to preliminary surveys, and this figure includes damage due to earthquake shock which was unrecognisable after the passage of the tsunami

4. Glacial Earthquakes: Monitoring Greenland's Glaciers Using Broadband Seismic Data

NASA Astrophysics Data System (ADS)

Olsen, K.; Nettles, M.

2017-12-01

The Greenland ice sheet currently loses 400 Gt of ice per year, and up to half of that mass loss comes from icebergs calving from marine-terminating glaciers (Enderlin et al., 2014). Some of the largest icebergs produced by Greenland's glaciers generate magnitude 5 seismic signals when they calve. These glacial earthquakes are recorded by seismic stations around the world. Full-waveform inversion and analysis of glacial earthquakes provides a low-cost tool to identify where and when gigaton-sized icebergs calve, and to track this important mass-loss mechanism in near-real-time. Fifteen glaciers in Greenland are known to have produced glacial earthquakes, and the annual number of these events has increased by a factor of six over the past two decades (e.g., Ekström et al., 2006; Olsen and Nettles, 2017). Since 2000, the number of glacial earthquakes on Greenland's west coast has increased dramatically. Our analysis of three recent years of data shows that more glacial earthquakes occurred on Greenland's west coast from 2011 - 2013 than ever before. In some cases, glacial-earthquake force orientations allow us to identify which section of a glacier terminus produced the iceberg associated with a particular event. We are able to track the timing of major changes in calving-front orientation at several glaciers around Greenland, as well as progressive failure along a single calving front over the course of hours to days. Additionally, the presence of glacial earthquakes resolves a glacier's grounded state, as glacial earthquakes occur only when a glacier terminates close to its grounding line.

5. Robust method to detect and locate local earthquakes by means of amplitude measurements.

NASA Astrophysics Data System (ADS)

del Puy Papí Isaba, María; Brückl, Ewald

2016-04-01

In this study we present a robust new method to detect and locate medium and low magnitude local earthquakes. This method is based on an empirical model of the ground motion obtained from amplitude data of earthquakes in the area of interest, which were located using traditional methods. The first step of our method is the computation of maximum resultant ground velocities in sliding time windows covering the whole period of interest. In the second step, these maximum resultant ground velocities are back-projected to every point of a grid covering the whole area of interest while applying the empirical amplitude - distance relations. We refer to these back-projected ground velocities as pseudo-magnitudes. The number of operating seismic stations in the local network equals the number of pseudo-magnitudes at each grid-point. Our method introduces the new idea of selecting the minimum pseudo-magnitude at each grid-point for further analysis instead of searching for a minimum of the L2 or L1 norm. In case no detectable earthquake occurred, the spatial distribution of the minimum pseudo-magnitudes constrains the magnitude of weak earthquakes hidden in the ambient noise. In the case of a detectable local earthquake, the spatial distribution of the minimum pseudo-magnitudes shows a significant maximum at the grid-point nearest to the actual epicenter. The application of our method is restricted to the area confined by the convex hull of the seismic station network. Additionally, one must ensure that there are no dead traces involved in the processing. Compared to methods based on L2 and even L1 norms, our new method is almost wholly insensitive to outliers (data from locally disturbed seismic stations). A further advantage is the fast determination of the epicenter and magnitude of a seismic event located within a seismic network. This is possible due to the method of obtaining and storing a back-projected matrix, independent of the registered amplitude, for each seismic
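The back-projection idea can be sketched as follows. This is a simplified illustration, assuming a hypothetical amplitude-distance relation of local-magnitude form (the real method calibrates an empirical relation for the area of interest); the grid point where the minimum pseudo-magnitude peaks is the epicenter estimate:

```python
import math

def pseudo_mag(amp, dist_km):
    # Hypothetical empirical relation: M = log10(amp) + 1.66*log10(R)
    return math.log10(amp) + 1.66 * math.log10(max(dist_km, 1.0))

def locate(stations, amps, grid):
    """Back-project each station's maximum resultant ground velocity to
    every grid point, keep the minimum pseudo-magnitude per point, and
    return the grid point where that minimum is largest."""
    best, best_val = None, -math.inf
    for g in grid:
        pm = min(pseudo_mag(a, math.dist(g, s)) for s, a in zip(stations, amps))
        if pm > best_val:
            best, best_val = g, pm
    return best, best_val

# Synthetic check: a magnitude-3.0 event at (5, 5) inside a station square
stations = [(0, 0), (10, 0), (0, 10), (10, 10)]
amps = [10 ** (3.0 - 1.66 * math.log10(math.dist((5, 5), s))) for s in stations]
grid = [(x, y) for x in range(11) for y in range(11)]
print(locate(stations, amps, grid))  # recovers (5, 5)
```

Taking the minimum over stations, rather than an L2 or L1 misfit, is what makes the scheme insensitive to a single disturbed station: one outlier can only lower a pseudo-magnitude, never fake a spurious maximum.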

6. Principles for selecting earthquake motions in engineering design of large dams

Krinitzsky, E.L.; Marcuson, William F.

1983-01-01

This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable for other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analysis is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong-motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and its spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives the best results. Each part of the site investigation requires a number of decisions. In some cases, the decision to use a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful. For example, peak motions at

7. Urban Earthquake Shaking and Loss Assessment

NASA Astrophysics Data System (ADS)

Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

2009-04-01

of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis will consist of grid (geo-cell) based urban building and demographic inventories. For building grouping the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships to be used can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be used if necessary. ELER Level 2 analysis will include calculation of direct monetary losses as a result of building damage, which will allow for repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 analysis loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons using different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of the post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte-Carlo type simulations and earthquake insurance applications.

8. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

NASA Astrophysics Data System (ADS)

Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

2018-02-01

The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of Mλ ≥ 7.0 and Mλ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be Mσ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best-fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with Mλ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with Mλ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 Mλ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 Mλ ≥ 7.0 in each catalog and
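The Weibull-shape test above can be sketched generically. The following is a standard maximum-likelihood fit of the shape parameter β by Newton iteration (not the authors' code), applied to synthetic exponential interevent data, for which β should come out close to 1; β < 1 would indicate clustering:

```python
import math
import random

def weibull_shape_mle(data, iters=50):
    """Maximum-likelihood estimate of the Weibull shape parameter beta.
    Solves sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0 by Newton's method."""
    logs = [math.log(x) for x in data]
    mean_log = sum(logs) / len(logs)
    k = 1.0  # initial guess
    for _ in range(iters):
        xk = [x ** k for x in data]
        s1 = sum(xk)
        s2 = sum(x * l for x, l in zip(xk, logs))
        s3 = sum(x * l * l for x, l in zip(xk, logs))
        f = s2 / s1 - 1.0 / k - mean_log
        df = (s3 * s1 - s2 * s2) / (s1 * s1) + 1.0 / (k * k)
        k -= f / df
    return k

random.seed(0)
# Exponentially distributed "interevent counts" (a random process)
sample = [random.expovariate(1.0) for _ in range(5000)]
print(round(weibull_shape_mle(sample), 2))  # close to 1.0
```

With real interevent counts in place of `sample`, a fitted β significantly below 1 is the signature of temporal clustering in natural time.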

9. Rapid Estimates of Rupture Extent for Large Earthquakes Using Aftershocks

NASA Astrophysics Data System (ADS)

Polet, J.; Thio, H. K.; Kremer, M.

2009-12-01

The spatial distribution of aftershocks is closely linked to the rupture extent of the mainshock that preceded them, and a rapid analysis of aftershock patterns therefore has potential for use in near real-time estimates of earthquake impact. The correlation between aftershocks and slip distribution has frequently been used to estimate the fault dimensions of large historic earthquakes for which no, or insufficient, waveform data is available. With the advent of earthquake inversions that use seismic waveforms and geodetic data to constrain the slip distribution, the study of aftershocks has recently been largely focused on enhancing our understanding of the underlying mechanisms in a broader earthquake mechanics/dynamics framework. However, in a near real-time earthquake monitoring environment, in which aftershocks of large earthquakes are routinely detected and located, these data may also be effective in determining a fast estimate of the mainshock rupture area, which would aid in the rapid assessment of the impact of the earthquake. We have analyzed a considerable number of large recent earthquakes and their aftershock sequences and have developed an effective algorithm that determines the rupture extent of a mainshock from its aftershock distribution, in a fully automatic manner. The algorithm automatically removes outliers by spatial binning, and subsequently determines the best-fitting “strike” of the rupture and its length by projecting the aftershock epicenters onto a set of lines that cross the mainshock epicenter with incremental azimuths. For strike-slip or large dip-slip events, for which the surface projection of the rupture is rectilinear, the calculated strike correlates well with the strike of the fault, and the corresponding length, determined from the distribution of aftershocks projected onto the line, agrees well with the rupture length. In the case of a smaller dip-slip rupture with an aspect ratio closer to 1, the procedure gives a measure
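The projection step described above can be sketched as follows; this is a simplified 2-D version in map coordinates (the spatial-binning outlier removal is omitted), with the best strike taken as the azimuth along which the aftershock cloud is most elongated:

```python
import math

def strike_and_length(mainshock, aftershocks, step_deg=5):
    """Scan azimuths through the mainshock epicenter, project aftershock
    epicenters onto each line, and keep the azimuth with the largest
    spread; the projected extent along it estimates rupture length."""
    mx, my = mainshock
    best_az, best_var, best_len = 0, -1.0, 0.0
    for az in range(0, 180, step_deg):
        th = math.radians(az)
        ux, uy = math.sin(th), math.cos(th)  # azimuth measured from north (+y)
        proj = [(x - mx) * ux + (y - my) * uy for x, y in aftershocks]
        var = sum(p * p for p in proj) / len(proj)
        if var > best_var:
            best_az, best_var, best_len = az, var, max(proj) - min(proj)
    return best_az, best_len

# Synthetic aftershocks along a 45-degree strike through the origin
c = math.sqrt(0.5)
shocks = [(t * c, t * c) for t in range(-5, 6)]
print(strike_and_length((0.0, 0.0), shocks))  # → (45, ~10)
```

Using the variance of the projections (rather than fitting a line) keeps the search robust and fast enough for a near real-time setting.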

10. [Bioethics in catastrophe situations such as earthquakes].

PubMed

León, C Francisco Javier

2012-01-01

A catastrophe of the magnitude of the earthquake and tsunami that hit Chile not long ago forces us to raise some questions that we will try to answer from philosophical, ethical and responsibility viewpoints. An analysis of the basic principles of bioethics is also justified. A natural catastrophe is not, by itself, moral or immoral, fair or unfair. However, its consequences could certainly be regarded as such, depending on whether they could have been prevented or mitigated. We will identify those individuals who have the ethical responsibility to attend to the victims, and the ethical principles that must guide the tasks of healthcare and psychological support teams. The minimal indispensable actions to obtain adequate social and legal protection of vulnerable people must be defined according to international guidelines. These reflections are intended to improve the responsibility of the State and the whole community, to efficiently prevent and repair the material and psychological consequences of such a catastrophe.

11. Statistical validation of earthquake related observations

NASA Astrophysics Data System (ADS)

Kossobokov, V. G.

2011-12-01

The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the derived estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions and, therefore, the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance, rejecting the null hypothesis of random coincidence with target earthquakes. We reiterate the suggestion of the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, a sector per each location; (ii) make your bet according to prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
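Steps (i)-(iv) amount to a binomial significance test in which target events fall uniformly on the catalog's location sectors. A minimal Monte-Carlo sketch of that null hypothesis (function name and interface are illustrative):

```python
import random

def roulette_p_value(n_sectors, n_alarm, n_targets, n_hits,
                     trials=200000, seed=42):
    """Chance, under the Seismic Roulette null hypothesis, of scoring at
    least n_hits of n_targets target earthquakes when the alarm covers
    n_alarm of the n_sectors catalog-location sectors."""
    rng = random.Random(seed)
    p = n_alarm / n_sectors  # chance a random spin lands in the alarm
    wins = 0
    for _ in range(trials):
        hits = sum(1 for _ in range(n_targets) if rng.random() < p)
        if hits >= n_hits:
            wins += 1
    return wins / trials

# An alarm covering 10% of the sectors that still catches 4 of 5
# targets is very unlikely to be a coincidence:
print(roulette_p_value(1000, 100, 5, 4))
```

A small p-value rejects the hypothesis that the precursor's success was coincidental; note the alarm fraction, not its geographic area, is what matters, since sectors follow the empirical distribution of earthquake locations.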

12. Human casualties in earthquakes: Modelling and mitigation

Spence, R.J.S.; So, E.K.M.

2011-01-01

Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

13. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

Jones, Lucile M.

1994-01-01

The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
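The qualitative behavior described here (aftershocks temporarily diluting the foreshock probability, which then recovers as they decay) can be illustrated with a toy calculation. The rates and Omori parameters below are hypothetical placeholders, not values from the paper:

```python
# Toy illustration: probability that a new event near a major fault is a
# foreshock, when aftershocks of a previous mainshock inflate the rate of
# non-foreshock events. All rates and parameters are hypothetical.

def aftershock_rate(t_days, k=10.0, c=0.1, p=1.0):
    """Modified Omori law: aftershocks per day, t_days after a mainshock."""
    return k / (t_days + c) ** p

def foreshock_probability(t_days, r_foreshock=0.01, r_background=0.1):
    """P(event is a foreshock) = foreshock rate / total event rate."""
    total = r_foreshock + r_background + aftershock_rate(t_days)
    return r_foreshock / total

# Shortly after the mainshock, aftershocks dominate and the probability is
# low; as they decay, it recovers toward its background value.
p_early = foreshock_probability(1.0)
p_late = foreshock_probability(365.0)
```

As in the abstract, the probability is lowest right after the previous mainshock and climbs back as the aftershock rate decays.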

14. After Fukushima: managing the consequences of a radiological release.

PubMed

Fitzgerald, Joe; Wollner, Samuel B; Adalja, Amesh A; Morhard, Ryan; Cicero, Anita; Inglesby, Thomas V

2012-06-01

Even amidst the devastation following the earthquake and tsunami in Japan that killed more than 20,000 people, it was the accident at the Fukushima Daiichi nuclear power plant that led the country's prime minister, Naoto Kan, to fear for "the very existence of the Japanese nation." While accidents that result in mass radiological releases have been rare throughout the operating histories of existing nuclear power plants, the growing number of plants worldwide increases the likelihood that such releases will occur again in the future. Nuclear power is an important source of energy in the U.S. and will be for the foreseeable future. Accidents far smaller in scale than the one in Fukushima could have major societal consequences. Given the extensive, ongoing Nuclear Regulatory Commission (NRC) and industry assessment of nuclear power plant safety and preparedness issues, the Center for Biosecurity of UPMC focused on offsite policies and plans intended to reduce radiation exposure to the public in the aftermath of an accident. This report provides an assessment of Japan's efforts at nuclear consequence management; identifies concerns with current U.S. policies and practices for "outside the fence" management of such an event in the U.S.; and makes recommendations for steps that can be taken to strengthen U.S. government, industry, and community response to large-scale accidents at nuclear power plants.

15. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

NASA Astrophysics Data System (ADS)

Pasten, D.

2017-12-01

Earthquake complex networks have been shown to reveal specific features in seismic data sets: in space, directed networks exhibit scale-free behavior in the probability distribution of connectivity, while undirected networks exhibit small-world behavior. In this work, we present an earthquake complex network analysis for the large Mw 8.2 earthquake in northern Chile (near Iquique) in April 2014. An earthquake complex network is built by dividing three-dimensional space into cubic cells; a cell that contains a hypocenter becomes a node. Connections between nodes are generated in time: following the time sequence of seismic events, we connect the node of each event to that of the next. This yields two different networks, one directed and one undirected. The directed network takes into account the time direction of the connections, which is very important for the connectivity of the network: the connectivity ki of the i-th node is the number of connections going out of node i plus its self-connections (if two seismic events occur successively in the same cubic cell, we have a self-connection). The undirected network is obtained by removing the directions and the self-connections from the directed network; for undirected networks we consider only whether two nodes are connected or not. We built a directed and an undirected complex network before and after the large Iquique earthquake, using magnitudes greater than Mw = 1.0 and Mw = 3.0. We found that this method can recognize the influence of these small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake on this complex system. The method also shows a difference in the values of the critical exponent γ (for the probability
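The cell-based construction described in the abstract can be sketched as follows; the cell size and the toy catalog are placeholders, and the degree definition follows the abstract (out-edges plus self-connections):

```python
# Sketch of the cell-based earthquake network construction described above.
# Hypocenters are binned into cubic cells; successive events define directed
# edges between their cells (self-loops when both fall in the same cell).
from collections import defaultdict

def build_networks(hypocenters, cell_size):
    """hypocenters: time-ordered (x, y, z) tuples.
    Returns (k, edges): k maps each cell to its directed connectivity
    (out-edges plus self-connections); edges is the undirected edge set
    with directions and self-loops removed."""
    def cell(p):
        return tuple(int(c // cell_size) for c in p)
    k = defaultdict(int)
    edges = set()
    for a, b in zip(hypocenters, hypocenters[1:]):
        k[cell(a)] += 1                                # out-edge or self-loop
        if cell(a) != cell(b):
            edges.add(frozenset((cell(a), cell(b))))   # undirected version
    return dict(k), edges

# Three events: the first two share a cell (a self-connection), the third
# falls in a different cell.
events = [(0.5, 0.5, 0.5), (0.7, 0.2, 0.9), (5.0, 5.0, 5.0)]
k, undirected = build_networks(events, cell_size=1.0)
```

With this toy catalog, the cell (0, 0, 0) has connectivity 2 (one self-connection plus one outgoing edge), while the undirected network keeps a single edge.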

16. Factors affecting household adoption of an evacuation plan in American Samoa after the 2009 earthquake and tsunami.

PubMed

Apatu, Emma J I; Gregg, Chris E; Richards, Kasie; Sorensen, Barbara Vogt; Wang, Liang

2013-08-01

American Samoa is still recovering from the debilitating consequences of the September 29, 2009 tsunami. Little is known about current household preparedness in American Samoa for future earthquakes and tsunamis. Thus, this study sought to enumerate the number of households with an earthquake and tsunami evacuation plan and to identify predictors of having a household evacuation plan through a post-tsunami survey conducted in July 2011. Members of 300 households were interviewed in twelve villages spread across regions of the principal island of Tutuila. Multiple logistic regression showed that being male, having lived in one's home for < 30 years, and having a friend who suffered damage to his or her home during the 2009 tsunami event increased the likelihood of having a household evacuation plan. The prevalence of tsunami evacuation planning was 35%, indicating that survivors might feel that preparation is not necessary given effective adaptive responses during the 2009 event. Results suggest that emergency planners and public health officials should continue with educational outreach to families to spread awareness of the importance of developing plans for future earthquakes and tsunamis to help mitigate human and structural loss from such natural disasters. Additional research is needed to better understand the linkages between pre-event planning and effective evacuation responses as were observed in the 2009 events.

17. Factors Affecting Household Adoption of an Evacuation Plan in American Samoa after the 2009 Earthquake and Tsunami

PubMed Central

Gregg, Chris E; Richards, Kasie; Sorensen, Barbara Vogt; Wang, Liang

2013-01-01

American Samoa is still recovering from the debilitating consequences of the September 29, 2009 tsunami. Little is known about current household preparedness in American Samoa for future earthquakes and tsunamis. Thus, this study sought to enumerate the number of households with an earthquake and tsunami evacuation plan and to identify predictors of having a household evacuation plan through a post-tsunami survey conducted in July 2011. Members of 300 households were interviewed in twelve villages spread across regions of the principal island of Tutuila. Multiple logistic regression showed that being male, having lived in one's home for < 30 years, and having a friend who suffered damage to his or her home during the 2009 tsunami event increased the likelihood of having a household evacuation plan. The prevalence of tsunami evacuation planning was 35%, indicating that survivors might feel that preparation is not necessary given effective adaptive responses during the 2009 event. Results suggest that emergency planners and public health officials should continue with educational outreach to families to spread awareness of the importance of developing plans for future earthquakes and tsunamis to help mitigate human and structural loss from such natural disasters. Additional research is needed to better understand the linkages between pre-event planning and effective evacuation responses as were observed in the 2009 events. PMID:24349889

18. Review of variations in Mw < 7 earthquake motions on position and tec (Mw = 6.5 aegean sea earthquake sample)

NASA Astrophysics Data System (ADS)

Yildirim, O.; Inyurt, S.; Mekik, C.

2015-10-01

Turkey is located in the mid-latitude zone, where tectonic activity is intense. Most recently, an earthquake of magnitude Mw 6.5 occurred offshore in the Aegean Sea on 24 May 2014 at 12:25 UTC and lasted approximately 40 s. The earthquake was felt in Greece, Romania and Bulgaria as well as in Turkey. In recent years, studies have detected ionospheric anomalies of seismic origin using TEC (Total Electron Content) derived from GNSS (Global Navigation Satellite System) signals. In this study, TEC and positional variations have been examined separately for the Aegean Sea earthquake, and the correlation of the ionospheric variation with the positional variation has been investigated. For this purpose, fifteen stations were used in total: four CORS-TR stations in the seismic zone (AYVL, CANA, IPSA, YENC) together with IGS and EUREF stations. The ionospheric and positional variations of the AYVL, CANA, IPSA and YENC stations were examined with the Bernese 5.0 software. Examination of the resulting (PPP-TEC) values shows that at the four stations in Turkey, three days before the earthquake at 08:00 and 10:00 UTC, the TEC values were approximately 4 TECU above the upper-limit TEC value. At the same stations, one day before the earthquake at 06:00, 08:00 and 10:00 UTC, the TEC values were approximately 5 TECU below the lower-limit TEC value. The GIM-TEC values published by the CODE center were also examined. At all stations, three days before the earthquake, the TEC values at 08:00 and 10:00 UTC were approximately 2 TECU above the upper limit, and one day before the earthquake at 06:00, 08:00 and 10:00 UTC approximately 4 TECU below the lower-limit TEC value. Again, by using the same
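Upper- and lower-limit TEC values of the kind referred to above are commonly derived from a sliding window over the preceding observations. The sketch below uses a mean ± 1.34σ bound (roughly 95% confidence), which is one common convention and an assumption here, not a detail stated in the abstract:

```python
# Sliding-window anomaly bounds for a TEC time series (hedged sketch).
# An observation is flagged anomalous when it falls outside mean +/- k*std
# of the preceding `window` samples; k = 1.34 approximates 95% bounds.
import statistics

def tec_anomalies(tec, window=15, k=1.34):
    flags = []
    for i in range(window, len(tec)):
        past = tec[i - window:i]
        mu = statistics.mean(past)
        sigma = statistics.stdev(past)
        upper, lower = mu + k * sigma, mu - k * sigma
        if tec[i] > upper:
            flags.append((i, tec[i] - upper))   # TECU above the upper limit
        elif tec[i] < lower:
            flags.append((i, tec[i] - lower))   # negative: below the lower limit
    return flags

series = [10.0, 10.1] * 10 + [18.0]   # final value is an artificial spike
anomalies = tec_anomalies(series)
```

Only the artificial spike is flagged, with its excess over the upper bound reported in TECU, mirroring the "N TECU above/below the limit" statements in the abstract.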

19. Earthquake Information System

NASA Technical Reports Server (NTRS)

1991-01-01

IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

20. Earthquake Scaling Relations

NASA Astrophysics Data System (ADS)

Jordan, T. H.; Boettcher, M.; Richardson, E.

2002-12-01

Using scaling relations to understand nonlinear geosystems has been an enduring theme of Don Turcotte's research. In particular, his studies of scaling in active fault systems have led to a series of insights about the underlying physics of earthquakes. This presentation will review some recent progress in developing scaling relations for several key aspects of earthquake behavior, including the inner and outer scales of dynamic fault rupture and the energetics of the rupture process. The proximate observations of mining-induced, friction-controlled events obtained from in-mine seismic networks have revealed a lower seismicity cutoff at a seismic moment Mmin near 10^9 N m and a corresponding upper frequency cutoff near 200 Hz, which we interpret in terms of a critical slip distance for frictional drop of about 10^-4 m. Above this cutoff, the apparent stress scales as M^1/6 up to magnitudes of 4-5, consistent with other near-source studies in this magnitude range (see special session S07, this meeting). Such a relationship suggests a damage model in which apparent fracture energy scales with the stress intensity factor at the crack tip. Under the assumption of constant stress drop, this model implies an increase in rupture velocity with seismic moment, which successfully predicts the observed variation in corner frequency and maximum particle velocity. Global observations of oceanic transform faults (OTFs) allow us to investigate a situation where the outer scale of earthquake size may be controlled by dynamics (as opposed to geologic heterogeneity). The seismicity data imply that the effective area for OTF moment release, AE, depends on the thermal state of the fault but is otherwise independent of the fault's average slip rate; i.e., AE ~ AT^β, where AT is the area above a reference isotherm. The data are consistent with β = 1/2 below an upper cutoff moment Mmax that increases with AT and yield the interesting scaling relation Amax ~ AT^1/2. Taken together, the OTF

1. Earthquakes March-April 1992

Person, Waverly J.

1992-01-01

The months of March and April were quite active seismically speaking. There was one major earthquake (7.0). Earthquake-related deaths were reported in Iran, Costa Rica, Turkey, and Germany.

2. Earthquakes on Your Dinner Table

NASA Astrophysics Data System (ADS)

Alexeev, N. A.; Tape, C.; Alexeev, V. A.

2016-12-01

Earthquakes have interesting physics applicable to other phenomena, such as propagation of waves, and they also affect human lives. This study focused on three questions: how depth, distance from the epicenter, and ground hardness affect earthquake strength. The experimental setup consisted of a gelatin slab simulating the crust. The slab was hit with a weight and the earthquake amplitude was measured. It was found that earthquake amplitude was larger when the epicenter was deeper, which contradicts observations and was probably an artifact of the design. Earthquake strength was inversely proportional to the distance from the epicenter, which generally follows reality. Soft and medium jello were implanted into hard jello, and earthquakes were found to be stronger in softer jello, a result of resonant amplification in soft ground. Similar results are found in Minto Flats, where earthquakes are stronger and last longer than in the nearby hills. Earthquake waveforms from Minto Flats showed that the oscillations there have longer periods than in the nearby hills with harder soil. Two gelatin pieces with identical shapes and different hardness were vibrated on a platform at varying frequencies in order to demonstrate that their resonant frequencies are statistically different. This phenomenon also occurs in Yukon Flats.

3. Make an Earthquake: Ground Shaking!

ERIC Educational Resources Information Center

Savasci, Funda

2011-01-01

The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

4. Earthquakes Threaten Many American Schools

ERIC Educational Resources Information Center

Bailey, Nancy E.

2010-01-01

Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

5. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

PubMed Central

Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

2014-01-01

Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467

6. Earthquake Catalogue of the Caucasus

NASA Astrophysics Data System (ADS)

Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

2016-12-01

The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain a unified magnitude

7. Influence of very-long-distance earthquakes on the ionosphere?

NASA Astrophysics Data System (ADS)

Liperovskaya, E. V.; Meister, C.-V.; Biagi, P.-F.; Liperovsky, V. A.; Rodkin, M. V.

2009-04-01

In the present work, variations of the critical frequency foF2 obtained every hour by the ionospheric sounding station Tashkent (41.3°N, 69.6°E) in the years 1964-1996 are considered. Mean daytime values between 11 LT and 16 LT are investigated. Disturbances of foF2 related to earthquakes are studied against the background of seasonal, geomagnetic, 11-year and 27-day solar variations. Normalized values F are used in the analysis, obtained by removing the seasonal trend: the mean value of foF2 over the 14-day interval from 7 days before the earthquake until 7 days after the event is subtracted, and the result is divided by its standard deviation. Days with high solar (Wolf number > 200) and geomagnetic (ΣKp > 25) disturbances are excluded from the analysis. Using the method of superposed epochs, it is concluded that on the day of the earthquake the foF2 value decreases a) for earthquakes with magnitudes M > 6.5 at any place on the Earth, if the depth h of the epicentre satisfies h < 200 km; b) for earthquakes with magnitudes 6.5 > M > 6.0 occurring in the Middle Asia region, if h < 70 km; and c) for earthquakes with magnitudes 6.0 > M > 5.5 at distances from Tashkent smaller than 1000 km, if h < 70 km. In all investigated cases the reliability of the effect is larger than 95%. The ratio of the number of earthquakes with a decrease of the foF2 value to the number of earthquakes where foF2 grows is about 2. The decrease of the foF2 value is also observed some hours before, and some hours to a day after, the event. Thus, one may assume that before an earthquake happening at a long distance, seismo-gravity waves with periods between half an hour and a few hours propagate through the earth's core in the vicinity of the sounding station. After long-distance earthquakes, seismic waves propagate in the vicinity of the sounding station. But in both cases, the
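The normalization described above (subtracting the mean over the 14-day interval around the event and dividing by its standard deviation) is a windowed z-score. A minimal sketch, with made-up daily values:

```python
# Windowed z-score normalization of daily foF2 values, as described above:
# subtract the mean over the 14-day interval around the event and divide
# by the standard deviation over the same interval.
import statistics

def normalize_foF2(daily_foF2, event_index, half_window=7):
    window = daily_foF2[event_index - half_window:event_index + half_window]
    mu = statistics.mean(window)
    sigma = statistics.stdev(window)
    return [(v - mu) / sigma for v in daily_foF2]

# Fourteen made-up daily values; day 7 (the "event day") is depressed.
values = [5.0, 5.2, 4.9, 5.1, 5.0, 5.3, 4.8, 4.2, 5.1, 5.0, 4.9, 5.2, 5.1, 5.0]
F = normalize_foF2(values, event_index=7)
```

In this toy series the event-day value stands out as a strongly negative F, the kind of depression the superposed-epoch analysis averages over many events.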

8. Intermediate-term earthquake prediction

Knopoff, L.

1990-01-01

The problems in predicting earthquakes have been attacked by phenomenological methods from pre-historic times to the present. The associations of presumed precursors with large earthquakes have often been remarked upon. The difficulty in identifying whether such correlations are due to some chance coincidence or are real precursors is that usually one notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is to be found in the rather long intervals that follow large earthquakes, or in fact is absent in these post-earthquake intervals. If there are enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes

9. Testing earthquake source inversion methodologies

Page, M.; Mai, P.M.; Schorlemmer, D.

2011-01-01

Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

10. Earthquake hazards: a national threat

2006-01-01

Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

11. Early Earthquakes of the Americas

NASA Astrophysics Data System (ADS)

Ni, James

2004-11-01

Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks, Did indigenous native cultures-Indians of the Pacific Northwest, Aztecs, Mayas, and Incas-document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

12. Testing an earthquake prediction algorithm

Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

1997-01-01

A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991 the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, which success ratio would have been achieved in 53% of random trials with the null hypothesis.
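The random-assignment null hypothesis described above can be made concrete with a binomial tail calculation; the alarm-coverage fraction used below is a hypothetical illustration, not the actual space-time coverage of the M8 test:

```python
# Probability of matching >= k of n target earthquakes by chance, if alarms
# are assigned at random and cover a fraction p of the space-time volume
# (hypothetical p; a binomial model standing in for the null hypothesis).
from math import comb

def p_at_least(k, n, p):
    """Binomial tail: P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# With a hypothetical 40% alarm coverage, matching 8 of 10 target
# earthquakes by luck alone is a low-probability outcome.
chance = p_at_least(8, 10, 0.40)
```

A small tail probability like this is what justifies rejecting the "random coincidence" explanation for the retroactive prediction score.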

13. Earthquake Simulator Finds Tremor Triggers

Johnson, Paul

2018-01-16

Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves (the sounds radiated from earthquakes) can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials, like the type found along certain fault lines across the globe, and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional "aftershock zone" of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

14. Did the Zipingpu Reservoir trigger the 2008 Wenchuan earthquake?

Ge, S.; Liu, M.; Lu, N.; Godt, J.W.; Luo, G.

2009-01-01

The devastating May 2008 Wenchuan earthquake (Mw 7.9) resulted from thrust of the Tibet Plateau on the Longmen Shan fault zone, a consequence of the Indo-Asian continental collision. Many have speculated on the role played by the Zipingpu Reservoir, impounded in 2005 near the epicenter, in triggering the earthquake. This study evaluates the stress changes in response to the impoundment of the Zipingpu Reservoir and assesses their impact on the Wenchuan earthquake. We show that the impoundment could have changed the Coulomb stress by -0.01 to 0.05 MPa at locations and depth consistent with reported hypocenter positions. This level of stress change has been shown to be significant in triggering earthquakes on critically stressed faults. Because the loading rate on the Longmen Shan fault is <0.005 MPa/yr, we thus suggest that the Zipingpu Reservoir potentially hastened the occurrence of the Wenchuan earthquake by tens to hundreds of years. Copyright 2009 by the American Geophysical Union.
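The closing estimate follows from simple clock-advance arithmetic: the static stress change divided by the tectonic loading rate gives the time by which failure is hastened. The bounds below are the values quoted in the abstract:

```python
# Clock-advance estimate: how much earlier a critically stressed fault fails
# when a static Coulomb stress change is added, given the tectonic loading
# rate. Input bounds are the values quoted in the abstract.
def clock_advance_years(stress_change_mpa, loading_rate_mpa_per_yr):
    return stress_change_mpa / loading_rate_mpa_per_yr

# 0.05 MPa at the quoted loading rate (< 0.005 MPa/yr) advances the clock
# by at least a decade; slower loading implies a proportionally larger
# advance, consistent with "tens to hundreds of years".
advance = clock_advance_years(0.05, 0.005)
```

With loading rates well below 0.005 MPa/yr, the same 0.01-0.05 MPa stress change yields advances of many decades to centuries.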

15. Urban Policies and Earthquake Risk Mitigation

NASA Astrophysics Data System (ADS)

Sarlo, Antonella

2008-07-01

The paper proposes some considerations arising from recent research on earthquake risk mitigation, combining mitigation policies and actions with urban development strategies. The objective was to go beyond the classical methodological approach, which aims at a "technical" evaluation of earthquake risk through a procedure correlating the three components of danger, exposure and vulnerability. These research projects experiment, in terms of methodology and application, with a new interpretive and strategic category: the so-called Struttura Urbana Minima (minimum urban structure). The introduction of the Struttura Urbana Minima establishes a different approach to safety in the field of earthquake risk, since it leads to a wider viewpoint, combining the building aspect of the issue with the purely urban one and involving not only town planning but also social and managerial implications. In this sense the underlying logic of this research rests on two fundamental issues: the social awareness of earthquakes, and the inclusion of mitigation policies in the ordinary strategies for town and territory management. Three aspects of the first point, the "social awareness of earthquakes", characterize this issue and must be considered within a prevention policy: the central role of risk as a social production; the central role of local community consent; and the central role of the local community's capability to plan. Therefore consent, understood not only as acceptance but above all as participation in the elaboration and implementation of choices, plays a crucial role in the wider issue of prevention policies. As for the second point, the inclusion of preventive mitigation policies in ordinary strategies for town and territory management demands the identification of criteria of choice and priorities of intervention and

16. Testimonies to the L'Aquila earthquake (2009) and to the L'Aquila process

NASA Astrophysics Data System (ADS)

Kalenda, Pavel; Nemec, Vaclav

2014-05-01

members with manslaughter and negligence for failing to warn the public of the impending risk. Many international organizations initially misinterpreted the accusation and sentence as a problem of the impossibility of predicting earthquakes, and the same occurred when the first-instance verdict was pronounced in October 2012. But this verdict is based exclusively on the personal behaviour of the sentenced persons during the one-hour session of the Great Risk Board in L'Aquila on March 31, 2009, and on the fact that two of them presented the results of the session to the media and the local population immediately afterwards. The terrible consequences of this irresponsible behaviour prompted the final accusation, shared by a relatively small but intellectually advanced number of families of victims of the earthquake. They all had deep confidence in the top Italian seismologists who attended the meeting of the Commission. A special INGV web site founded by the "decreto INGV n.641 del 19/12/2012", which asks for support letters, contains the trial documentation (http://processoaquila.wordpress.com/), including the Italian version of the verdict, unfortunately with incomplete, incorrect and mostly missing English translations.

17. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

2014-01-01

On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents

18. Twitter Seismology: Earthquake Monitoring and Response in a Social World

NASA Astrophysics Data System (ADS)

Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

2011-12-01

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
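A Short-Term-Average / Long-Term-Average detector of the kind described can be sketched in a few lines. The window lengths, trigger threshold, and synthetic one-sample-per-second tweet counts below are illustrative assumptions, not the USGS operational settings:

```python
# Minimal STA/LTA detector applied to a tweet-count time series
# (counts of "earthquake" tweets per second). Illustrative parameters only.

def sta_lta_triggers(counts, sta_len=60, lta_len=600, threshold=5.0):
    """Return indices where the STA/LTA ratio first exceeds the threshold."""
    triggers = []
    armed = True  # report only the onset of each excursion
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len
        lta = sum(counts[i - lta_len:i]) / lta_len
        ratio = sta / lta if lta > 0 else 0.0
        if ratio >= threshold and armed:
            triggers.append(i)
            armed = False          # wait for the ratio to fall back
        elif ratio < threshold:
            armed = True
    return triggers

# Synthetic example: quiet background, then a two-minute burst of tweets.
series = [2] * 1800 + [40] * 120 + [2] * 480
print(sta_lta_triggers(series))
```

Raising the threshold catches fewer events with fewer false triggers, lowering it does the opposite, which is the trade-off the abstract describes.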

19. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

NASA Astrophysics Data System (ADS)

Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

2016-04-01

Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1,300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10⁵ m³), produced major damage to buildings and infrastructure and 500 fatalities; a neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km², mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10⁵ m³) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km³ and 12 km³ produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes, most of them in pyroclastic deposits, with volumes of less than 1×10³ m³. The present work aims to define the relationship between earthquake intensity and the size and areal distribution of the induced landslides described above, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provide useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

20. Foreshock and aftershocks in simple earthquake models.

PubMed

Kazemian, J; Tiampo, K F; Klein, W; Dominguez, R

2015-02-27

Many models of earthquake faults have been introduced that connect Gutenberg-Richter (GR) scaling to triggering processes. However, natural earthquake fault systems are composed of a variety of different geometries and materials and the associated heterogeneity in physical properties can cause a variety of spatial and temporal behaviors. This raises the question of how the triggering process and the structure interact to produce the observed phenomena. Here we present a simple earthquake fault model based on the Olami-Feder-Christensen and Rundle-Jackson-Brown cellular automata models with long-range interactions that incorporates a fixed percentage of stronger sites, or asperity cells, into the lattice. These asperity cells are significantly stronger than the surrounding lattice sites but eventually rupture when the applied stress reaches their higher threshold stress. The introduction of these spatial heterogeneities results in temporal clustering in the model that mimics that seen in natural fault systems along with GR scaling. In addition, we observe sequences of activity that start with a gradually accelerating number of larger events (foreshocks) prior to a main shock that is followed by a tail of decreasing activity (aftershocks). This work provides further evidence that the spatial and temporal patterns observed in natural seismicity are strongly influenced by the underlying physical properties and are not solely the result of a simple cascade mechanism.
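A toy version of the model described, an OFC-style cellular automaton with long-range stress transfer and a fixed fraction of stronger asperity cells, might look like the sketch below. The lattice size, threshold values, dissipation factor, and number of neighbours are illustrative choices, not the authors' parameters:

```python
import random

def run_model(n_sites=2000, asperity_frac=0.05, n_events=200, seed=1):
    """OFC-style automaton: uniform loading, threshold failure, and
    redistribution of stress to random long-range neighbours."""
    rng = random.Random(seed)
    # A fixed fraction of sites are asperities with a higher failure threshold.
    thresholds = [3.0 if rng.random() < asperity_frac else 1.0
                  for _ in range(n_sites)]
    stress = [rng.uniform(0.0, 1.0) for _ in range(n_sites)]
    sizes = []
    for _ in range(n_events):
        # Uniform tectonic loading: raise every site until the weakest fails.
        gap = min(thresholds[i] - stress[i] for i in range(n_sites))
        stress = [s + gap + 1e-12 for s in stress]
        failing = [i for i in range(n_sites) if stress[i] >= thresholds[i]]
        size = 0
        while failing:
            i = failing.pop()
            if stress[i] < thresholds[i]:
                continue  # already relaxed earlier in this cascade
            size += 1
            share = 0.8 * stress[i] / 4  # 20% of the stress is dissipated
            stress[i] = 0.0
            for _ in range(4):           # four random long-range neighbours
                j = rng.randrange(n_sites)
                stress[j] += share
                if stress[j] >= thresholds[j]:
                    failing.append(j)
        sizes.append(size)
    return sizes

sizes = run_model()
print(len(sizes), max(sizes))
```

Because asperity sites store up to three loading cycles' worth of stress before rupturing, their failures tend to launch larger cascades, which is the mechanism the abstract invokes for temporal clustering.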

1. Some more earthquakes from medieval Kashmir

NASA Astrophysics Data System (ADS)

Ahmad, Bashir; Shafi, Muzamil

2014-07-01

Kashmir has the peculiarity of a written history spanning almost 5,000 years. However, the description of earthquakes in the archival contents is patchy prior to 1500 a.d. Moreover, recent searches show that the catalogs presently in use contain gaps in time, especially at the medieval level (1128-1586 a.d.). The presence of different ruling elites, in association with the socioeconomic and political conditions, has in many ways confused the historical context of the medieval sources. However, through a meticulous review of the Sanskrit sources (between the twelfth and sixteenth centuries), it has been possible to identify a fair number of earthquakes (eight seismic events) that do not appear in the published catalogs of Kashmir or whose dates are very difficult to establish. Moreover, the historical sources reveal that, except for events which occurred during Sultan Skinder's rule (1389-1413) and during the reign of King Zain-ul-Abidin (1420-1470), all the rediscovered seismic events fell into oblivion, mainly because the available sources devoted their interest to military events, which often overshadowed and even concealed natural events like earthquakes, leaving fragmentary accounts of little value for the macroseismic intensity evaluation necessary for more efficient seismic hazard assessment.

2. Earthquake Emergency Education in Dushanbe, Tajikistan

ERIC Educational Resources Information Center

Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

2010-01-01

We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness among students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

3. Earthquake Hazard Assessment: Basics of Evaluation

NASA Astrophysics Data System (ADS)

Kossobokov, Vladimir

2016-04-01

Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or, in a few rare cases, centuries). Evidently, all this complicates reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic in the frame of the most popular objectivists' viewpoint on probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s, for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e. the rate of failures and the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified briefly with a few examples, which are analysed in more detail in a poster of
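The Error Diagram mentioned above plots the rate of failures-to-predict against the fraction of space placed on alert; a random guess falls on the diagonal ν = 1 − τ. A minimal sketch of computing one diagram point for a gridded hazard map follows; the cell rates, event counts, and alarm fraction are invented for illustration:

```python
def molchan_point(rates, events, alarm_fraction):
    """Alert the highest-rate cells covering `alarm_fraction` of the grid;
    return (tau, nu) = (fraction of space alerted, fraction of events missed)."""
    order = sorted(range(len(rates)), key=lambda i: rates[i], reverse=True)
    n_alarm = round(alarm_fraction * len(rates))
    alarm = set(order[:n_alarm])
    total = sum(events)
    missed = sum(e for i, e in enumerate(events) if i not in alarm)
    return n_alarm / len(rates), missed / total

# Ten cells: forecast rates and observed event counts per cell (invented).
rates  = [9, 7, 5, 3, 1, 1, 1, 1, 1, 1]
events = [4, 3, 1, 1, 0, 0, 1, 0, 0, 0]
tau, nu = molchan_point(rates, events, 0.3)
print(tau, nu)
```

Here alerting 30% of the cells misses 20% of the events, well below the ~70% a random guess would miss at the same τ, so this toy forecast would show skill on the diagram.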

4. Physics of Earthquake Rupture Propagation

NASA Astrophysics Data System (ADS)

Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

2018-05-01

A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

5. Laboratory investigations of earthquake dynamics

NASA Astrophysics Data System (ADS)

Xia, Kaiwen

In this thesis, this is attempted through controlled laboratory experiments designed to mimic natural earthquake scenarios. The earthquake dynamic rupturing process itself is a complicated phenomenon, involving dynamic friction, wave propagation, and heat production. Because controlled experiments can produce results without the assumptions needed in theoretical and numerical analysis, the experimental method is advantageous over theoretical and numerical methods. Our laboratory fault is composed of carefully cut photoelastic polymer plates (Homalite-100, Polycarbonate) held together by uniaxial compression. As a unique element of the experimental design, a controlled exploding-wire technique provides the triggering mechanism of laboratory earthquakes. Three important components of real earthquakes (i.e., pre-existing fault, tectonic loading, and triggering mechanism) correspond to and are simulated by frictional contact, uniaxial compression, and the exploding-wire technique. Dynamic rupturing processes are visualized using the photoelastic method and are recorded via a high-speed camera. Our experimental methodology, which is full-field, in situ, and non-intrusive, has better control and diagnostic capacity than other existing experimental methods. Using this experimental approach, we have investigated several problems: the dynamics of earthquake faulting along homogeneous faults separating identical materials, earthquake faulting along inhomogeneous faults separating materials with different wave speeds, and earthquake faulting along faults with a finite low-wave-speed fault core. We have observed supershear ruptures, sub-Rayleigh to supershear rupture transition, crack-like to pulse-like rupture transition, self-healing (Heaton) pulses, and rupture directionality.

6. Global consequences of unsafe abortion.

PubMed

Singh, Susheela

2010-11-01

Unsafe abortion is a significant cause of death and ill health in women in the developing world. A substantial body of research on these consequences exists, although studies are of variable quality. However, unsafe abortion has a number of other significant consequences that are much less widely recognized. These include the economic consequences, the immediate costs of providing medical care for abortion-related complications, the costs of medical care for longer-term health consequences, lost productivity to the country, the impact on families and the community, and the social consequences that affect women and families. This article will review the scientific evidence on the consequences of unsafe abortion, highlight gaps in the evidence base, suggest areas where future research efforts are needed, and speculate on the future situation regarding consequences and evidence over the next 5-10 years. The information provided is useful and timely given the current heightened interest in the issue of unsafe abortion, growing from the recent focus of national and international agencies on reducing maternal mortality by 75% by 2015 (as one of the Millennium Development Goals established in 2000).

7. Aftershock communication during the Canterbury Earthquakes, New Zealand: implications for response and recovery in the built environment

Julia Becker,; Wein, Anne; Sally Potter,; Emma Doyle,; Ratliff, Jamie L.

2015-01-01

On 4 September 2010, a Mw7.1 earthquake occurred in Canterbury, New Zealand. Following the initial earthquake, an aftershock sequence was initiated, with the most significant aftershock being a Mw6.3 earthquake occurring on 22 February 2011. This aftershock caused severe damage to the city of Christchurch and building failures that killed 185 people. During the aftershock sequence it became evident that effective communication of aftershock information (e.g., history and forecasts) was imperative to assist with decision making during the response and recovery phases of the disaster, as well as preparedness for future aftershock events. As a consequence, a joint JCDR-USGS research project was initiated to investigate: • How aftershock information was communicated to organisations and to the public; • How people interpreted that information; • What people did in response to receiving that information; • What information people did and did not need; and • What decision-making challenges were encountered relating to aftershocks. Research was conducted by undertaking focus group meetings and interviews with a range of information providers and users, including scientists and science advisors, emergency managers and responders, engineers, communication officers, businesses, critical infrastructure operators, elected officials, and the public. The interviews and focus group meetings were recorded and transcribed, and key themes were identified. This paper focuses on the aftershock information needs for decision-making about the built environment post-earthquake, including those involved in response (e.g., for building assessment and management), recovery/reduction (e.g., the development of new building standards), and readiness (e.g. between aftershocks). The research has found that the communication of aftershock information varies with time, is contextual, and is affected by interactions among roles, by other information, and by decision objectives. A number

8. Prospective Evaluation of the Global Earthquake Activity Rate Model (GEAR1) Earthquake Forecast: Preliminary Results

NASA Astrophysics Data System (ADS)

Strader, Anne; Schorlemmer, Danijel; Beutin, Thomas

2017-04-01

The Global Earthquake Activity Rate Model (GEAR1) is a hybrid seismicity model, constructed from a loglinear combination of smoothed seismicity from the Global Centroid Moment Tensor (CMT) earthquake catalog and geodetic strain rates (Global Strain Rate Map, version 2.1). For the 2005-2012 retrospective evaluation period, GEAR1 outperformed both its parent strain rate and smoothed seismicity forecasts. Since 1 October 2015, GEAR1 has been prospectively evaluated by the Collaboratory for the Study of Earthquake Predictability (CSEP) testing center. Here, we present initial one-year test results for GEAR1, GSRM and GSRM2.1, as well as a localized evaluation of GEAR1 performance. The models were evaluated on the consistency in number (N-test), spatial (S-test) and magnitude (M-test) distribution of forecasted and observed earthquakes, as well as on overall data consistency (CL- and L-tests). Performance at target earthquake locations was compared between models using the classical paired T-test and its non-parametric equivalent, the W-test, to determine whether one model could be rejected in favor of another at the 0.05 significance level. For the evaluation period from 1 October 2015 to 1 October 2016, the GEAR1, GSRM and GSRM2.1 forecasts pass all CSEP likelihood tests. Comparative test results show a statistically significant improvement of GEAR1 performance over both strain rate-based forecasts, both of which can be rejected in favor of GEAR1. Using point-process residual analysis, we investigate the spatial distribution of differences in GEAR1, GSRM and GSRM2.1 model performance to identify regions where the GEAR1 model should be adjusted, information that could not be inferred from CSEP test results. Furthermore, we investigate whether the optimal combination of smoothed seismicity and strain rates remains stable over space and time.
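The N-test mentioned above checks whether the observed number of earthquakes is consistent with a forecast's expected count under a Poisson assumption. A minimal sketch of the commonly used two-sided formulation follows; the rates and significance level are illustrative, not CSEP's operational configuration:

```python
from math import exp

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), summed term by term."""
    if k < 0:
        return 0.0
    term = exp(-mu)
    total = term
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def n_test(forecast_rate, observed, alpha=0.05):
    """Two-sided Poisson number test: the forecast passes if the observed
    count is neither implausibly high (delta1) nor implausibly low (delta2)."""
    delta1 = 1.0 - poisson_cdf(observed - 1, forecast_rate)  # P(X >= observed)
    delta2 = poisson_cdf(observed, forecast_rate)            # P(X <= observed)
    return delta1, delta2, (delta1 >= alpha / 2 and delta2 >= alpha / 2)

# A forecast of 25 events with 30 observed is consistent; 60 observed is not.
print(n_test(25.0, 30)[2], n_test(25.0, 60)[2])
```

The S- and M-tests follow the same quantile-score idea applied to the spatial and magnitude bins of the forecast rather than the total count.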

9. Modified mercalli intensities for nine earthquakes in central and western Washington between 1989 and 1999

Brocher, Thomas M.; Dewey, James W.; Cassidy, John F.

2017-08-15

We determine Modified Mercalli (Seismic) Intensities (MMI) for nine onshore earthquakes of magnitude 4.5 and larger that occurred in central and western Washington between 1989 and 1999, on the basis of effects reported in postal questionnaires, the press, and professional collaborators. The earthquakes studied include four earthquakes of M5 and larger: the M5.0 Deming earthquake of April 13, 1990, the M5.0 Point Robinson earthquake of January 29, 1995, the M5.4 Duvall earthquake of May 3, 1996, and the M5.8 Satsop earthquake of July 3, 1999. The MMI are assigned using data and procedures that evolved at the U.S. Geological Survey (USGS) and its Department of Commerce predecessors and that were used to assign MMI to felt earthquakes occurring in the United States between 1931 and 1986. We refer to the MMI assigned in this report as traditional MMI, because they are based on responses to postal questionnaires and on newspaper reports, and to distinguish them from MMI calculated from data contributed by the public by way of the internet. Maximum traditional MMI documented for the M5 and larger earthquakes are VII for the 1990 Deming earthquake, V for the 1995 Point Robinson earthquake, VI for the 1996 Duvall earthquake, and VII for the 1999 Satsop earthquake; the five other earthquakes were variously assigned maximum intensities of IV, V, or VI. Starting in 1995, the Pacific Northwest Seismic Network (PNSN) published MMI maps for four of the studied earthquakes, based on macroseismic observations submitted by the public by way of the internet. With the availability now of the traditional USGS MMI interpreted for all the sites from which USGS postal questionnaires were returned, the four Washington earthquakes join a rather small group of earthquakes for which both traditional USGS MMI and some type of internet-based MMI have been assigned. The values and distributions of the traditional MMI are broadly similar to the internet-based PNSN intensities; we discuss some

10. Management of limb fractures in a teaching hospital: comparison between Wenchuan and Yushu earthquakes.

PubMed

Min, Li; Tu, Chong-qi; Liu, Lei; Zhang, Wen-li; Yi, Min; Song, Yue-ming; Huang, Fu-guo; Yang, Tian-fu; Pei, Fu-xing

2013-01-01

To comparatively analyze the medical records of patients with limb fractures, as well as the rescue strategy, in the Wenchuan and Yushu earthquakes, so as to provide references for post-earthquake rescue, we retrospectively investigated 944 patients sustaining limb fractures, including 891 in the Wenchuan earthquake and 53 in the Yushu earthquake, who were admitted to West China Hospital (WCH) of Sichuan University. In the Wenchuan earthquake, WCH met three peaks of limb-fracture patient influx, on post-earthquake days (PED) 2, 8 and 14 respectively. Between PED 3-14, 585 patients were transferred from WCH to other hospitals outside Sichuan Province. In the Yushu earthquake, the maximum influx of limb-fracture patients happened on PED 3, and no one was shifted to other hospitals. In both the Wenchuan and Yushu earthquakes, most limb fractures were caused by blunt strike and crush/burying. In the Wenchuan earthquake, there were 396 (396/942, 42.0%) open limb fractures, including 28 Gustilo I, 201 Gustilo II and 167 Gustilo III injuries. In the Yushu earthquake, the incidence of open limb fracture was much lower (6/61, 9.8%). The percentage of patients with acute complications in the Wenchuan earthquake (167/891, 18.7%) was much higher than that in the Yushu earthquake (5/53, 3.8%). In the Wenchuan earthquake rescue, 1,018 surgeries were done, comprising debridement in 376, internal fixation in 283, external fixation in 119, and vacuum sealing drainage in 117, etc. Among the 64 surgeries in the Yushu earthquake rescue, internal fixation for limb fracture was mostly adopted. All patients received proper treatment and survived, except one who died of multiple organ failure in the Wenchuan earthquake. Provision of suitable and sufficient medical care in a catastrophe can only be achieved by construction of a sophisticated national disaster medical system, prediction of injury types and numbers, and confirmation of participating hospitals' exact roles. Based on the valuable rescue experiences

11. The threat of silent earthquakes

Cervelli, Peter

2004-01-01

Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

12. Earthquakes, November-December 1977

Person, W.J.

1978-01-01

In the United States, the largest earthquake during this reporting period was a magnitude 6.6 in the Andreanof Islands, part of the Aleutian Islands chain, on November 4 that caused some minor damage. Northern California was struck by a magnitude 4.8 earthquake on November 22, causing moderate damage in the Willits area. This was the most damaging quake in the United States during the year. Two major earthquakes of magnitude 7.0 or above brought the total number of major earthquakes to 14 for the year.

13. The Pocatello Valley, Idaho, earthquake

Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

1975-01-01

A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. Mountain Daylight Time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The epicenter of the main shock was located at 42.094° N, 112.478° W, with a focal depth of 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

14. Earthquake damage to transportation systems

McCullough, Heather

1994-01-01

Earthquakes represent one of the most destructive natural hazards known to man. A large-magnitude earthquake near a populated area can affect residents over thousands of square kilometers and cause billions of dollars in property damage. Such an event can kill or injure thousands of residents and disrupt the socioeconomic environment for months, sometimes years. A serious result of a large-magnitude earthquake is the disruption of transportation systems, which limits post-disaster emergency response. Movement of emergency vehicles, such as police cars, fire trucks and ambulances, is often severely restricted. Damage to transportation systems is categorized below by cause: ground failure, faulting, vibration damage, and tsunamis.

15. Landslides Triggered by the 2015 Gorkha, Nepal Earthquake

NASA Astrophysics Data System (ADS)

Xu, C.

2018-04-01

The 25 April 2015 Gorkha Mw 7.8 earthquake in central Nepal caused a large number of casualties and serious property losses, and also induced numerous landslides. Based on visual interpretation of high-resolution optical satellite images pre- and post-earthquake and on field reconnaissance, we delineated 47,200 coseismic landslides over a total distribution extent of more than 35,000 km², occupying a total area of about 110 km². On the basis of a scaling relationship between landslide area (A) and volume (V), V = 1.3147 × A^1.2085, the total volume of the coseismic landslides is estimated at about 9.64 × 10⁸ m³. Calculation yields a landslide number density of 1.32 km⁻², an area density of 0.31%, and a volume density of 0.027 m. The spatial distribution of these landslides is consistent with that of the mainshock, the aftershocks, and the inferred causative fault, indicating the effect of the earthquake energy release on the pattern of coseismic landslides. This study provides a new, more detailed and objective inventory of the landslides triggered by the Gorkha earthquake, which should be of value for further study of the genesis of coseismic landslides, hazard assessment, and the long-term impact of slope failure on the geological environment in the earthquake-scarred region.
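The density figures quoted above follow directly from the reported totals, and the area-volume scaling law applies per landslide. A quick check using the rounded totals from the abstract (because the extent is "more than 35,000 km²", the number density comes out slightly above the published 1.32 km⁻²):

```python
def landslide_volume(area_m2):
    """Area-volume scaling law quoted in the abstract: V = 1.3147 * A^1.2085."""
    return 1.3147 * area_m2 ** 1.2085

n_landslides = 47_200
extent_km2 = 35_000          # total distribution extent (rounded)
total_area_km2 = 110         # total landslide area (rounded)
total_volume_m3 = 9.64e8     # reported sum of per-landslide volumes

number_density = n_landslides / extent_km2               # landslides per km^2
area_density = total_area_km2 / extent_km2               # dimensionless
volume_density_m = total_volume_m3 / (extent_km2 * 1e6)  # metres

print(round(number_density, 2), round(100 * area_density, 2),
      round(volume_density_m, 3))   # prints 1.35 0.31 0.028
```

Note that the total volume cannot be recovered from the aggregate 110 km² alone: the power law is nonlinear, so it must be summed over the individual landslide areas in the inventory.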

16. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

DOT National Transportation Integrated Search

2015-03-12

USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

17. Nonextensive models for earthquakes.

PubMed

Silva, R; França, G S; Vilar, C S; Alcaniz, J S

2006-02-01

We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.

18. Nonextensive models for earthquakes

SciT

Silva, R.; Franca, G.S.; Vilar, C.S.

2006-02-15

We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.

19. Earthquakes - on the moon

NASA Technical Reports Server (NTRS)

Nakamura, Y.

1981-01-01

Information obtained with the Apollo lunar seismic stations is discussed. The four types of natural seismic sources that have been identified are described, viz., thermal moonquakes, deep moonquakes, meteoroid impacts, and shallow moonquakes. It is suggested that: (1) the thermal quakes represent the slow cracking and movement of surface rocks; (2) the deep quakes are induced by the tide-generating force of the earth's gravity; (3) the meteoroids responsible for most of the observed impacts are in the mass range from 1 to 100 kg and are clustered in groups near the earth's orbit; and (4) the shallow quakes are similar to intraplate earthquakes and indicate that the moon is as seismically active as the interior regions of the earth's tectonic plates. The structure of the lunar interior as inferred from seismic signals due to both the last three natural sources and 'artificial' impacts of used spacecraft is examined in detail.

20. Some comparisons between mining-induced and laboratory earthquakes

McGarr, A.

1994-01-01

Although laboratory stick-slip friction experiments have long been regarded as analogs to natural crustal earthquakes, the potential use of laboratory results for understanding the earthquake source mechanism has not been fully exploited because of essential difficulties in relating seismographic data to measurements made in the controlled laboratory environment. Mining-induced earthquakes, however, provide a means of calibrating the seismic data in terms of laboratory results because, in contrast to natural earthquakes, the causative forces as well as the hypocentral conditions are known. A comparison of stick-slip friction events in a large granite sample with mining-induced earthquakes in South Africa and Canada indicates both similarities and differences between the two phenomena. The physics of unstable fault slip appears to be largely the same for both types of events. For example, both laboratory and mining-induced earthquakes have very low seismic efficiencies η = σ_a/σ̄, where σ_a is the apparent stress and σ̄ is the average stress acting on the fault plane to cause slip; nearly all of the energy released by faulting is consumed in overcoming friction. In more detail, the mining-induced earthquakes differ from the laboratory events in the behavior of η as a function of seismic moment M0. Whereas for the laboratory events η ≈ 0.06 independent of M0, η depends quite strongly on M0 for each set of induced earthquakes, with 0.06 serving, apparently, as an upper bound. It seems most likely that this observed scaling difference is due to variations in slip distribution over the fault plane. In the laboratory, a stick-slip event entails homogeneous slip over a fault of fixed area. For each set of induced earthquakes, the fault area appears to be approximately fixed but the slip is inhomogeneous, due presumably to barriers (zones of no slip) distributed over the fault plane; at constant σ̄, larger
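As a numerical aside, the efficiency quoted above is just the ratio of apparent stress to the average driving stress; the stress values in the example are illustrative placeholders, not McGarr's measurements:

```python
def seismic_efficiency(apparent_stress_pa, average_stress_pa):
    """Ratio of apparent stress to the average stress driving slip.

    Values near 0.06, as reported for the laboratory events, imply that
    nearly all of the energy released by faulting goes into overcoming friction.
    Inputs are in pascals; only their ratio matters.
    """
    return apparent_stress_pa / average_stress_pa

# Illustrative numbers: 3 MPa apparent stress against 50 MPa average stress
eta = seismic_efficiency(3.0e6, 50.0e6)  # -> 0.06
```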

1. Post-Traumatic Stress Disorder and other mental disorders in the general population after Lorca’s earthquakes, 2011 (Murcia, Spain): A cross-sectional study

PubMed Central

Salmerón, Diego; Vilagut, Gemma; Tormo, Mª José; Ruíz-Merino, Guadalupe; Escámez, Teresa; Júdez, Javier; Martínez, Salvador; Koenen, Karestan C.; Navarro, Carmen; Alonso, Jordi; Kessler, Ronald C.

2017-01-01

Aims: To describe the prevalence and severity of mental disorders and to examine differences in risk among those with and without a lifetime history prior to a moderate magnitude earthquake that took place in Lorca (Murcia, Spain) at roughly the mid-point (on May 11, 2011) of the time interval in which a regional epidemiological survey was already being carried out (June 2010–May 2012). Methods: The PEGASUS-Murcia project is a cross-sectional face-to-face interview survey of a representative sample of non-institutionalized adults in Murcia. Main outcome measures are prevalence and severity of anxiety, mood, impulse and substance disorders in the 12 months previous to the survey, assessed using the Composite International Diagnostic Interview (CIDI 3.0). Sociodemographic variables, prior history of any mental disorder and earthquake-related stressors were entered as independent variables in a logistic regression analysis. Findings: A total of 412 participants (response rate: 71%) were interviewed. Significant differences in 12-month prevalence of mental disorders were found in Lorca compared to the rest of Murcia for any disorder (12.8% vs 16.8%), PTSD (3.6% vs 0.5%) and other anxiety disorders (5.3% vs 9.2%) (p ≤ 0.05 for all). No differences were found for 12-month prevalence of any mood or any substance disorder. The two major predictors of developing a 12-month post-earthquake mental disorder were a prior mental disorder and the level of exposure. Other risk factors included female sex and low-average income. Conclusions: PTSD and other mental disorders are commonly associated with earthquake disasters. Prior mental disorders and the level of exposure to the earthquakes are the most important factors for the development of a consequent mental disorder, and this recognition may help to identify those individuals who may most benefit from specific therapeutic intervention. PMID:28723949

2. Post-Traumatic Stress Disorder and other mental disorders in the general population after Lorca's earthquakes, 2011 (Murcia, Spain): A cross-sectional study.

PubMed

Navarro-Mateu, Fernando; Salmerón, Diego; Vilagut, Gemma; Tormo, Mª José; Ruíz-Merino, Guadalupe; Escámez, Teresa; Júdez, Javier; Martínez, Salvador; Koenen, Karestan C; Navarro, Carmen; Alonso, Jordi; Kessler, Ronald C

2017-01-01

To describe the prevalence and severity of mental disorders and to examine differences in risk among those with and without a lifetime history prior to a moderate magnitude earthquake that took place in Lorca (Murcia, Spain) at roughly the mid-point (on May 11, 2011) of the time interval in which a regional epidemiological survey was already being carried out (June 2010–May 2012). The PEGASUS-Murcia project is a cross-sectional face-to-face interview survey of a representative sample of non-institutionalized adults in Murcia. Main outcome measures are prevalence and severity of anxiety, mood, impulse and substance disorders in the 12 months previous to the survey, assessed using the Composite International Diagnostic Interview (CIDI 3.0). Sociodemographic variables, prior history of any mental disorder and earthquake-related stressors were entered as independent variables in a logistic regression analysis. A total of 412 participants (response rate: 71%) were interviewed. Significant differences in 12-month prevalence of mental disorders were found in Lorca compared to the rest of Murcia for any disorder (12.8% vs 16.8%), PTSD (3.6% vs 0.5%) and other anxiety disorders (5.3% vs 9.2%) (p ≤ 0.05 for all). No differences were found for 12-month prevalence of any mood or any substance disorder. The two major predictors of developing a 12-month post-earthquake mental disorder were a prior mental disorder and the level of exposure. Other risk factors included female sex and low-average income. PTSD and other mental disorders are commonly associated with earthquake disasters. Prior mental disorders and the level of exposure to the earthquakes are the most important factors for the development of a consequent mental disorder, and this recognition may help to identify those individuals who may most benefit from specific therapeutic intervention.

3. The relationship between quality of life and psychiatric impairment for a Taiwanese community post-earthquake.

PubMed

Choul, F H-C; Chou, P; Lin, C; Su, Tom T-P; Ou-Yang, W-C; Chien, I C; Su, C-Y; Lui, M-K; Chen, M-C

2004-08-01

The purpose of this study was to investigate the relationship between quality of life and psychiatric impairment in a Taiwanese community located near the epicenter of the 1999 earthquake, as assessed four to six months after the natural catastrophe. Trained assistants interviewed the 4223 respondents using the disaster-related psychological screening test (DRPST), an instrument specifically designed and validated by senior psychiatrists for assessment of psychiatric impairment after natural catastrophe. Additionally, the 36-Item Short-Form Health Survey (SF-36) was used to evaluate quality of life. The collected results were analyzed using Windows SPSS 10.0 software. Psychiatric impairment rated moderate to severe was assessed for 1448 (34.3%) of the responding residents. The 4223 respondents were divided into four psychiatric-impairment groups based on DRPST score: healthy (n = 952); mild impairment (n = 1823); moderate impairment (n = 1126); and severe impairment (n = 322). The four groups were compared on a number of salient factors, including gender, age, current marital status and psychiatric-impairment score, to determine the impact on quality of life. Respondents assessed as psychiatrically impaired tended to be older, female, divorced/widowed, and less educated, and they were more likely to have experienced major familial financial loss as an immediate consequence of the earthquake. Further, the greater the severity of the psychiatric impairment, the lower the scores for quality of life, for both the physical and mental aspects of this important general indicator.

4. Business closure and relocation: a comparative analysis of the Loma Prieta earthquake and Hurricane Andrew.

PubMed

Wasileski, Gabriela; Rodríguez, Havidán; Diaz, Walter

2011-01-01

The occurrence of a number of large-scale disasters or catastrophes in recent years, including the Indian Ocean tsunami (2004), the Kashmir earthquake (2005), Hurricane Katrina (2005) and Hurricane Ike (2008), has raised our awareness of the devastating effects of disasters on human populations and the importance of developing mitigation and preparedness strategies to limit the consequences of such events. However, there is still a dearth of social science research focusing on the socio-economic impact of disasters on businesses in the United States. This paper contributes to this research literature by focusing on the impact of disasters on business closure and relocation through the use of multivariate logistic regression models, specifically focusing on the Loma Prieta earthquake (1989) and Hurricane Andrew (1992). Using a multivariate model, we examine how physical damage to the infrastructure, lifeline disruption and business characteristics, among other factors, affect business closure and relocation following major disasters. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

5. Laboratory constraints on models of earthquake recurrence

Beeler, Nicholas M.; Tullis, Terry; Junger, Jenni; Kilgore, Brian D.; Goldsby, David L.

2014-01-01

In this study, rock friction ‘stick-slip’ experiments are used to develop constraints on models of earthquake recurrence. Constant-rate loading of bare rock surfaces in high quality experiments produces stick-slip recurrence that is periodic at least to second order. When the loading rate is varied, recurrence is approximately inversely proportional to loading rate. These laboratory events initiate due to a slip rate-dependent process that also determines the size of the stress drop [Dieterich, 1979; Ruina, 1983] and as a consequence, stress drop varies weakly but systematically with loading rate [e.g., Gu and Wong, 1991; Karner and Marone, 2000; McLaskey et al., 2012]. This is especially evident in experiments where the loading rate is changed by orders of magnitude, as is thought to be the loading condition of naturally occurring, small repeating earthquakes driven by afterslip, or low-frequency earthquakes loaded by episodic slip. As follows from the previous studies referred to above, experimentally observed stress drops are well described by a logarithmic dependence on recurrence interval that can be cast as a non-linear slip-predictable model. The fault’s rate dependence of strength is the key physical parameter. Additionally, even at constant loading rate the most reproducible laboratory recurrence is not exactly periodic, unlike existing friction recurrence models. We present example laboratory catalogs that document the variance and show that in large catalogs, even at constant loading rate, stress drop and recurrence co-vary systematically. The origin of this covariance is largely consistent with variability of the dependence of fault strength on slip rate. Laboratory catalogs show aspects of both slip and time predictability and successive stress drops are strongly correlated indicating a ‘memory’ of prior slip history that extends over at least one recurrence cycle.

6. Laboratory constraints on models of earthquake recurrence

NASA Astrophysics Data System (ADS)

Beeler, N. M.; Tullis, Terry; Junger, Jenni; Kilgore, Brian; Goldsby, David

2014-12-01

In this study, rock friction "stick-slip" experiments are used to develop constraints on models of earthquake recurrence. Constant rate loading of bare rock surfaces in high-quality experiments produces stick-slip recurrence that is periodic at least to second order. When the loading rate is varied, recurrence is approximately inversely proportional to loading rate. These laboratory events initiate due to a slip-rate-dependent process that also determines the size of the stress drop and, as a consequence, stress drop varies weakly but systematically with loading rate. This is especially evident in experiments where the loading rate is changed by orders of magnitude, as is thought to be the loading condition of naturally occurring, small repeating earthquakes driven by afterslip, or low-frequency earthquakes loaded by episodic slip. The experimentally observed stress drops are well described by a logarithmic dependence on recurrence interval that can be cast as a nonlinear slip predictable model. The fault's rate dependence of strength is the key physical parameter. Additionally, even at constant loading rate the most reproducible laboratory recurrence is not exactly periodic, unlike existing friction recurrence models. We present example laboratory catalogs that document the variance and show that in large catalogs, even at constant loading rate, stress drop and recurrence covary systematically. The origin of this covariance is largely consistent with variability of the dependence of fault strength on slip rate. Laboratory catalogs show aspects of both slip and time predictability, and successive stress drops are strongly correlated indicating a "memory" of prior slip history that extends over at least one recurrence cycle.
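The logarithmic recurrence dependence described above can be written as a one-line model; the coefficients below are illustrative placeholders, not fitted laboratory values:

```python
import math

def stress_drop_pa(recurrence_s, dtau0_pa=3.0e6, a_pa=2.0e5, t_ref_s=1.0):
    """Stress drop with a logarithmic dependence on recurrence interval,
    the nonlinear slip-predictable form described in the abstract.

    dtau0_pa is the stress drop at the reference recurrence time t_ref_s,
    and a_pa scales the rate dependence of fault strength. All three are
    hypothetical placeholder values chosen only for illustration.
    """
    return dtau0_pa + a_pa * math.log(recurrence_s / t_ref_s)
```

The key qualitative behavior is that longer recurrence intervals give larger stress drops, but only weakly (logarithmically), consistent with the "varies weakly but systematically with loading rate" observation.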

7. Source discrimination between Mining blasts and Earthquakes in Tianshan orogenic belt, NW China

NASA Astrophysics Data System (ADS)

Tang, L.; Zhang, M.; Wen, L.

2017-12-01

In recent years, a large number of quarry blasts have been detonated in the Tianshan Mountains of China. It is necessary to discriminate those non-earthquake records from the earthquake catalogs in order to determine the real seismicity of the region. In this study, we have investigated spectral ratios and amplitude ratios as discriminants for regional seismic-event identification using explosions and earthquakes recorded by the Xinjiang Seismic Network (XJSN) of China. We used a training data set of 1071 earthquakes and 2881 non-earthquakes recorded by the XJSN between 2009 and 2016, with both types of events in a comparable local magnitude range (1.5 to 2.9). The non-earthquake and earthquake groups were well separated by Pg/Sg amplitude ratios, with the separation increasing with frequency when averaged over three stations. The 8- to 15-Hz Pg/Sg ratio proved to be the most precise and accurate discriminant, working for more than 90% of the events. In contrast, the P spectral ratio performed considerably worse, with significant overlap (about 60%) between the earthquake and explosion populations. These results show that amplitude ratios between compressional and shear waves discriminate better than low-frequency to high-frequency spectral ratios for individual phases. Neither discriminant alone was able to completely separate the two populations of events; however, a joint discrimination scheme employing simple majority voting reduces misclassifications to 10%. In the study region, 44% of the examined seismic events were determined to be non-earthquakes and 55% to be earthquakes. The earthquakes occurring on land are related to small faults, while the blasts are concentrated in large quarries.
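The amplitude-ratio discrimination with majority voting described above can be sketched minimally as follows; the log-ratio form, the zero decision threshold, and the station averaging are illustrative assumptions, not the study's calibrated values:

```python
import numpy as np

def pg_sg_log_ratio(pg_amp, sg_amp):
    """Log10 Pg/Sg amplitude ratio for one station, assuming the amplitudes
    have already been measured in a fixed frequency band (e.g. 8-15 Hz)."""
    return np.log10(pg_amp / sg_amp)

def classify_event(station_log_ratios, threshold=0.0):
    """Simple majority vote over per-station log ratios.

    Explosions tend to be relatively P-rich, so ratios above the
    (hypothetical) threshold vote 'blast'; otherwise 'earthquake'.
    """
    votes = sum(r > threshold for r in station_log_ratios)
    return "blast" if votes > len(station_log_ratios) / 2 else "earthquake"
```

A joint scheme like the one in the abstract would combine several such discriminants, each casting one vote per event.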

8. Lessons learned from the 2016 Kumamoto earthquake: Building damages and behavior of seismically isolated buildings

NASA Astrophysics Data System (ADS)

Morita, Keiko; Takayama, Mineo

2017-10-01

Powerful earthquakes struck Kumamoto and Oita Prefectures in Kyushu, Japan, beginning with the magnitude 6.5 foreshock at 21:26 JST on 14 April, followed by the magnitude 7.3 mainshock at 1:25 JST on 16 April 2016. The sequence also included more than 1700 perceptible earthquakes as of 13 June. The entire sequence was named the 2016 Kumamoto earthquake by the Japan Meteorological Agency. Thousands of buildings and many roads were damaged, and landslides occurred. The Japanese building standard law was revised in 1981, and structural damage was concentrated in buildings constructed prior to 1981. The areas of Mashiki and southern Aso were the most badly affected, with wooden houses in particular severely damaged. In Japan, Prof. Hideyuki Tada (title at the time) undertook research on laminated rubber bearings in 1978 and put them into practical use in 1981. A single-family house at Yachiyodai, Chiba Prefecture, completed in 1983, was the first seismically isolated building in Japan equipped with laminated rubber bearings. Afterward, this system was gradually adopted, mainly for office buildings such as research laboratories, hospitals, computer centers and other offices. In the 1994 Northridge earthquake, the 1995 Kobe earthquake and the 2011 Tohoku earthquake, seismically isolated buildings demonstrated good performance, and the number of such buildings has since increased, mainly in areas of high earthquake risk. Many people believed that Kumamoto was a low-risk area, yet there were 24 seismically isolated buildings in Kumamoto Prefecture at the time. These buildings performed excellently during the earthquakes, protecting people, the buildings themselves and other important facilities from earthquake damage. The purpose of this paper is to discuss lessons learned from the 2016 Kumamoto earthquake and the behavior of seismically isolated buildings in the earthquake.

9. Unraveling earthquake stresses: Insights from dynamically triggered and induced earthquakes

NASA Astrophysics Data System (ADS)

Velasco, A. A.; Alfaro-Diaz, R. A.

2017-12-01

Induced seismicity, earthquakes caused by anthropogenic activity, has more than doubled in the last several years as a result of practices related to oil and gas production. Furthermore, large earthquakes have been shown to promote the triggering of other events within two fault lengths (static triggering), due to static stresses caused by physical movement along the fault, and also remotely through the passage of seismic waves (dynamic triggering). Thus, in order to understand the mechanisms of earthquake failure, we investigate regions where natural, induced, and dynamically triggered events occur, specifically targeting Oklahoma. We first analyze data from EarthScope's USArray Transportable Array (TA) and local seismic networks, implementing an optimized short-term-average/long-term-average (STA/LTA) detector in order to develop local detection and earthquake catalogs. We then identify triggered events through statistical analysis and perform a stress analysis to gain insight into the stress states leading to triggered earthquake failure. We use our observations to determine the role of different transient stresses in contributing to natural and induced seismicity by comparing these stresses to regional stress orientations. We also delineate critically stressed regions of triggered seismicity that may indicate areas susceptible to earthquake hazards associated with sustained fluid injection in provinces of induced seismicity. Anthropogenic injection and extraction activity can alter the stress state and fluid flow within production basins. By analyzing the stress release of these ancient faults caused by dynamic stresses, we may be able to determine whether fluids are solely responsible for increased seismic activity in induced regions.
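A minimal sketch of the STA/LTA characteristic function that detectors of this kind are built on (window lengths and trigger threshold below are illustrative; the study's "optimized" detector and production codes such as ObsPy use tuned, often recursive, variants):

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Short-term-average / long-term-average ratio of signal energy.

    A transient burst raises the short window average much faster than the
    long window average, so the ratio spikes at event onsets.
    """
    energy = np.asarray(trace, dtype=float) ** 2
    sta = np.convolve(energy, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(energy, np.ones(nlta) / nlta, mode="same")
    lta[lta == 0] = np.finfo(float).tiny  # guard against division by zero
    return sta / lta

def detect(ratio, threshold=3.0):
    """Sample indices where the characteristic function exceeds the trigger level."""
    return np.where(ratio > threshold)[0]
```

On a synthetic trace of low-level background with a short high-amplitude burst, the trigger fires only around the burst.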

10. An integrated analysis on source parameters, seismogenic structure and seismic hazard of the 2014 Ms 6.3 Kangding earthquake

NASA Astrophysics Data System (ADS)

Zheng, Y.

2016-12-01

On November 22, 2014, the Ms 6.3 Kangding earthquake ended a 30-year period without strong earthquakes on the Xianshuihe fault zone. The focal mechanism and centroid depth of the Kangding earthquake are inverted from teleseismic waveforms and regional seismograms with the CAP method. The result shows that the two nodal planes of the focal mechanism are 235°/82°/-173° and 144°/83°/-8°, respectively; the latter nodal plane should be the ruptured fault plane, with a focal depth of 9 km. The rupture process model of the Kangding earthquake is obtained by joint inversion of teleseismic data and regional seismograms. The Kangding earthquake was a bilateral rupture, and the major rupture zone lies within a depth range of 5-15 km, spanning 10 km and 12 km along the dip and strike directions, with a maximum slip of about 0.5 m. Most of the seismic moment was released during the first 5 s, and the magnitude is Mw 6.01, smaller than the model determined from InSAR data. The discrepancy between the co-seismic rupture models of the Kangding earthquake and its Ms 5.8 aftershock and the InSAR model implies that significant afterslip deformation occurred in the two weeks after the mainshock. The afterslip released energy equivalent to an Mw 5.9 earthquake and is mainly concentrated on the northwest side of, and shallower than, the rupture zone. The Coulomb failure stress (CFS) accumulation near the epicenter of the 2014 Kangding earthquake was increased by the 2008 Wenchuan earthquake, implying that the Kangding earthquake could have been triggered by the Wenchuan earthquake. The CFS on the northwest section of the seismic gap along the Kangding-Daofu segment was increased by the Kangding earthquake, and the rupture slip of the Kangding earthquake sequence is too small to release the accumulated strain in the seismic gap. Consequently, the northwest section of the Kangding-Daofu seismic gap is under high seismic hazard in the future.

11. Sociological aspects of earthquake prediction

Spall, H.

1979-01-01

Henry Spall talked recently with Denis Mileti, who is in the Department of Sociology, Colorado State University, Fort Collins, Colo. Dr. Mileti is a sociologist involved with research programs that study the socioeconomic impact of earthquake prediction.

12. The next new Madrid earthquake

SciT

Atkinson, W.

1988-01-01

Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health, to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.

13. Centrality in earthquake multiplex networks

NASA Astrophysics Data System (ADS)

Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

2018-06-01

Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.

14. The nature of earthquake prediction

Lindh, A.G.

1991-01-01

Earthquake prediction is inherently statistical. Although some people continue to think of earthquake prediction as the specification of the time, place, and magnitude of a future earthquake, it has been clear for at least a decade that this is an unrealistic and unreasonable definition. The reality is that earthquake prediction starts from long-term forecasts of place and magnitude, with very approximate time constraints, and progresses, at least in principle, to a gradual narrowing of the time window as data and understanding permit. Primitive long-term forecasts are clearly possible at this time on a few well-characterized fault systems. Tightly focused monitoring experiments aimed at short-term prediction are already underway in Parkfield, California, and in the Tokai region of Japan; only time will tell how much progress will be possible.

15. Earthquakes, May-June, 1992

Person, Waverly J.

1992-01-01

The months of May and June were very active in terms of earthquake occurrence. Six major earthquakes (7.0 ≤ M < 8.0) occurred during this period. These included a magnitude 7.1 in Papua New Guinea on May 15, a magnitude 7.1 followed by a magnitude 7.5 in the Philippine Islands on May 17, a magnitude 7.0 in the Cuba region on May 25, and a magnitude 7.3 in the Santa Cruz Islands of the Pacific on May 27. In the United States, a magnitude 7.6 earthquake struck in southern California on June 28, followed by a magnitude 6.7 quake about three hours later.

16. Geochemical challenge to earthquake prediction.

PubMed Central

Wakita, H

1996-01-01

The current status of geochemical and groundwater observations for earthquake prediction in Japan is described. The development of the observations is discussed in relation to the progress of the earthquake prediction program in Japan. Three major findings obtained from our recent studies are outlined. (i) Long-term radon observation data over 18 years at the SKE (Suikoen) well indicate that the anomalous radon change before the 1978 Izu-Oshima-kinkai earthquake can with high probability be attributed to precursory changes. (ii) It is proposed that certain sensitive wells exist which have the potential to detect precursory changes. (iii) The appearance and nonappearance of coseismic radon drops at the KSM (Kashima) well reflect changes in the regional stress state of an observation area. In addition, some preliminary results of chemical changes of groundwater prior to the 1995 Kobe (Hyogo-ken nanbu) earthquake are presented. PMID:11607665

17. Earthquakes, September-October, 1979

Person, W.J.

1980-01-01

In the United States, California experienced the strongest earthquake in that State since 1971. The quake, of magnitude 6.8, occurred on October 15 in Baja California, Mexico, near the California border, and caused injuries and damage.

18. Earthquakes in Stable Continental Crust.

ERIC Educational Resources Information Center

Johnston, Arch C.; Kanter, Lisa R.

1990-01-01

Discussed are some of the reasons for earthquakes which occur in stable crust away from familiar zones at the ends of tectonic plates. Crust stability and the reactivation of old faults are described using examples from India and Australia. (CW)

19. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

NASA Astrophysics Data System (ADS)

Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

2012-12-01

The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
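The detection idea, flagging a sharp jump in keyword-tweet counts over a trailing baseline, can be sketched as below; the window length, amplification factor, and minimum count are hypothetical placeholders, not the actual TED algorithm's parameters:

```python
from collections import deque

def tweet_detector(minute_counts, baseline_len=60, factor=10.0, min_count=20):
    """Flag minutes where keyword-tweet counts jump well above a trailing baseline.

    minute_counts: per-minute counts of tweets containing "earthquake" (or
    its equivalent in other languages). A minute triggers an alert when its
    count is at least min_count and exceeds factor times the trailing mean.
    All thresholds are illustrative assumptions.
    """
    window = deque(maxlen=baseline_len)
    alerts = []
    for minute, count in enumerate(minute_counts):
        baseline = sum(window) / len(window) if window else 0.0
        if count >= min_count and count > factor * max(baseline, 1.0):
            alerts.append(minute)
        window.append(count)
    return alerts
```

For a steady chatter of a couple of keyword tweets per minute, a sudden burst of 50 in one minute triggers exactly one alert at that minute, then the baseline absorbs the burst.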

20. Mitigating earthquakes; the federal role

Press, F.

1977-01-01

With the rapid approach of a capability to make reliable earthquake forecasts, it is essential that the Federal Government play a strong, positive role in formulating and implementing plans to reduce earthquake hazards. Many steps are being taken in this direction, with the President looking to the Office of Science and Technology Policy (OSTP) in his Executive Office to provide leadership in establishing and coordinating Federal activities.

1. Post-earthquake dilatancy recovery

NASA Technical Reports Server (NTRS)

Scholz, C. H.

1974-01-01

Geodetic measurements of the 1964 Niigata, Japan earthquake and of three other examples are briefly examined. They show exponentially decaying subsidence for a year after the quakes. The observations confirm the dilatancy-fluid diffusion model of earthquake precursors and clarify the extent and properties of the dilatant zone. An analysis using one-dimensional consolidation theory is included which agrees well with this interpretation.

2. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

NASA Astrophysics Data System (ADS)

Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

2017-12-01

Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress
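The "two rupture lengths" rule of thumb quoted above can be made concrete with a magnitude-to-rupture-length scaling. The coefficients below are illustrative round numbers in the style of Wells & Coppersmith (1994), not the relation the authors used:

```python
def rupture_length_km(mw):
    """Approximate rupture length from moment magnitude via a
    log-linear scaling, log10(L) = a + b * Mw. The coefficients are
    illustrative, roughly in the range of published regressions."""
    return 10 ** (-2.44 + 0.59 * mw)

def is_traditional_aftershock(dist_km, mainshock_mw, n_lengths=2.0):
    """Rule of thumb from the abstract: an event within `n_lengths`
    rupture lengths of the main shock counts as a traditional aftershock."""
    return dist_km <= n_lengths * rupture_length_km(mainshock_mw)
```

With these illustrative coefficients an Mw 8.2 rupture is roughly 250 km long, so the two Mw 6.1 events at ~200 km fall inside two rupture lengths while the Puebla event at 600 km falls outside, matching the abstract's classification.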

3. Facts about the Eastern Japan Great Earthquake of March 2011

NASA Astrophysics Data System (ADS)

Moriyama, T.

2011-12-01

The 2011 great earthquake was a magnitude 9.0 (Mw) undersea megathrust earthquake off the coast of Japan that occurred in the early morning (UTC) on Friday, 11 March 2011, with the epicenter approximately 70 kilometres east of the Oshika Peninsula of Tohoku and the hypocenter at an underwater depth of approximately 32 km. It was the most powerful known earthquake to have hit Japan, and one of the five most powerful earthquakes in the world overall since modern record keeping began in 1900. The earthquake triggered extremely destructive tsunami waves of up to 38.9 metres that struck Tohoku, Japan, in some cases traveling up to 10 km inland. In addition to loss of life and destruction of infrastructure, the tsunami caused a number of nuclear accidents, primarily the ongoing level 7 meltdowns at three reactors in the Fukushima I Nuclear Power Plant complex, and the associated evacuation zones affecting hundreds of thousands of residents. The Japanese National Police Agency has confirmed 15,457 deaths, 5,389 injured, and 7,676 people missing across eighteen prefectures, as well as over 125,000 buildings damaged or destroyed. JAXA carried out ALOS emergency observation just after the earthquake occurred, and acquired more than 400 scenes over the disaster area. The coseismic interferogram from InSAR analysis clearly shows the epicentral area and the land-surface deformation over the Tohoku area. By comparing before-and-after satellite images, the large-scale areas damaged by the tsunami were extracted. These images and data can be accessed via the JAXA website and the GEO Tohoku-oki event supersite website.

4. Research in seismology and earthquake engineering in Venezuela

Urbina, L.; Grases, J.

1983-01-01

After the damaging July 29, 1967, earthquake (with a moderate magnitude of 6.3) caused widespread damage to the northern coastal area of Venezuela and to the Caracas Valley, the Venezuelan Government decided to establish a Presidential Earthquake Commission. This commission undertook the task of coordinating the efforts to study the after-effects of the earthquake. The July 1967 earthquake claimed numerous lives and caused extensive damage to the capital of Venezuela. In 1968, the U.S. Geological Survey conducted a seismological field study in the northern coastal area and in the Caracas Valley of Venezuela. The objective was to study the areas that sustained severe, moderate, and no damage to structures. A report entitled Ground Amplification Studies in Earthquake Damage Areas: The Caracas Earthquake of 1967 documented, for the first time, short-period seismic-wave ground-motion amplifications in the Caracas Valley. Figure 1 shows the area of severe damage in the Los Palos Grandes suburb and the correlation with depth of alluvium; the Arabic numerals denote the ground amplification factor at each site in the area. The Venezuelan Government initiated many programs to study the damage sustained in detail and to investigate ongoing construction practices. These actions motivated professionals in the academic, private, and Government sectors to develop further capabilities and self-sufficiency in the fields of engineering and seismology. Funds were allocated to assist in training professionals and technicians and in developing new seismological stations and new national-level programs in earthquake engineering and seismology. A brief description of the ongoing programs in Venezuela is listed below. These programs are being performed by FUNVISIS and by other national organizations listed at the end of this article.

5. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

NASA Astrophysics Data System (ADS)

Whittier, J. C.; Nittel, S.; Subasinghe, I.

2017-10-01

With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism for delivering observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open-source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
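A minimal sketch of the kind of windowed query the paper describes, per-station sliding windows of displacement with a spatial-neighbor check, written as plain Python rather than Spark; the station ids, neighbor table, and 5 cm threshold are hypothetical:

```python
from collections import defaultdict, deque

class DisplacementMonitor:
    """Per-station sliding windows of displacement samples; an event is
    flagged when a station and at least `min_neighbors` of its spatial
    neighbors all exceed a threshold within the current window."""
    def __init__(self, neighbors, window=30, threshold_m=0.05, min_neighbors=2):
        self.neighbors = neighbors  # station id -> list of nearby station ids
        self.windows = defaultdict(lambda: deque(maxlen=window))
        self.threshold = threshold_m
        self.min_neighbors = min_neighbors

    def push(self, station, displacement_m):
        self.windows[station].append(abs(displacement_m))

    def exceeds(self, station):
        w = self.windows[station]
        return bool(w) and max(w) >= self.threshold

    def event_at(self, station):
        if not self.exceeds(station):
            return False
        hits = sum(self.exceeds(n) for n in self.neighbors.get(station, []))
        return hits >= self.min_neighbors
```

In a DSE such as Spark the same logic would be expressed as window aggregations joined on a station-adjacency table rather than explicit per-station state.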

6. A moment in time: emergency nurses and the Canterbury earthquakes.

PubMed

Richardson, S; Ardagh, M; Grainger, P; Robinson, V

2013-06-01

To outline the impact of the Canterbury, New Zealand (NZ) earthquakes on Christchurch Hospital, and the experiences of emergency nurses during this time. NZ has experienced earthquakes and aftershocks centred in the Canterbury region of the South Island. The location of these, around and within the major city of Christchurch, was unexpected and associated with previously unknown fault lines. While the highest magnitude quake occurred in September 2010, registering 7.1 on the Richter scale, it was the magnitude 6.3 event on 22 February 2011 which was associated with the greatest injury burden and loss of life. Staff working in the only emergency department in the city were faced with an external emergency while also being directly affected as part of the disaster. SOURCES OF EVIDENCE: This paper developed following interviews with nurses who worked during this period, and draws on literature related to healthcare responses to earthquakes and natural disasters. The establishment of an injury database allowed for an accurate picture to emerge of the injury burden, and each of the authors was present and worked in a clinical capacity during the earthquake. Nurses played a significant role in the response to the earthquakes and their aftermath. However, little is known regarding the impact of this, either in personal or professional terms. This paper presents an overview of the earthquakes and experiences of nurses working during this time, identifying a range of issues that will benefit from further exploration and research. It seeks to provide a sense of the experiences and the potential meanings that were derived from being part of this 'moment in time'. Examples of innovations in practice emerged during the earthquake response and a number of recommendations for nursing practice are identified. © 2013 The Authors. International Nursing Review © 2013 International Council of Nurses.

7. Large earthquakes and creeping faults

Harris, Ruth A.

2017-01-01

Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

8. On the Diurnal Periodicity of Representative Earthquakes in Greece: Comparison of Data from Different Observation Systems

NASA Astrophysics Data System (ADS)

Desherevskii, A. V.; Sidorin, A. Ya.

2017-12-01

Following the initiation of the Hellenic Unified Seismic Network (HUSN) in late 2007, the quality of observations had improved significantly by 2011. For example, the representative magnitude level decreased considerably and the number of annually recorded events increased. The new observational system greatly expanded the possibilities for studying regularities in seismicity. In view of this, the authors revisited their studies of the diurnal periodicity of representative earthquakes in Greece that was revealed earlier in the earthquake catalog before 2011. We use 18 samples of earthquakes of different magnitudes taken from the catalog of Greek earthquakes from 2011 to June 2016, derive a series of the number of earthquakes for each of them, and calculate its average diurnal course. To increase the reliability of the results, we compared the data for two regions. With a high degree of statistical significance, we found that no diurnal periodicity is present for strongly representative earthquakes. This finding differs from the estimates obtained earlier from an analysis of the catalog of earthquakes in the same area for 1995-2004 and 2005-2010, i.e., before the initiation of the Hellenic Unified Seismic Network. The new results are consistent with the hypothesis of noise discrimination (observational selection), which explains the diurnal variation of earthquakes by the different sensitivity of the seismic network in daytime and nighttime periods.
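The "average diurnal course" used above is simply the normalized distribution of event counts over the 24 hours of the day; a flat profile (about 1/24 per bin) is what the study reports for representative earthquakes. A minimal sketch:

```python
from collections import Counter

def diurnal_course(event_hours):
    """Normalized 24-bin diurnal course: fraction of events in each
    local-time hour bin. A flat profile indicates no diurnal
    periodicity; a day/night asymmetry suggests observational
    selection or a genuine diurnal effect."""
    counts = Counter(h % 24 for h in event_hours)
    total = sum(counts.values())
    return [counts.get(h, 0) / total for h in range(24)]
```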

9. Childhood Obesity Causes & Consequences

MedlinePlus


10. Frictional heating processes during laboratory earthquakes

NASA Astrophysics Data System (ADS)

Aubry, J.; Passelegue, F. X.; Deldicque, D.; Lahfid, A.; Girault, F.; Pinquier, Y.; Escartin, J.; Schubnel, A.

2017-12-01

Frictional heating during seismic slip plays a crucial role in the dynamics of earthquakes because it controls fault weakening. This study proposes (i) to image frictional heating by combining an in-situ carbon thermometer with Raman microspectrometric mapping, (ii) to combine these observations with fault-surface roughness and heat production, and (iii) to estimate the mechanical energy dissipated during laboratory earthquakes. Laboratory earthquakes were performed in a triaxial oil-loading press, at 45, 90, and 180 MPa of confining pressure, using saw-cut samples of Westerly granite. The initial topography of the fault surface was +/- 30 microns. We use a carbon layer as a local temperature tracer on the fault plane and a type K thermocouple to measure temperature approximately 6 mm away from the fault surface. The thermocouple measures the bulk temperature of the fault plane, while the in-situ carbon thermometer images the heterogeneity of heat production at the micro-scale. Raman microspectrometry on the amorphous carbon patch allowed temperature heterogeneities on the fault surface to be mapped after sliding and compared, at the scale of a few micrometers, with the final fault roughness. The maximum temperature achieved during laboratory earthquakes remains high for all experiments and generally increases with confining pressure. In addition, the melted surface area of the fault during seismic slip increases drastically with confining pressure. While melting is systematically observed, the strength drop increases with confining pressure. These results suggest that the dynamic friction coefficient is a function of the area of the fault melted during stick-slip. Using the thermocouple, we inverted for the heat dissipated during each event. We show that for rough faults under low confining pressure, less than 20% of the total mechanical work is dissipated into heat. The ratio of frictional heating vs. total mechanical work decreases with cumulated slip (i.e. number of events), and decreases with

11. Lake deposits record evidence of large post-1505 AD earthquakes in western Nepal

NASA Astrophysics Data System (ADS)

Ghazoui, Z.; Bertrand, S.; Vanneste, K.; Yokoyama, Y.; Van Der Beek, P.; Nomade, J.; Gajurel, A.

2016-12-01

According to historical records, the last large earthquake that ruptured the Main Frontal Thrust (MFT) in western Nepal occurred in 1505 AD. Since then, no evidence of other large earthquakes has been found in historical records or geological archives. In view of the catastrophic consequences for millions of inhabitants of Nepal and northern India, intense efforts currently focus on improving our understanding of past earthquake activity and on complementing the historical data on Himalayan earthquakes. Here we report a new record based on earthquake-triggered turbidites in lakes. We use lake sediment records from Lake Rara, western Nepal, to reconstruct the occurrence of seismic events. The sediment cores were studied using a multi-proxy approach combining radiocarbon and 210Pb chronologies, physical properties (X-ray computerized axial tomography scan, Geotek multi-sensor core logger), high-resolution grain size, inorganic geochemistry (major elements by ITRAX XRF core scanning) and bulk organic geochemistry (C, N concentrations and stable isotopes). We identified several sequences of dense and layered fine sand mainly composed of mica, which we interpret as earthquake-triggered turbidites. Our results suggest the presence of a synchronous event between the two lake sites, correlated with the well-known 1505 AD earthquake. In addition, our sediment records reveal five earthquake-triggered turbidites younger than the 1505 AD event. By comparison with historical archives, we relate one of these to the 1833 AD MFT rupture. The others may reflect successive ruptures of the Western Nepal Fault System. Our study sheds light on events that have not been recorded in historical chronicles. These five MMI > 7 earthquakes permit addressing the problem of missing slip on the MFT in western Nepal and reevaluating the risk of a large earthquake affecting western Nepal and northern India.

12. The Electronic Encyclopedia of Earthquakes

NASA Astrophysics Data System (ADS)

Benthien, M.; Marquis, J.; Jordan, T.

2003-12-01

The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

13. Earthquake Potential Models for China

NASA Astrophysics Data System (ADS)

Rong, Y.; Jackson, D. D.

2002-12-01

We present three earthquake potential estimates for magnitude 5.4 and larger earthquakes for China. The potential is expressed as the rate density (probability per unit area, magnitude, and time). The three methods employ smoothed-seismicity, geologic slip-rate, and geodetic strain-rate data. We tested all three estimates, and the published Global Seismic Hazard Assessment Project (GSHAP) model, against earthquake data. We constructed a special earthquake catalog which combines previous catalogs covering different times. We used the special catalog to construct our smoothed-seismicity model and to evaluate all models retrospectively. All our models employ a modified Gutenberg-Richter magnitude distribution with three parameters: a multiplicative "a-value," the slope or "b-value," and a "corner magnitude" marking a strong decrease of earthquake rate with magnitude. We assumed the b-value to be constant for the whole study area and estimated the other parameters from regional or local geophysical data. The smoothed-seismicity method assumes that the rate density is proportional to the magnitude of past earthquakes and falls off approximately as the reciprocal of the epicentral distance out to a few hundred kilometers. We derived the upper magnitude limit from the special catalog and estimated local a-values from smoothed seismicity. Earthquakes since January 1, 2000 are quite compatible with the model. For the geologic forecast we adopted the seismic source zones (based on geological, geodetic, and seismicity data) of the GSHAP model. For each zone, we estimated a corner magnitude by applying the Wells and Coppersmith [1994] relationship to the longest fault in the zone, and we determined the a-value from fault slip rates and an assumed locking depth. The geological model fits the earthquake data better than the GSHAP model. We also applied the Wells and Coppersmith relationship to individual faults, but the results conflicted with the earthquake record. For our geodetic
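The smoothed-seismicity assumption, that each past earthquake contributes in proportion to its size and roughly as the reciprocal of epicentral distance out to a few hundred kilometers, can be sketched as follows; the 10**magnitude weighting and the distance cutoffs are illustrative choices, not the authors' calibration:

```python
import math

def smoothed_rate(x, y, catalog, r_max_km=300.0, r_min_km=1.0):
    """Unnormalized smoothed-seismicity rate density at (x, y), in km
    coordinates: each catalog event (ex, ey, mag) contributes a
    magnitude-dependent weight divided by epicentral distance, cut off
    at r_max_km. Weights and cutoffs are illustrative."""
    density = 0.0
    for ex, ey, mag in catalog:
        r = max(math.hypot(x - ex, y - ey), r_min_km)  # avoid divide-by-zero
        if r <= r_max_km:
            density += 10 ** mag / r
    return density
```

In a real forecast the result would be normalized so the integrated rate matches the catalog's observed event rate above the completeness magnitude.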

14. Accounting for orphaned aftershocks in the earthquake background rate

NASA Astrophysics Data System (ADS)

van der Elst, Nicholas J.

2017-11-01

Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
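The heavy tail of Omori's law is what makes orphaned aftershocks plausible: integrating a rate that decays as (t + c)^-p shows that a sizeable fraction of a sequence falls after any finite observation window. A sketch with illustrative parameters (the values of c, p, and the 100-year horizon are not from the paper):

```python
def omori_fraction_after(t_obs, c=0.01, p=1.1, t_max=36500.0):
    """Fraction of a sequence's aftershocks, with modified-Omori rate
    proportional to (t + c)^-p (p > 1), expected to occur after t_obs
    days, counting out to a t_max-day horizon. Times in days;
    parameters are illustrative."""
    def cdf(t):
        # integral of (s + c)^(-p) from 0 to t (unnormalized)
        return (c ** (1 - p) - (t + c) ** (1 - p)) / (p - 1)
    return 1.0 - cdf(t_obs) / cdf(t_max)
```

With these parameters, well over a tenth of a sequence arrives after the first year, which is exactly the portion most easily mistaken for background seismicity.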

16. SLAMMER: Seismic LAndslide Movement Modeled using Earthquake Records

Jibson, Randall W.; Rathje, Ellen M.; Jibson, Matthew W.; Lee, Yong W.

2013-01-01

This program is designed to facilitate conducting sliding-block analysis (also called permanent-deformation analysis) of slopes in order to estimate slope behavior during earthquakes. The program allows selection from among more than 2,100 strong-motion records from 28 earthquakes and allows users to add their own records to the collection. Any number of earthquake records can be selected using a search interface that selects records based on desired properties. Sliding-block analyses, using any combination of rigid-block (Newmark), decoupled, and fully coupled methods, are then conducted on the selected group of records, and results are compiled in both graphical and tabular form. Simplified methods for conducting each type of analysis are also included.
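The rigid-block (Newmark) method that SLAMMER offers integrates ground acceleration in excess of a critical (yield) acceleration to obtain permanent displacement. A deliberately simplified one-directional sketch, not SLAMMER's actual implementation:

```python
def newmark_displacement(accel_g, dt, ac_g):
    """Rigid-block (Newmark) sliding-block analysis: the block begins
    to slide when ground acceleration exceeds the critical acceleration
    ac_g, and its relative velocity is integrated until it returns to
    zero. accel_g is the record in g, dt the sample interval in
    seconds; returns cumulative downslope displacement in metres.
    One-directional sliding only, a common simplification."""
    g = 9.81
    v = d = 0.0
    for a in accel_g:
        # relative acceleration applies only while sliding or when
        # the record exceeds the yield level
        rel = (a - ac_g) * g if v > 0 or a > ac_g else 0.0
        v = max(v + rel * dt, 0.0)   # block cannot slide upslope here
        d += v * dt
    return d
```

Decoupled and fully coupled analyses replace the rigid-block assumption with a deformable sliding mass, which SLAMMER also supports.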

17. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

PubMed

Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

2012-01-01

Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a regular basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

18. Volunteers in the earthquake hazard reduction program

Ward, P.L.

1978-01-01

With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.

19. Computer simulation of earthquakes

NASA Technical Reports Server (NTRS)

Cohen, S. C.

1976-01-01

Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate. The blocks slide on a friction surface. The first model employed elastic forces and time-independent friction to simulate main-shock events. The size and length of events and the time and place of their occurrence were influenced strongly by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically recurring similar events were frequently observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time, and the aftershock region was confined to that which moved in the main event.
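The first model's behavior, periodic and similar events when parameters are homogeneous, can be illustrated with even a single driven block; the spring stiffness, plate velocity, and friction levels below are arbitrary:

```python
def spring_block(steps, k=1.0, v_plate=0.01, mu_static=1.0, mu_dynamic=0.6, dt=1.0):
    """Single block pulled through a spring by a plate moving at
    v_plate. Spring force builds until it exceeds the static friction
    level, then the block slips until the force drops to the dynamic
    level. Returns the list of slip sizes (a crude event catalog).
    All parameter values are illustrative."""
    x_plate = x_block = 0.0
    events = []
    for _ in range(steps):
        x_plate += v_plate * dt
        force = k * (x_plate - x_block)
        if force >= mu_static:
            slip = (force - mu_dynamic) / k  # slip until force = dynamic level
            x_block += slip
            events.append(slip)
    return events
```

With homogeneous parameters the events are near-identical and periodic, matching the paper's observation; heterogeneity in friction along a multi-block chain is what produces irregular events and seismic gaps.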

20. Reconsidering earthquake scaling

Gomberg, Joan S.; Wech, Aaron G.; Creager, Kenneth; Obara, K.; Agnew, Duncan

2016-01-01

The relationship (scaling) between scalar moment, M0, and duration, T, potentially provides key constraints on the physics governing fault slip. The prevailing interpretation of M0-T observations proposes different scaling for fast (earthquakes) and slow (mostly aseismic) slip populations and thus fundamentally different driving mechanisms. We show that a single model of slip events within bounded slip zones may explain nearly all fast and slow slip M0-T observations, and both slip populations have a change in scaling, where the slip area growth changes from 2-D when too small to sense the boundaries to 1-D when large enough to be bounded. We present new fast and slow slip M0-T observations that sample the change in scaling in each population, which are consistent with our interpretation. We suggest that a continuous but bimodal distribution of slip modes exists and M0-T observations alone may not imply a fundamental difference between fast and slow slip.
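The change in scaling described above can be written schematically: while rupture grows in two dimensions, slip scales with rupture length, giving cubic moment-duration scaling; once growth is bounded to one dimension, slip saturates and moment grows linearly with duration. This is a schematic reading of the abstract (with rupture speed v and bounding width W as illustrative symbols), not the authors' equations:

```latex
% 2-D growth: area A \sim (vT)^2, slip D \sim vT
% => M_0 = \mu D A \propto T^3
M_0 \propto T^{3} \qquad \text{(small events, unbounded 2-D growth)}
% 1-D growth: width fixed at W, A \sim W\,vT, slip D \sim W
% => M_0 = \mu D A \propto T
M_0 \propto T \qquad \text{(large events, bounded 1-D growth)}
```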

1. Regional Earthquake Shaking and Loss Estimation

NASA Astrophysics Data System (ADS)

Sesetyan, K.; Demircioglu, M. B.; Zulfikar, C.; Durukal, E.; Erdik, M.

2009-04-01

This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR, and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic database and basic source parameters, and, if and when possible, by estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimation of the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and using shear-wave velocity distributions (Shake Mapping). 4. Incorporation of strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimation of the losses (damage, casualty, and economic) at different levels of sophistication (0, 1, and 2) commensurate with the availability of an inventory of the human-built environment (Loss Mapping). Both Level 0 (similar to the PAGER system of USGS) and Level 1 analyses of the ELER routine are based on obtaining intensity distributions analytically and estimating the total number of casualties and their geographic distribution, either using regionally adjusted intensity-casualty or magnitude-casualty correlations (Level 0) or using regional building inventory databases (Level 1). Level 0 analysis is similar to the PAGER system being developed by USGS. For given

2. New insights into earthquake precursors from InSAR.

PubMed

Moro, Marco; Saroli, Michele; Stramondo, Salvatore; Bignami, Christian; Albano, Matteo; Falcucci, Emanuela; Gori, Stefano; Doglioni, Carlo; Polcari, Marco; Tallini, Marco; Macerola, Luca; Novali, Fabrizio; Costantini, Mario; Malvarosa, Fabio; Wegmüller, Urs

2017-09-20

We measured ground displacements before and after the 2009 L'Aquila earthquake using multi-temporal InSAR techniques to identify seismic precursor signals. We estimated the ground deformation and its temporal evolution by exploiting a large dataset of SAR imagery that spans seventy-two months before and sixteen months after the mainshock. These satellite data show that up to 15 mm of subsidence occurred beginning three years before the mainshock. This deformation occurred within two Quaternary basins that are located close to the epicentral area and are filled with sediments hosting multi-layer aquifers. After the earthquake, the same basins experienced up to 12 mm of uplift over approximately nine months. Before the earthquake, the rocks at depth dilated, and fractures opened. Consequently, fluids migrated into the dilated volume, thereby lowering the groundwater table in the carbonate hydrostructures and in the hydrologically connected multi-layer aquifers within the basins. This process caused the elastic consolidation of the fine-grained sediments within the basins, resulting in the detected subsidence. After the earthquake, the fractures closed, and the deep fluids were squeezed out. The pre-seismic ground displacements were then recovered because the groundwater table rose and natural recharge of the shallow multi-layer aquifers occurred, which caused the observed uplift.

3. Initiatives to Reduce Earthquake Risk of Developing Countries

NASA Astrophysics Data System (ADS)

Tucker, B. E.

2008-12-01

The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. One would oversee the design and construction of

4. Laboratory generated M -6 earthquakes

McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

2014-01-01

We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separate this study from traditional acoustic emission analyses and allow these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
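For orientation, the scale consistency claimed above can be checked with standard formulas: the Hanks–Kanamori moment-magnitude relation and the circular-crack static stress drop. The ~5 mm source radius below is an assumed illustrative value (the abstract says only "mm-scale"), not the authors' measurement.

```python
def moment_from_magnitude(mw):
    """Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1), M0 in N*m."""
    return 10 ** (1.5 * mw + 9.1)

def circular_crack_stress_drop(m0, radius_m):
    """Static stress drop (Pa) for a circular crack: d_sigma = 7*M0 / (16*r^3)."""
    return 7.0 * m0 / (16.0 * radius_m ** 3)

# An M -6 event has a seismic moment of only ~1.3 N*m; on an assumed
# ~5 mm radius patch that yields a stress drop of a few MPa, inside the
# 1-10 MPa range reported in the abstract.
m0 = moment_from_magnitude(-6.0)
d_sigma = circular_crack_stress_drop(m0, 0.005)
```

This is the sense in which "entirely consistent with earthquake scaling laws" can be read: the same moment-radius-stress-drop relations that describe natural events also describe the laboratory population.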

5. Security Implications of Induced Earthquakes

NASA Astrophysics Data System (ADS)

Jha, B.; Rao, A.

2016-12-01

The increase in earthquakes induced or triggered by human activities motivates us to research how a malicious entity could weaponize earthquakes to cause damage. Specifically, we explore the feasibility of controlling the location, timing and magnitude of an earthquake by activating a fault via injection and production of fluids into the subsurface. Here, we investigate the relationship of the magnitude and trigger time of an induced earthquake to the well-to-fault distance. The relationship between magnitude and distance is important for determining the farthest striking distance from which one could intentionally activate a fault to cause a certain level of damage. We use our novel computational framework to model the coupled multi-physics processes of fluid flow and fault poromechanics. We use synthetic models representative of the New Madrid Seismic Zone and the San Andreas Fault Zone to assess the risk in the continental US. We fix the injection and production flow rates of the wells and vary their locations. We simulate injection-induced Coulomb destabilization of faults and the evolution of fault slip under quasi-static deformation. We find that the effect of distance on the magnitude and trigger time is monotonic, nonlinear, and time-dependent. Evolution of the maximum Coulomb stress on the fault provides insights into the effect of distance on rupture nucleation and propagation. The damage potential of induced earthquakes can be maintained even at longer distances because of the balance between pressure diffusion and poroelastic stress transfer mechanisms. We conclude that computational modeling of induced earthquakes allows us to assess the feasibility of weaponizing earthquakes and to develop effective defense mechanisms against such attacks.
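The two mechanisms named above, Coulomb destabilization and pressure diffusion, can be sketched in a few lines. This is a textbook-level simplification under assumed parameter values (friction coefficient, hydraulic diffusivity), not the paper's coupled poromechanical model.

```python
def coulomb_stress_change(d_tau, d_sigma_n, d_pore_pressure, mu=0.6):
    """Change in Coulomb failure stress (Pa), with normal stress positive in
    tension. Injected fluid raises pore pressure, reduces effective normal
    stress, and pushes the fault toward failure. mu is an assumed friction
    coefficient."""
    return d_tau + mu * (d_sigma_n + d_pore_pressure)

def diffusion_trigger_time(distance_m, diffusivity_m2_s=0.1):
    """Characteristic time t ~ r^2 / (4*D) for an injection pressure front to
    reach a fault at the given distance: a crude proxy for how trigger time
    grows nonlinearly with well-to-fault distance."""
    return distance_m ** 2 / (4.0 * diffusivity_m2_s)
```

With the assumed diffusivity of 0.1 m²/s, a fault 1 km from the well sees the pressure front only after roughly 2.5 million seconds (about a month), which illustrates why trigger time is so sensitive to well-to-fault distance.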

6. Earthquake source properties from pseudotachylite

Beeler, Nicholas M.; Di Toro, Giulio; Nielsen, Stefan

2016-01-01

The motions radiated from an earthquake contain information that can be interpreted as displacements within the source and therefore related to stress drop. Except in a few notable cases, the source displacements can neither be easily related to the absolute stress level or fault strength nor attributed to a particular physical mechanism. In contrast, paleo-earthquakes recorded by exhumed pseudotachylite have a known dynamic mechanism whose properties constrain the co-seismic fault strength. Pseudotachylite can also be used to directly address a longstanding discrepancy between seismologically measured static stress drops, which are typically a few MPa, and much larger dynamic stress drops expected from thermal weakening during localized slip at seismic speeds in crystalline rock [Sibson, 1973; McKenzie and Brune, 1969; Lachenbruch, 1980; Mase and Smith, 1986; Rice, 2006], as have been observed recently in laboratory experiments at high slip rates [Di Toro et al., 2006a]. This note places pseudotachylite-derived estimates of fault strength and inferred stress levels within the context and broader bounds of naturally observed earthquake source parameters: apparent stress, stress drop, and overshoot, including consideration of roughness of the fault surface, off-fault damage, fracture energy, and the 'strength excess'. The analysis, which assumes stress drop is related to corner frequency by the Madariaga [1976] source model, is restricted to the intermediate sized earthquakes of the Gole Larghe fault zone in the Italian Alps where the dynamic shear strength is well-constrained by field and laboratory measurements. We find that radiated energy exceeds the shear-generated heat and that the maximum strength excess is ~16 MPa. More generally these events have inferred earthquake source parameters that are rare; for instance, only a few percent of the global earthquake population has stress drops as large, unless fracture energy is routinely greater than existing models allow
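The assumed link between corner frequency and stress drop via the Madariaga [1976] model can be written compactly. The shear-wave speed and the example numbers here are illustrative assumptions, not values from the Gole Larghe analysis.

```python
def madariaga_source_radius(corner_freq_hz, beta_m_s=3500.0, k=0.21):
    """Source radius from the S-wave corner frequency, r = k * beta / fc,
    with k = 0.21 for the Madariaga (1976) circular-rupture model and an
    assumed crustal shear-wave speed beta."""
    return k * beta_m_s / corner_freq_hz

def stress_drop(m0_nm, corner_freq_hz, beta_m_s=3500.0):
    """Static stress drop (Pa) for a circular source: 7*M0 / (16*r^3)."""
    r = madariaga_source_radius(corner_freq_hz, beta_m_s)
    return 7.0 * m0_nm / (16.0 * r ** 3)

# Example: M0 ~ 1.26e12 N*m (roughly Mw 2) with a 10 Hz corner frequency
# gives a stress drop near 1.4 MPa, in the "few MPa" seismological range
# the abstract contrasts with much larger dynamic stress drops.
example = stress_drop(1.26e12, 10.0)
```

Because the radius enters cubed, modest uncertainty in the corner frequency (or in the choice of k) maps into large uncertainty in stress drop, which is part of why the static/dynamic discrepancy is hard to settle seismologically.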

7. Development of Earthquake Emergency Response Plan for Tribhuvan International Airport, Kathmandu, Nepal

DTIC Science & Technology

2013-02-01

Kathmandu, Nepal. Contract number: W911NF-12-1-0282. ...In the past, big earthquakes in Nepal (see Figure 1.1) have caused a huge number of casualties and damage to structures. The Great Nepal-Bihar... UBC Earthquake Engineering Research Facility, 2235 East Mall, Vancouver, BC, Canada V6T 1Z4. Phone: 604 822-6203. Fax: 604 822-6901. E-mail

8. Hupa Numbers.

ERIC Educational Resources Information Center

Bennett, Ruth, Ed.; And Others

An introduction to the Hupa number system is provided in this workbook, one in a series of numerous materials developed to promote the use of the Hupa language. The book is written in English with Hupa terms used only for the names of numbers. The opening pages present the numbers from 1-10, giving the numeral, the Hupa word, the English word, and…

9. Nurse willingness to report for work in the event of an earthquake in Israel.

PubMed

Ben Natan, Merav; Nigel, Simon; Yevdayev, Innush; Qadan, Mohamad; Dudkiewicz, Mickey

2014-10-01

To examine variables affecting nurse willingness to report for work in the event of an earthquake in Israel and whether this can be predicted through the Theory of Self-Efficacy. The nursing profession has a major role in preparing for earthquakes. Nurse willingness to report to work in the event of an earthquake has never before been examined. Self-administered questionnaires were distributed among a convenience sample of 400 nurses and nursing students in Israel during January-April 2012. High willingness to report to work in the event of an earthquake was declared by 57% of respondents. High perceived self-efficacy, level of knowledge and experience predict willingness to report to work in the event of an earthquake. Multidisciplinary collaboration and support were also cited as meaningful factors. Perceived self-efficacy, level of knowledge, experience and the support of a multidisciplinary staff affect nurse willingness to report to work in the event of an earthquake. Nurse managers can identify factors that increase nurse willingness to report to work in the event of an earthquake and consequently develop strategies for more efficient management of their nursing workforce. © 2013 John Wiley & Sons Ltd.

10. Assessing Earthquake-Induced Tree Mortality in Temperate Forest Ecosystems: A Case Study from Wenchuan, China

DOE PAGES

Zeng, Hongcheng; Lu, Tao; Jenkins, Hillary; ...

2016-03-17

Earthquakes can produce significant tree mortality and consequently affect regional carbon dynamics. Unfortunately, detailed studies quantifying the influence of earthquakes on forest mortality are currently rare. The committed forest biomass carbon loss associated with the 2008 Wenchuan earthquake in China is assessed in this study by a synthetic approach that integrates field investigation, remote sensing analysis, empirical models and Monte Carlo simulation. The newly developed approach significantly improves forest disturbance evaluation by quantitatively defining the earthquake impact boundary and using a detailed field survey to validate the mortality models. Based on our approach, a total biomass carbon of 10.9 Tg C was lost in the Wenchuan earthquake, which offset 0.23% of the living biomass carbon stock in Chinese forests. Tree mortality was highly clustered at the epicenter and declined rapidly with distance from the fault zone. We suggest that earthquakes represent a significant driver of forest carbon dynamics, and that earthquake-induced biomass carbon loss should be included in estimates of forest carbon budgets.
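The Monte Carlo step, propagating uncertainty in biomass density and mortality fraction into a carbon-loss estimate, can be sketched as follows. The distributions and numbers are illustrative assumptions, not the study's calibrated inputs.

```python
import random

def mc_carbon_loss(area_ha, biomass_t_ha, biomass_sd, mort_rate, mort_sd,
                   carbon_fraction=0.5, n=20000, seed=42):
    """Sample biomass density (t/ha) and mortality fraction, and return the
    mean and 5th/95th percentile committed carbon loss in tonnes C.
    Gaussian inputs and a 0.5 carbon fraction are assumed for illustration."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        b = max(rng.gauss(biomass_t_ha, biomass_sd), 0.0)      # no negative biomass
        m = min(max(rng.gauss(mort_rate, mort_sd), 0.0), 1.0)  # clamp to [0, 1]
        losses.append(area_ha * b * m * carbon_fraction)
    losses.sort()
    return sum(losses) / n, losses[int(0.05 * n)], losses[int(0.95 * n)]
```

Reporting the percentile interval alongside the mean is what lets a committed-loss figure like 10.9 Tg C carry an explicit uncertainty range.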

12. The music of earthquakes and Earthquake Quartet #1

Michael, Andrew J.

2013-01-01

Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

13. Site Response for Micro-Zonation from Small Earthquakes

NASA Astrophysics Data System (ADS)

Gospe, T. B.; Hutchings, L.; Liou, I. Y. W.; Jarpe, S.

2017-12-01

H/V spectral ratios of noise do not provide accurate site response estimates either. Vs30 provides only a single amplification number and does not account for the variable three-dimensional structure beneath sites. We conclude that absolute site response obtained directly from earthquakes is the best, and possibly the only, way to get accurate site response estimates.
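Site response from earthquake recordings is commonly estimated as a spectral ratio between a soil-site record and a nearby reference rock-site record of the same event. This is a minimal sketch of that standard technique, not the authors' specific processing chain.

```python
import numpy as np

def spectral_ratio(soil, rock, dt):
    """Amplitude spectral ratio soil/rock for two equal-length time series
    sampled at interval dt (s). A small water level guards against division
    by a near-zero reference spectrum. Returns (frequencies_hz, ratio)."""
    soil_amp = np.abs(np.fft.rfft(soil))
    rock_amp = np.abs(np.fft.rfft(rock))
    eps = 1e-6 * rock_amp.max()
    freqs = np.fft.rfftfreq(len(soil), d=dt)
    return freqs, soil_amp / (rock_amp + eps)
```

In practice one would also window the S-wave, smooth the spectra, and average over many events; the ratio at each frequency is the empirical amplification of the soil site relative to rock, which is exactly the frequency-dependent information a single Vs30 number cannot capture.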

14. Earthquake Early Warning and Public Policy: Opportunities and Challenges

NASA Astrophysics Data System (ADS)

Goltz, J. D.; Bourque, L.; Tierney, K.; Riopelle, D.; Shoaf, K.; Seligson, H.; Flores, P.

2003-12-01

Development of an earthquake early warning capability and pilot project were objectives of TriNet, a 5-year (1997-2001) FEMA-funded project to develop a state-of-the-art digital seismic network in southern California. In parallel with research to assemble a protocol for rapid analysis of earthquake data and transmission of a signal by TriNet scientists and engineers, the public policy, communication and educational issues inherent in implementation of an earthquake early warning system were addressed by TriNet's outreach component. These studies included: 1) a survey that identified potential users of an earthquake early warning system and how an earthquake early warning might be used in responding to an event, 2) a review of warning systems and communication issues associated with other natural hazards and how lessons learned might be applied to an alerting system for earthquakes, 3) an analysis of organization, management and public policy issues that must be addressed if a broad-based warning system is to be developed and 4) a plan to provide earthquake early warnings to a small number of organizations in southern California as an experimental prototype. These studies provided needed insights into the social and cultural environment in which this new technology will be introduced, an environment with opportunities to enhance our response capabilities but also an environment with significant barriers to overcome to achieve a system that can be sustained and supported. In this presentation we will address the main public policy issues that were subjects of analysis in these studies. They include a discussion of the possible division of functions among organizations likely to be the principal partners in the management of an earthquake early warning system. Drawing on lessons learned from warning systems for other hazards, we will review the potential impacts of false alarms and missed events on warning system credibility, the acceptability of fully automated

15. Earthquake precursors from InSAR geodesy: insights from the L'Aquila (Central Italy) April 6, 2009 earthquake

NASA Astrophysics Data System (ADS)

Bignami, C.; Moro, M.; Saroli, M.; Stramondo, S.; Albano, M.; Falcucci, E.; Gori, S.; Doglioni, C.; Polcari, M.; Tallini, M.; Macerola, L.; Novali, F.; Costantini, M.; Malvarosa, F.; Wegmüller, U.

2017-12-01

In modern seismology, the identification of earthquake precursors is one of the most important issues to investigate. Precursor indicators based on the most advanced satellite geodetic techniques, such as GPS and SAR interferometry, have not been conclusively identified so far. However, the latest progress in terms of new satellite missions and processing algorithms may bring this goal closer. Here we present evidence of ground deformation signals preceding the 2009 L'Aquila earthquake, observed using multi-temporal InSAR techniques. We exploited a wide dataset from the RADARSAT2, ENVISAT and COSMO-SkyMed missions to derive mean velocity and ground acceleration maps of the epicentral area, for a time span of approximately 6 years before the earthquake and about one year after it. The maps of ground acceleration before the mainshock allowed the identification of two peculiar displacement patterns, well localized in two Quaternary basins close to the focal volume of the seismic event (Mw 6.3) that hit the city of L'Aquila on 6 April 2009. In these two regions, significant subsidence began approximately three years before the earthquake, reaching a value of about 1.5 cm, and persisted until the earthquake. Conversely, in the post-seismic phase, the two basins showed an uplift, with velocities of approximately 5 to 18 mm/yr. Detailed knowledge of the geological, hydrogeological and geotechnical setting of the area provides a plausible explanation of the observed phenomenon. The two Quaternary basins are filled with sediments that host multi-layer aquifers that are hydrologically connected with the neighbouring carbonate hydrostructures. Before the earthquake, the rocks at depth dilated and fractures opened. Consequently, fluids migrated into the dilated volume, lowering the groundwater table in the carbonate hydrostructures and in the hydrologically connected multi-layer aquifers within the

16. Earthquake and Tsunami: a movie and a book for seismic and tsunami risk reduction in Italy.

NASA Astrophysics Data System (ADS)

Nostro, C.; Baroux, E.; Maramai, A.; Graziani, L.; Tertulliani, A.; Castellano, C.; Arcoraci, L.; Casale, P.; Ciaccio, M. G.; Frepoli, A.

2009-04-01

Italy is a country well known for its seismic and volcanic hazard. However, a similarly great hazard, although not as well recognized, is posed by the occurrence of tsunami waves along the Italian coastline. This is testified by a rich catalogue and by field evidence of deposits left by prehistoric and historical tsunamis, even in places today considered safe. This observation is of great importance, since many of the areas affected by tsunamis in the past are today touristic places. Italian tsunamis can be caused by different sources: 1- off-shore or near-coast in-land earthquakes; 2- very large earthquakes on distant sources in the Mediterranean; 3- submarine volcanic explosions in the Tyrrhenian Sea; 4- submarine landslides triggered by earthquakes and volcanic activity. The consequence of such a wide spectrum of sources is that an important part of the more than 7000 km long Italian coastline is exposed to tsunami risk, and thousands of inhabitants (with numbers increasing during summer) live near hazardous coasts. The main historical tsunamis are the 1783 and 1908 events that hit the Calabrian and Sicilian coasts. The most recent tsunami was caused by the 2002 Stromboli landslide. In order to reduce this risk, and following the emotional impact of the December 2004 Sumatra earthquake and tsunami, we developed an outreach program consisting of talks given by scientists and of a movie and a book, both exploring the causes of tsunami waves, how they propagate in deep and shallow waters, and what their effects on the coasts are. Hints are also given on the most dangerous Italian coasts (as deduced from scientific studies) and on how to behave in case of a tsunami approaching the coast. These seminars are open to the general public, but special programs are developed with schools of all grades. In this talk we present the book and the movie used during the seminars and scientific expositions, realized from a previous 3D version originally

17. Toward real-time regional earthquake simulation of Taiwan earthquakes

NASA Astrophysics Data System (ADS)

Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

2013-12-01

We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

18. Earthquake and Tsunami booklet based on two Indonesia earthquakes

NASA Astrophysics Data System (ADS)

Hayashi, Y.; Aci, M.

2014-12-01

Many destructive earthquakes occurred during the last decade in Indonesia. These experiences are very important precepts for people around the world who live in earthquake and tsunami countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and to make clear the characteristic physical behaviors of tsunamis near the coast. We researched two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow earthquake tsunami. Many videos and photographs were taken by people in the 2004 Indian Ocean tsunami disaster; nevertheless, these covered only a few restricted points, and the tsunami behavior in other places remained unknown. In this study, we tried to collect extensive information about tsunami behavior, not only in many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photos. To collect detailed information about evacuation from tsunamis, we devised an interview method that involves making pictures of the tsunami experience from the scenes of the victims' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena: because there had been no big earthquakes with tsunamis for one hundred years in the Sumatra region, the public had no knowledge of tsunamis. This situation was much improved in the 2010 Mentawai case. TV programs and NGO or governmental public education programs about tsunami evacuation are now widespread in Indonesia, and many people have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on the victims' stories and painted impressive scenes of the two events. We used the drill book in a disaster education event in a school committee of West Java. About 80% of students and teachers evaluated the contents of the drill book as useful for correct understanding.

19. The Parkfield earthquake prediction of October 1992; the emergency services response

Andrews, R.

1992-01-01

The science of earthquake prediction is interesting and worthy of support. In many respects the ultimate payoff of earthquake prediction or earthquake forecasting is how the information can be used to enhance public safety and public preparedness. This is a particularly important issue here in California where we have such a high level of seismic risk historically, and currently, as a consequence of activity in 1989 in the San Francisco Bay Area, in Humboldt County in April of this year (1992), and in southern California in the Landers-Big Bear area in late June of this year (1992). We are currently very concerned about the possibility of a major earthquake, one or more, happening close to one of our metropolitan areas. Within that context, the Parkfield experiment becomes very important.

20. Earthquake Archaeology: a logical approach?

NASA Astrophysics Data System (ADS)

Stewart, I. S.; Buck, V. A.

2001-12-01

Ancient earthquakes can leave their mark in the mythical and literary accounts of ancient peoples, the stratigraphy of their site histories, and the structural integrity of their constructions. Within this broad cross-disciplinary tramping ground, earthquake geologists have tended to focus on those aspects of the cultural record that are most familiar to them: the physical effects of seismic deformation on ancient constructions. One of the core difficulties with this 'earthquake archaeology' approach is that recent attempts to isolate structural criteria that are diagnostic or strongly suggestive of a seismic origin are undermined by the recognition that signs of ancient seismicity are generally indistinguishable from non-seismic mechanisms (poor construction, adverse geotechnical conditions). We illustrate the difficulties and inconsistencies in current proposed 'earthquake diagnostic' schemes by reference to two case studies of archaeoseismic damage in central Greece. The first concerns fallen columns at various Classical temple localities in mainland Greece (Nemea, Sounio, Olympia, Bassai) which, on the basis of observed structural criteria, are earthquake-induced but which are alternatively explained by archaeologists as the action of human disturbance. The second re-examines the almost type example of the Kyparissi site in the Atalanti region as a Classical stoa offset across a seismic surface fault, arguing instead for its deformation by ground instability. Finally, in highlighting the inherent ambiguity of archaeoseismic data, we consider the value of a logic-tree approach for qualifying and quantifying our uncertainties for seismic-hazard analysis.

1. Earthquake probabilities in the San Francisco Bay Region: 2000 to 2030 - a summary of findings

,

1999-01-01

region—an innovation over previous studies of the SFBR that considered only a small number of potential earthquakes of fixed magnitude.

2. Evidence for large prehistoric earthquakes in the northern New Madrid Seismic Zone, central United States

Li, Y.; Schweig, E.S.; Tuttle, M.P.; Ellis, M.A.

1998-01-01

We surveyed the area north of New Madrid, Missouri, for prehistoric liquefaction deposits and uncovered two new sites with evidence of pre-1811 earthquakes. At one site, located about 20 km northeast of New Madrid, Missouri, radiocarbon dating indicates that an upper sand blow was probably deposited after A.D. 1510 and a lower sand blow was deposited prior to A.D. 1040. A sand blow at another site about 45 km northeast of New Madrid, Missouri, is dated as likely being deposited between A.D. 55 and A.D. 1620 and represents the northernmost recognized expression of prehistoric liquefaction likely related to the New Madrid seismic zone. This study, taken together with other data, supports the occurrence of at least two earthquakes strong enough to induce liquefaction or faulting before A.D. 1811 and after A.D. 400. One earthquake probably occurred around A.D. 900 and a second earthquake occurred around A.D. 1350. The data are not yet sufficient to estimate the magnitudes of the causative earthquakes for these liquefaction deposits, although we conclude that all of the earthquakes are at least moment magnitude M ~6.8, the size of the 1895 Charleston, Missouri, earthquake. A more rigorous estimate of the number and sizes of prehistoric earthquakes in the New Madrid seismic zone awaits evaluation of additional sites.

3. Medical experience of a university hospital in Turkey after the 1999 Marmara earthquake

PubMed Central

Bulut, M; Fedakar, R; Akkose, S; Akgoz, S; Ozguc, H; Tokyay, R

2005-01-01

Objectives: This study aimed to provide an overview of morbidity and mortality among patients admitted to the Hospital of the Medicine Faculty of Uludag University, Bursa, Turkey, after the 1999 Marmara earthquake. Methods: Retrospective analysis of the medical records of 645 earthquake victims. Patients' demographic data, diagnosis, dispositions, and prognosis were reviewed. Results: A total of 330 patients with earthquake related injuries and illness admitted to our hospital were included and divided into three main groups: crush syndrome (n = 110), vital organ injuries (n = 57), and non-traumatic but earthquake related illness (n = 55). Seventy seven per cent of patients were hospitalised during the first three days after the earthquake. The rate of mortality associated with the crush syndrome, vital organ injury, and non-traumatic medical problems was 21% (23/110), 17.5% (10/57), and 9% (5/55), respectively. The overall mortality rate was 8% (50/645). Conclusions: In the first 24–48 hours after a major earthquake, hospital emergency departments are flooded with large numbers of patients. Among this patient load, those patients with crush syndrome or vital organ injuries are particularly at risk. Proper triage and prompt treatment of these seriously injured earthquake victims may decrease morbidity and mortality. It is hoped that this review of the challenges met after the Marmara earthquake and the lessons learned will be of use to emergency department physicians as well as hospital emergency planners in preparing for future natural disasters. PMID:15983085

4. Unbonded Prestressed Columns for Earthquake Resistance

DOT National Transportation Integrated Search

2012-05-01

Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

5. Earthquakes in Oita triggered by the 2016 M7.3 Kumamoto earthquake

NASA Astrophysics Data System (ADS)

Yoshida, Shingo

2016-11-01

During the passage of the seismic waves from the M7.3 Kumamoto, Kyushu, earthquake on April 16, 2016, a M5.7 event [semiofficial value estimated by the Japan Meteorological Agency (JMA)] occurred in the central part of Oita prefecture, approximately 80 km from the mainshock. Although there have been a number of reports that M < 5 earthquakes were remotely triggered during the passage of seismic waves from mainshocks, there has been no evidence for M > 5 triggered events. In this paper, we first confirm that this event is a M6-class event by re-estimating the magnitude using the strong-motion records of K-NET and KiK-net and crustal deformation data at the Yufuin station observed by the Geospatial Information Authority of Japan. Next, by investigating the aftershocks of 45 mainshocks that occurred over the past 20 years in the JMA earthquake catalog (JMAEC), we found that the delay time of the 2016 M5.7 event in Oita was the shortest. The M5.7 event can therefore be regarded as an exceptional M > 5 event triggered by passing seismic waves, unlike the usual triggered events and aftershocks. Moreover, a search of the JMAEC shows that swarm earthquake activity in the 2016 Oita aftershock area was low over the past 30 years compared with neighboring areas. We also found that, in the past, probably or possibly triggered events frequently occurred in the 2016 Oita aftershock area. The Oita area readily responds to remote triggering because of its high geothermal activity and young volcanism. The M5.7 Oita event was triggered by passing seismic waves, probably because a large dynamic stress change was generated by the mainshock at short distance and because the Oita area was already loaded to a critical stress state without recent energy release, as suggested by the past low swarm activity.

6. Seven big strike-slip earthquakes

NASA Astrophysics Data System (ADS)

Lohman, R. B.; Simons, M.; Pritchard, M. E.

2003-12-01

We examine seven large (Mw > 7) strike-slip earthquakes that occurred since the beginning of the ERS-1 and ERS-2 missions. We invert GPS observations, InSAR interferograms, and azimuth offsets for coseismic slip distributions. We explore two refinements to the traditional least-squares inversion technique with roughness constraints. First, we diverge from the usual definition of "roughness" as the average roughness over the entire fault plane and allow "variable smoothing" constraints. Variable smoothing allows our inversion to select models that are more complex in regions that are well resolved by the data, while still damping regions that are poorly resolved. Second, we choose our smoothing parameters using the jR_i criterion, which draws on the theory behind cross-validation and the bootstrap method. We examine the theoretical basis behind such methods and use an analytical approximation technique for linear problems. We provide maps of model variance and spatial averaging scale over the fault plane to show explicitly which features in our slip models are robust. We examine the 1992 Landers (CA), 1995 Sakhalin (Russia), 1995 Kobe (Japan), 1997 Ardekul (Iran), 1997 Manyi (Tibet), 1999 Hector Mine (CA), and 2001 Kunlun (Tibet) earthquakes. We compare features of the slip distributions such as the depth distribution of slip, the inferred magnitude, and the degree of heterogeneity of slip over the fault plane, as resolved by the available InSAR and GPS data. We end with a brief description of the data coverage required for future earthquakes of similar size if we want to infer the above quantities to within a given confidence interval. We describe both the number of InSAR scenes and the distribution of GPS points that would be required, based on theoretical treatments of the fault plane/data point geometry using the jR_i method.
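The roughness-constrained least-squares inversion described in this record can be sketched as a standard Tikhonov-regularized problem: minimize ||Gm - d||^2 + beta^2 ||Lm||^2, where L is a roughness (first-difference) operator over fault patches. The matrices below are synthetic stand-ins, not the authors' Green's functions or data; "variable smoothing" would correspond to weighting the rows of L non-uniformly.

```python
import numpy as np

rng = np.random.default_rng(0)
n_data, n_patches = 40, 10
G = rng.normal(size=(n_data, n_patches))    # Green's functions (synthetic stand-in)
m_true = np.linspace(0.0, 2.0, n_patches)   # "true" slip on a line of fault patches
d = G @ m_true + 0.01 * rng.normal(size=n_data)

# First-difference roughness operator along the fault (shape: 9 x 10)
L = np.diff(np.eye(n_patches), axis=0)

def invert(beta):
    """Solve the augmented least-squares system for smoothing weight beta."""
    A = np.vstack([G, beta * L])
    b = np.concatenate([d, np.zeros(L.shape[0])])
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m

m_smooth = invert(beta=10.0)   # heavy damping: smoother model, worse data fit
m_rough = invert(beta=0.01)    # light damping: rougher model, better data fit
```

Increasing beta trades data misfit for lower model roughness; criteria like cross-validation (or the jR_i criterion the authors use) pick beta from that trade-off.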

7. Consequences of early childbearing.

PubMed

Monroy De Velasco, A

1982-12-01

By 2000, developing countries will have an estimated 1 billion adolescents who are physically old enough to reproduce but far too young to be responsible, healthy parents of healthy children. Governments must become involved in the issues surrounding adolescent pregnancy, and both custom and the laws must change to reflect the needs of young people. The consequences of early childbearing are felt by society as well as the families directly affected. The incidence of births to very young women, both married and unmarried, is growing; each year approximately 13 million children are born to young mothers. The percentage of live births to mothers under the age of 20 ranges from 20% in some African and Caribbean countries, to 10-15% in many Latin American countries, 5-10% in Asia, and 1% in Japan. Increased out-of-wedlock adolescent pregnancy is due to many factors: earlier sexual maturity from better childhood health and nutrition, a trend toward later marriage, increased opportunity for opposite-sex interaction in schools and in the labor force, and rapid urbanization, which weakens traditional family structures and social and cultural controls. Early childbirth is especially dangerous for adolescents and their infants. Compared to women between the ages of 20-35, pregnant women under 20 are at greater risk of death and disease, including bleeding during pregnancy, toxemia, hemorrhage, prolonged and difficult labor, severe anemia, and disability. Life-long social and economic disadvantages may be a consequence of teenage birth. Educational and career opportunities may be limited, as may be opportunities for marriage. Teen mothers tend to have larger completed family sizes and shorter birth intervals, resulting in poorer health status for the family and more severe poverty. The children also suffer; teen mothers have a higher incidence of low-birth-weight infants, which is associated with birth injuries, serious childhood illness, and mental and

8. St. Louis Area Earthquake Hazards Mapping Project

Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

2007-01-01

St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

9. Incubation of Chile's 1960 Earthquake

NASA Astrophysics Data System (ADS)

Atwater, B. F.; Cisternas, M.; Salgado, I.; Machuca, G.; Lagos, M.; Eipert, A.; Shishikura, M.

2003-12-01

Infrequent occurrence of giant events may help explain how the 1960 Chile earthquake attained M 9.5. Although old documents imply that this earthquake followed great earthquakes of 1575, 1737 and 1837, only three earthquakes of the past 1000 years produced geologic records like those for 1960. These earlier earthquakes include the 1575 event but not 1737 or 1837. Because the 1960 earthquake had nearly twice the seismic slip expected from plate convergence since 1837, much of the strain released in 1960 may have been accumulating since 1575. Geologic evidence for such incubation comes from new paleoseismic findings at the Río Maullín estuary, which indents the Pacific coast at 41.5° S midway along the 1960 rupture. The 1960 earthquake lowered the area by 1.5 m, and the ensuing tsunami spread sand across lowland soils. The subsidence killed forests and changed pastures into sandy tidal flats. Guided by these 1960 analogs, we inferred tsunami and earthquake history from sand sheets, tree rings, and old maps. At Chuyaquen, 10 km upriver from the sea, we studied sand sheets in 31 backhoe pits on a geologic transect 1 km long. Each sheet overlies the buried soil of a former marsh or meadow. The sand sheet from 1960 extends the entire length of the transect. Three earlier sheets can be correlated at least half that far. The oldest one, probably a tsunami deposit, surrounds herbaceous plants that date to AD 990-1160. Next comes a sandy tidal-flat deposit dated by stratigraphic position to about 1000-1500. The penultimate sheet is a tsunami deposit younger than twigs from 1410-1630. It probably represents the 1575 earthquake, whose accounts of shaking, tsunami, and landslides rival those of 1960. In that case, the record excludes the 1737 and 1837 events. The 1737 and 1837 events also appear missing in tree-ring evidence from islands of Misquihue, 30 km upriver from the sea. Here the subsidence in 1960 admitted brackish tidal water that defoliated tens of thousands of

10. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

NASA Astrophysics Data System (ADS)

Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

2008-12-01

Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a regularly discussed reality. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels, from individuals and families to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

11. New Field Observations About 19 August 1966 Varto earthquake, Eastern Turkey

NASA Astrophysics Data System (ADS)

Gurboga, S.

2013-12-01

Some destructive earthquakes of the past, and even of the recent past, retain several mysteries: the magnitude, epicenter location, faulting type, or source fault has not yet been determined. One such event is the 19 August 1966 Varto earthquake in Turkey. The 19 August 1966 Varto earthquake (Ms = 6.8) was an extraordinary event about 40 km east of the junction between the NAFS and EAFS, the two seismogenic systems and active structures shaping the tectonics of Turkey. This earthquake was sourced from the Varto fault zone, which is approximately 4 km wide and 43 km long. It consists of parallel to sub-parallel, closely spaced, north- and south-dipping faults with dips up to 85°-88°. Although this event, at magnitude 6.8 (Ms), was big enough to create a surface rupture, no clear surface deformation was detected, which makes the source fault and the mechanism of the earthquake controversial. According to Wallace (1968), the faulting is right-lateral. On the other hand, McKenzie (1972) proposed right-lateral movement with a thrust component based on the focal mechanism solution. Recent work by Sançar et al. (2011) claimed that the faulting is pure right-lateral strike-slip and that there was no surface rupture during the earthquake. Furthermore, they suggested that the Varto segment of the Varto Fault Zone most probably did not break in the 1966 earthquake. This study focuses on field geology and trenching surveys to investigate the 1966 Varto earthquake. Four fault segments have been mapped along the Varto fault zone: the Varto, Sazlica, Leylekdağ, and Çayçati segments. Because of the thick volcanic cover in the area around Varto, surface rupture could only be detected by trenching. Two trenching surveys were carried out along the Yayikli and Ağaçalti faults in the Varto fault zone. Consequently, detailed geological fieldwork and trenching surveys indicate that

12. Fault failure with moderate earthquakes

Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

1987-01-01

High-resolution strain and tilt recordings were made in the near field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987.

13. Statistical earthquake focal mechanism forecasts

NASA Astrophysics Data System (ADS)

Kagan, Yan Y.; Jackson, David D.

2014-04-01

Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each gridpoint. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.

14. Izmit, Turkey 1999 Earthquake Interferogram

2001-03-30

This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul. http://photojournal.jpl.nasa.gov/catalog/PIA00557
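The fringe arithmetic quoted above (28 mm of motion toward the satellite per color contour, about 70 mm horizontal) can be sketched with a back-of-envelope conversion. This assumes ERS C-band (wavelength ~56.6 mm, so one fringe is half a wavelength of line-of-sight change) and a nominal ~23° incidence angle with purely horizontal slip along the look direction; these parameter values are illustrative, not taken from the actual Izmit processing.

```python
import math

WAVELENGTH_MM = 56.6   # ERS C-band radar wavelength (assumed)
INCIDENCE_DEG = 23.0   # nominal ERS incidence angle (assumed)

def fringes_to_motion(n_fringes):
    """Convert a fringe count to line-of-sight and implied horizontal motion."""
    los_mm = n_fringes * WAVELENGTH_MM / 2.0   # one fringe = half a wavelength
    # For horizontal slip along the look direction: los = horizontal * sin(incidence)
    horizontal_mm = los_mm / math.sin(math.radians(INCIDENCE_DEG))
    return los_mm, horizontal_mm

los, horiz = fringes_to_motion(1)   # one fringe: ~28 mm LOS, roughly 70 mm horizontal
```

The ~2.5x ratio between horizontal and line-of-sight motion is why a modest number of fringes can correspond to meters of strike-slip offset.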

15. Izmit, Turkey 1999 Earthquake Interferogram

NASA Technical Reports Server (NTRS)

2001-01-01

This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul.

16. Modeling, Forecasting and Mitigating Extreme Earthquakes

NASA Astrophysics Data System (ADS)

Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

2012-12-01

Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

17. Earthquakes in the New Zealand Region.

ERIC Educational Resources Information Center

Wallace, Cleland

1995-01-01

Presents a thorough overview of earthquakes in New Zealand, discussing plate tectonics, seismic measurement, and historical occurrences. Includes 10 figures illustrating such aspects as earthquake distribution, intensity, and fissures in the continental crust. Tabular data includes a list of most destructive earthquakes and descriptive effects…

18. 13 CFR 120.174 - Earthquake hazards.

Code of Federal Regulations, 2013 CFR

2013-01-01

... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

19. 13 CFR 120.174 - Earthquake hazards.

Code of Federal Regulations, 2014 CFR

2014-01-01

... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

20. Understanding and responding to earthquake hazards

NASA Technical Reports Server (NTRS)

Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

2002-01-01

Advances in understanding of the earthquake cycle and in assessing earthquake hazards are topics of great importance. Dynamic earthquake hazard assessments resolved over a range of spatial and time scales will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

1. 13 CFR 120.174 - Earthquake hazards.

Code of Federal Regulations, 2012 CFR

2012-01-01

... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

2. 13 CFR 120.174 - Earthquake hazards.

Code of Federal Regulations, 2011 CFR

2011-01-01

... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

3. Impact of the 2016 Ecuador Earthquake on Zika Virus Cases

PubMed Central

Vasquez, Diego; Palacio, Ana; Nuñez, Jose; Briones, Wladimir; Beier, John C.; Tamariz, Leonardo

2017-01-01

Objectives. To evaluate the impact of the April 2016 7.8-magnitude earthquake in Ecuador on the incidence of Zika virus (ZIKV) cases. Methods. We used the national public health surveillance system for reportable transmissible conditions and included suspected and laboratory-confirmed ZIKV cases. We compared the number of cases before and after the earthquake in areas closer to and farther from the epicenter. Results. From January to July 2016, 2234 patients suspected of having ZIKV infection were reported in both affected and control areas. A total of 1110 patients had a reverse transcription-polymerase chain reaction assay, and 159 were positive for ZIKV. The cumulative incidence of ZIKV in the affected area was 11.1 per 100 000 after the earthquake. The odds ratio of having ZIKV infection in those living in the affected area was 8.0 (95% CI = 4.4, 14.6; P < .01) compared with the control area and adjusted for age, gender, province population, and number of government health care facilities. Conclusions. A spike in ZIKV cases occurred after the earthquake. Patients in the area closest to the epicenter had a delay in seeking care. PMID:28520489
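The odds-ratio comparison described above can be illustrated with a standard 2x2-table calculation and a Wald 95% confidence interval. The cell counts below are invented for the example (the study's OR of 8.0 was additionally adjusted for age, gender, province population, and facility counts via modeling, which a raw 2x2 table does not capture).

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Unadjusted OR with Wald 95% CI.
    a, b: ZIKV-positive / -negative in the affected area;
    c, d: ZIKV-positive / -negative in the control area."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts chosen so the unadjusted OR comes out to 8.0
or_, lo, hi = odds_ratio_ci(120, 400, 39, 1040)
```

An OR of 8.0 with a CI excluding 1.0 is what underlies the paper's conclusion that ZIKV infection was far more likely in the earthquake-affected area.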

4. Earth science: lasting earthquake legacy

Parsons, Thomas E.

2009-01-01

On 31 August 1886, a magnitude-7 shock struck Charleston, South Carolina; low-level activity continues there today. One view of seismic hazard is that large earthquakes will return to New Madrid and Charleston at intervals of about 500 years. With expected ground motions that would be stronger than average, that prospect produces estimates of earthquake hazard that rival those at the plate boundaries marked by the San Andreas fault and Cascadia subduction zone. The result is two large 'bull's-eyes' on the US National Seismic Hazard Maps — which, for example, influence regional building codes and perceptions of public safety.

5. Earthquakes triggered by fluid extraction

Segall, P.

1989-01-01

Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics. -Author

6. Earthquake location in island arcs

Engdahl, E.R.; Dewey, J.W.; Fujita, K.

1982-01-01

A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high

7. Earthquake Education in Prime Time

NASA Astrophysics Data System (ADS)

de Groot, R.; Abbott, P.; Benthien, M.

2004-12-01

Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

8. Utah FORGE Site Earthquake Animation

SciT

Joe Moore

This is a .kml earthquake animation covering the period 1991-2011 for the Utah FORGE site near Milford. It displays seismic events as bubbles sized according to magnitude. It covers the general Utah FORGE area (large shaded rectangle), with the final site displayed as a smaller polygon along the northwestern margin. Earthquakes are subdivided into clusters, and the time, date, and magnitude of each event are included. Nearby seismic stations are symbolized with triangles. The animation was created by the University of Utah Seismograph Stations (UUSS).

9. Advancing Integrated STEM Learning through Engineering Design: Sixth-Grade Students' Design and Construction of Earthquake Resistant Buildings

ERIC Educational Resources Information Center

English, Lyn D.; King, Donna; Smeed, Joanna

2017-01-01

As part of a 3-year longitudinal study, 136 sixth-grade students completed an engineering-based problem on earthquakes involving integrated STEM learning. Students employed engineering design processes and STEM disciplinary knowledge to plan, sketch, then construct a building designed to withstand earthquake damage, taking into account a number of…

10. VLF/LF Amplitude Perturbations before Tuscany Earthquakes, 2013

NASA Astrophysics Data System (ADS)

Khadka, Balaram; Kandel, Keshav Prasad; Pant, Sudikshya; Bhatta, Karan; Ghimire, Basu Dev

2017-12-01

The US Navy VLF/LF transmitter's NSY signal (45.9 kHz), transmitted from Niscemi, Sicily, Italy, and received at the Kiel Long Wave Monitor, Germany, was analyzed for two months, May and June (the earthquake months) of 2013. Twelve earthquakes of magnitude greater than 4 hit Italy in these two months, of which the 21 June event, with a magnitude of 5.2 and a shallow focal depth of 5 km, was the largest. We studied this earthquake of 21 June 2013, which struck Tuscany, central Italy (44.1713°N, 10.2082°E), at 10:33 UT, and analyzed its effects on the sub-ionospheric VLF/LF signals. In addition, we studied another earthquake, of magnitude 4.9, which hit the same area at 14:40 UT on 30 June and had a shallow focal depth of 10 km. We assessed the data using the terminator time (TT) method and the night-time fluctuation method and found unusual changes in VLF/LF amplitudes and phases. Analysis of trend, night-time dispersion, and night-time fluctuation was also carried out, and several anomalies were detected. Most ionospheric perturbations in these parameters were found in the month of June, from a few days to a few weeks prior to the earthquakes. Moreover, to confirm the anomalies as precursors, we filtered out possible effects of geomagnetic storms, auroras, and solar activity, using the Dst, AE, and Kp indices to analyze geomagnetic effects, and the Bz (sigma) index, sunspot numbers, and the F10.7 solar index to analyze solar activity.

11. Advanced Simulation of Coupled Earthquake and Tsunami Events

NASA Astrophysics Data System (ADS)

Behrens, Joern

2013-04-01

Tsunami-Earthquakes represent natural catastrophes threatening lives and well-being of societies in a solitary and unexpected extreme event as tragically demonstrated in Sumatra (2004), Samoa (2009), Chile (2010), or Japan (2011). Both phenomena are consequences of the complex system of interactions of tectonic stress, fracture mechanics, rock friction, rupture dynamics, fault geometry, ocean bathymetry, and coastline geometry. The ASCETE project forms an interdisciplinary research consortium that couples the most advanced simulation technologies for earthquake rupture dynamics and tsunami propagation to understand the fundamental conditions of tsunami generation. We report on the latest research results in physics-based dynamic rupture and tsunami wave propagation simulation, using unstructured and adaptive meshes with continuous and discontinuous Galerkin discretization approaches. Coupling both simulation tools - the physics-based dynamic rupture simulation and the hydrodynamic tsunami wave propagation - will give us the possibility to conduct highly realistic studies of the interaction of rupture dynamics and tsunami impact characteristics.

12. 4 Earthquake: Major offshore earthquakes recall the Aztec myth


1970-01-01

Long before the sun clears the eastern mountains on April 29, 1970, the savanna highlands of Chiapas tremble from a magnitude 6.7 earthquake centered off the Pacific coast near Mexico's southern border. Then, for a few hours, the Isthmus of Tehuantepec is quiet.

13. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

ERIC Educational Resources Information Center

Egan, Candice J.; Quigley, Mark C.

2015-01-01

The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

14. Earthquake Hazard and Risk in Alaska

NASA Astrophysics Data System (ADS)

Black Porto, N.; Nyst, M.

2014-12-01

Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 in the 2015 model, including the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously, the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent occurrence of magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as much to the overall risk. We will review these recurrence rates and present the results and their impact on Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and the drivers for these changes. Finally, we will examine the impact that model changes have on Alaska earthquake risk. Risk metrics considered include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the
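The trade-off described above, between frequent Gutenberg-Richter events and rare characteristic events, can be illustrated with the cumulative Gutenberg-Richter relation. The a- and b-values below are generic placeholders, not the Alaska model's calibrated parameters.

```python
def gr_annual_rate(m, a=4.0, b=1.0):
    """Cumulative Gutenberg-Richter rate N(M >= m) = 10**(a - b*m) per year;
    a and b here are illustrative values, not fitted to any catalog."""
    return 10.0 ** (a - b * m)

def recurrence_interval_years(m, a=4.0, b=1.0):
    """Mean recurrence interval is the reciprocal of the annual rate."""
    return 1.0 / gr_annual_rate(m, a, b)

# With a=4, b=1, M>=7 events recur roughly every 1,000 years and M>=8
# roughly every 10,000 -- illustrating why the frequent M7-8 events can
# dominate risk even though individual M8+ events are far larger.
```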

15. Understanding intraplate earthquakes in Sweden: the where and why

NASA Astrophysics Data System (ADS)

Lund, Björn; Tryggvason, Ari; Chan, NeXun; Högdahl, Karin; Buhcheva, Darina; Bödvarsson, Reynir

2016-04-01

The Swedish National Seismic Network (SNSN) underwent a rapid expansion and modernization between 2000 and 2010. The number of stations increased from 6 to 65, all broadband or semi-broadband with higher than standard sensitivity and all transmitting data in real time. This has led to a significant increase in the number of detected earthquakes, with the magnitude of completeness being approximately ML 0.5 within the network. During the last 15 years some 7,300 earthquakes have been detected and located, which can be compared to the approximately 1,800 earthquakes in the Swedish catalog from 1375 to 1999. We have used the recent earthquake catalog and various anthropogenic sources (e.g. mine blasts, quarry blasts, and infrastructure construction blasts) to derive low-resolution 3D P- and S-wave velocity models for all of Sweden. Including the blasts provides a more even geographical distribution of sources as well as good constraints on the locations. The resolution of the derived velocity models is in the 20 km range in the well-resolved areas. A fairly robust feature observed in the Vp/Vs ratio of the derived models is a difference between the Paleoproterozoic rocks belonging to the TIB (Transscandinavian Igneous Belt) and the Svecofennian rocks east and north of this region (a Vp/Vs ratio of about 1.72 prevails in the former, compared to a value below 1.70 in the latter) at depths down to 15 km. All earthquakes occurring since 2000 have been relocated in the 3D velocity model. The results show very clear differences in how earthquakes occur in different parts of Sweden. In the north, north of approximately 64°N, most earthquakes occur on or in the vicinity of the Holocene postglacial faults. From 64°N to approximately 60°N, earthquake activity is concentrated along the northeast coastline, with some relation to the offset in the bedrock from the onshore area to the offshore Bay of Bothnia. In southern Sweden earthquake activity is more widely

16. Earthquake forecasts for the CSEP Japan experiment based on the RI algorithm

NASA Astrophysics Data System (ADS)

Nanjo, K. Z.

2011-03-01

An earthquake forecast testing experiment for Japan, the first of its kind, is underway within the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) under a controlled environment. Here we give an overview of the earthquake forecast models, based on the RI algorithm, which we have submitted to the CSEP Japan experiment. Models have been submitted to a total of 9 categories, corresponding to 3 testing classes (3 years, 1 year, and 3 months) and 3 testing regions. The RI algorithm is originally a binary forecast system based on the working assumption that large earthquakes are more likely to occur in the future at locations of higher seismicity in the past. It is based on simple counts of the number of past earthquakes, which is called the Relative Intensity (RI) of seismicity. To improve its forecast performance, we first expand the RI algorithm by introducing spatial smoothing. We then convert the RI representation from a binary system to a CSEP-testable model that produces forecasts for the number of earthquakes of predefined magnitudes. We use information on past seismicity to tune the parameters. The final submittal consists of 36 executable computer codes: 4 variants corresponding to different smoothing parameters for each of the 9 categories. They will help to elucidate which categories and which smoothing parameters are the most meaningful for the RI hypothesis. The main purpose of our participation in the experiment is to better understand the significance of the relative intensity of seismicity for earthquake forecastability in Japan.
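The RI construction summarized above (count past earthquakes per cell, smooth spatially, rescale into expected numbers) can be sketched as follows. This is a toy illustration under assumed choices of grid, uniform smoothing kernel, and normalization, not the submitted forecast codes.

```python
import numpy as np

def ri_forecast(lat, lon, lat_edges, lon_edges, smooth_cells=1, total_expected=100.0):
    """Count past epicenters per cell, smooth with a uniform (2s+1)x(2s+1)
    moving average, and rescale the smoothed relative intensity into
    expected earthquake numbers per cell."""
    counts, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges])
    s = smooth_cells
    padded = np.pad(counts, s, mode="edge")
    smoothed = np.empty_like(counts)
    for i in range(counts.shape[0]):
        for j in range(counts.shape[1]):
            smoothed[i, j] = padded[i:i + 2 * s + 1, j:j + 2 * s + 1].mean()
    return total_expected * smoothed / smoothed.sum()

# Synthetic catalog: 200 epicenters binned onto an 8 x 12 grid of 0.5-degree cells.
rng = np.random.default_rng(0)
lat = rng.uniform(32.0, 36.0, 200)
lon = rng.uniform(-120.0, -114.0, 200)
rates = ri_forecast(lat, lon, np.linspace(32, 36, 9), np.linspace(-120, -114, 13))
```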

17. Testing hypotheses of earthquake occurrence

NASA Astrophysics Data System (ADS)

Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

2003-12-01

We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
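For forecasts stated as rate vectors over bins, the likelihood comparison described above reduces, under the usual independent-Poisson assumption, to comparing joint log-likelihoods. This is a generic sketch of that computation, not the RELM test code.

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of the observed bin counts given a forecast of
    expected earthquake numbers per space-magnitude-time bin, assuming
    independent Poisson counts in each bin."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(forecast_rates, observed_counts))

# Two forecasts defined over the same three bins, scored against one
# observed catalog: the higher joint log-likelihood fits better.
model_a = [0.5, 1.0, 0.1]
model_b = [0.2, 1.5, 0.3]
observed = [1, 1, 0]
a_beats_b = poisson_log_likelihood(model_a, observed) > poisson_log_likelihood(model_b, observed)
```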

18. Source properties of earthquakes near the Salton Sea triggered by the 16 October 1999 M 7.1 Hector Mine, California, earthquake

Hough, S.E.; Kanamori, H.

2002-01-01

We analyze the source properties of a sequence of triggered earthquakes that occurred near the Salton Sea in southern California in the immediate aftermath of the M 7.1 Hector Mine earthquake of 16 October 1999. The sequence produced a number of early events that were not initially located by the regional network, including two moderate earthquakes: the first within 30 sec of the P-wave arrival and a second approximately 10 minutes after the mainshock. We use available amplitude and waveform data from these events to estimate magnitudes to be approximately 4.7 and 4.4, respectively, and to obtain crude estimates of their locations. The sequence of small events following the initial M 4.7 earthquake is clustered and suggestive of a local aftershock sequence. Using both broadband TriNet data and analog data from the Southern California Seismic Network (SCSN), we also investigate the spectral characteristics of the M 4.4 event and other triggered earthquakes using empirical Green's function (EGF) analysis. We find that the source spectra of the events are consistent with expectations for tectonic (brittle shear failure) earthquakes, and infer stress drop values of 0.1 to 6 MPa for six M 2.1 to M 4.4 events. The estimated stress drop values are within the range observed for tectonic earthquakes elsewhere. They are relatively low compared to typically observed stress drop values, which is consistent with expectations for faulting in an extensional, high heat flow regime. The results therefore suggest that, at least in this case, triggered earthquakes are associated with a brittle shear failure mechanism. This further suggests that triggered earthquakes may tend to occur in geothermal-volcanic regions because shear failure occurs at, and can be triggered by, relatively low stresses in extensional regimes.
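Stress drop estimates of the kind reported above are commonly derived from a corner frequency via the Brune (1970) source model. The moment, corner frequency, and shear velocity below are assumed round numbers for illustration, not the authors' measurements.

```python
def brune_stress_drop_pa(m0_nm, fc_hz, beta_ms=3500.0, k=0.37):
    """Brune-model stress drop: source radius r = k * beta / fc (k ~ 0.37
    for the Brune model), then delta_sigma = 7 * M0 / (16 * r**3)."""
    r = k * beta_ms / fc_hz               # source radius in meters
    return 7.0 * m0_nm / (16.0 * r ** 3)  # stress drop in pascals

# An M ~4.4 event (M0 ~ 5e15 N*m) with an assumed 1.5 Hz corner frequency
# gives a stress drop of a few MPa, inside the 0.1-6 MPa range reported.
dsigma_mpa = brune_stress_drop_pa(5e15, 1.5) / 1e6
```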

19. Gambling scores for earthquake predictions and forecasts

NASA Astrophysics Data System (ADS)

Zhuang, Jiancang

2010-04-01

This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster gains if he succeeds, according to a fair rule, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model, or the ETAS model and the reference model is the Poisson model.
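The reputation-point bookkeeping described above can be sketched for the discrete case. The fair-odds payoff used here is one natural choice consistent with the description, not necessarily the paper's exact rule.

```python
def gambling_gain(points_bet, reference_prob, forecast_yes, event_occurred):
    """Reputation-point change for one binary bet at fair odds set by the
    reference model ('the house'). A correct 'yes' on an event the house
    deems unlikely pays out heavily; a wrong call forfeits the stake."""
    if forecast_yes == event_occurred:
        if forecast_yes:
            return points_bet * (1.0 - reference_prob) / reference_prob
        return points_bet * reference_prob / (1.0 - reference_prob)
    return -points_bet

# Betting 1 point on a rare event (reference probability 0.1) and being
# right pays about 9 points, compensating the risk taken; the expected
# gain under the reference model itself is zero, i.e. the rule is fair.
payout = gambling_gain(1.0, 0.1, True, True)
```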

20. @INGVterremoti: Tweeting the Automatic Detection of Earthquakes

NASA Astrophysics Data System (ADS)

Casarotti, E.; Amato, A.; Comunello, F.; Lauciani, V.; Nostro, C.; Polidoro, P.

2014-12-01

The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). After the 2012 seismic sequence, the account was awarded a national prize as the "most useful Twitter account". Currently, it updates more than 110,000 followers (one of the first 50 Italian Twitter accounts by number of followers). Nevertheless, since it provides only the manual revision of seismic parameters, the timing (approximately between 10 and 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites, and Twitter in particular require a more rapid, "real-time" reaction. During the last 18 months, INGV tested the tweeting of the automatic detection of M3+ earthquakes, obtaining results reliable enough to be released openly 1 or 2 minutes after a seismic event. During the summer of 2014, INGV, with the collaboration of CORIS (Department of Communication and Social Research, Sapienza University of Rome), involved the followers of @INGVterremoti and citizens, carrying out a quali-quantitative study (through in-depth interviews and a web survey) in order to evaluate the best format to deliver such information. In this presentation we will illustrate the results of the reliability test and the analysis of the survey.

1. Tweeting Earthquakes using TensorFlow

NASA Astrophysics Data System (ADS)

Casarotti, E.; Comunello, F.; Magnoni, F.

2016-12-01

The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). Currently, it updates more than 150,000 followers. Nevertheless, since it provides only the manual revision of seismic parameters, the timing (approximately between 10 and 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites, and Twitter in particular require a more rapid, "real-time" reaction. During the last 36 months, INGV tested the tweeting of the automatic detection of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e. number of seismic stations, gap, relative error of the location) has been identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open source software library originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization).
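A threshold filter on detection-quality parameters of the kind the abstract mentions might look like the following sketch. The cutoff values are illustrative assumptions, not INGV's operational settings; a learned model would replace these fixed thresholds.

```python
def passes_quality(n_stations, gap_deg, rel_location_error,
                   min_stations=10, max_gap_deg=200.0, max_rel_error=0.3):
    """Keep an automatic detection for immediate release only if enough
    stations recorded it, the station coverage is not too one-sided (gap),
    and the relative location error is small; otherwise hold it for
    manual review."""
    return (n_stations >= min_stations
            and gap_deg <= max_gap_deg
            and rel_location_error <= max_rel_error)

# A well-recorded event is released; a sparse, one-sided detection is not.
ok = passes_quality(25, 120.0, 0.1)
held = passes_quality(6, 270.0, 0.5)
```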

2. Scaling in geology: landforms and earthquakes.

PubMed Central

Turcotte, D L

1995-01-01

Landforms and earthquakes appear to be extremely complex; yet, there is order in the complexity. Both satisfy fractal statistics in a variety of ways. A basic question is whether the fractal behavior is due to scale invariance or is the signature of a broadly applicable class of physical processes. Both landscape evolution and regional seismicity appear to be examples of self-organized critical phenomena. A variety of statistical models have been proposed to model landforms, including diffusion-limited aggregation, self-avoiding percolation, and cellular automata. Many authors have studied the behavior of multiple slider-block models, both in terms of the rupture of a fault to generate an earthquake and in terms of the interactions between faults associated with regional seismicity. The slider-block models exhibit a remarkably rich spectrum of behavior; two slider blocks can exhibit low-order chaotic behavior. Large numbers of slider blocks clearly exhibit self-organized critical behavior. PMID:11607562
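A minimal slider-block-style cellular automaton in the spirit of the models discussed above (a generic Olami-Feder-Christensen-type sketch, not any specific model from the paper) shows how simple local rules generate avalanches of many sizes.

```python
import random

def ofc_step(stress, alpha=0.2, threshold=1.0):
    """One drive-and-relax step: load all blocks until the most-loaded one
    fails, then let failures cascade, each toppled block passing a fraction
    alpha of its stress to each neighbor (1D chain, open and therefore
    dissipative boundaries). Returns the avalanche size (topplings)."""
    n = len(stress)
    k = max(range(n), key=lambda i: stress[i])
    bump = threshold - stress[k]
    stress[:] = [s + bump for s in stress]
    stress[k] = threshold              # guard against floating-point undershoot
    unstable = [i for i, s in enumerate(stress) if s >= threshold]
    avalanche = 0
    while unstable:
        i = unstable.pop()
        if stress[i] < threshold:      # skip stale queue entries
            continue
        avalanche += 1
        s, stress[i] = stress[i], 0.0
        for j in (i - 1, i + 1):
            if 0 <= j < n:
                stress[j] += alpha * s
                if stress[j] >= threshold:
                    unstable.append(j)
    return avalanche

random.seed(0)
blocks = [random.random() for _ in range(100)]
sizes = [ofc_step(blocks) for _ in range(1000)]
# Repeated driving produces avalanches with a broad range of sizes, the
# hallmark of the self-organized criticality discussed in the text.
```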

3. Sterilization and its consequences.

PubMed

Hendrix, N W; Chauhan, S P; Morrison, J C

1999-12-01

The purpose of this review is to analyze critically the two techniques of sterilization (bilateral tubal ligation [BTL] and vasectomy) so that a physician may provide informed consent about methods of sterilization. A MEDLINE search and extensive review of the published literature dating back to 1966 was undertaken to compare preoperative counseling, operative procedures, postoperative complications, procedure-related costs, psychosocial consequences, and feasibility of reversal between BTL and vasectomy. Compared with vasectomy, BTL is 20 times more likely to have major complications, 10 to 37 times more likely to fail, and costs three times as much. Moreover, procedure-related mortality, although rare, is 12 times higher with sterilization of the woman than of the man. Despite these advantages of vasectomy, 300,000 more BTLs than vasectomies were performed in 1987. In 1987, there were 976,000 sterilizations (65 percent BTLs and 35 percent vasectomies) with an overall cost of \$1.8 billion. Over \$260 million could have been saved if equal numbers of vasectomies and BTLs had been performed, or more than \$800 million if 80 percent had been vasectomies, as was the case in 1971. The safest, most efficacious, and least expensive method of sterilization is vasectomy. For these reasons, physicians should recommend vasectomy when providing counseling on sterilization, despite the popularity of BTL. Target audience: Obstetricians & Gynecologists, Family Physicians. After completion of this article, the reader will be able to predict the failure rates and likelihood of successful reversal of tubal ligation and vasectomy, to recall the difference in cost between the two sterilization procedures, and to describe the short-term and long-term complications associated with each of the two methods of sterilization.

4. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

NASA Astrophysics Data System (ADS)

Bergen, K.; Beroza, G. C.

2016-12-01

New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should have mutually dissimilar fingerprints to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.
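A toy version of the fingerprint-and-compare idea follows. It is illustrative only: FAST's actual scheme (Yoon et al. 2015) builds fingerprints from spectral images and wavelet transforms and searches them with locality-sensitive hashing, rather than the simple binarization and bit comparison below.

```python
import numpy as np

def fingerprint(waveform, win=32, hop=16):
    """Compact binary signature for a short waveform: short-time spectral
    magnitudes, binarized by the sign of their frame-to-frame differences
    (amplitude scaling and small distortions mostly preserve these bits)."""
    frames = [waveform[i:i + win] for i in range(0, len(waveform) - win + 1, hop)]
    spec = np.abs(np.fft.rfft(np.array(frames), axis=1))
    return (np.diff(spec, axis=0) > 0).ravel()

def similarity(fp_a, fp_b):
    """Fraction of matching bits (1 - normalized Hamming distance)."""
    return float(np.mean(fp_a == fp_b))

rng = np.random.default_rng(1)
event = rng.standard_normal(256) * np.hanning(256)   # synthetic transient
noisy_repeat = event + 0.1 * rng.standard_normal(256)
unrelated = rng.standard_normal(256)
sim_repeat = similarity(fingerprint(event), fingerprint(noisy_repeat))
sim_noise = similarity(fingerprint(event), fingerprint(unrelated))
# The noisy repeat of the event scores well above an unrelated noise
# window, which is what lets similarity search find repeating earthquakes.
```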

5. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

NASA Astrophysics Data System (ADS)

Wurman, G.; Price, M.

2014-12-01

In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false-positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
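The one-to-two-second warning reported above is consistent with a back-of-the-envelope S-minus-P calculation. The crustal velocities below are assumed typical values, not measurements from this event.

```python
def s_minus_p_seconds(distance_km, vp_km_s=6.0, vs_km_s=3.5):
    """S-minus-P arrival lag for a shallow local event under a straight-ray,
    constant-velocity approximation; this lag bounds the warning that an
    on-site P-wave detector can give before the stronger S shaking."""
    return distance_km / vs_km_s - distance_km / vp_km_s

# At ~16 km epicentral distance the S wave trails the P wave by ~1.9 s,
# consistent with the 1.5-2.5 s of warning reported in Vallejo.
lag = s_minus_p_seconds(16.0)
```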

6. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

NASA Astrophysics Data System (ADS)

Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

2012-12-01

Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
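The idea of forecasting from accumulated small-earthquake counts can be caricatured with a Weibull law in "natural time". The parameters below are arbitrary illustrations, not the NTW model's fitted values.

```python
import math

def large_event_probability(n_small, n_scale=500.0, shape=1.5):
    """Weibull distribution in 'natural time' (the count of small
    earthquakes since the last large one): the probability that the next
    large earthquake has occurred by count n grows as activity accumulates.
    P(n) = 1 - exp(-(n / n_scale)**shape)."""
    return 1.0 - math.exp(-(n_small / n_scale) ** shape)

# Probability rises monotonically with the accumulated small-event count:
probs = [large_event_probability(n) for n in (100, 500, 1000)]
```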

7. Local seismicity preceding the March 14, 1979, Petatlan, Mexico Earthquake (Ms = 7.6)

NASA Astrophysics Data System (ADS)

Hsu, Vindell; Gettrust, Joseph F.; Helsley, Charles E.; Berg, Eduard

1983-05-01

Local seismicity surrounding the epicenter of the March 14, 1979, Petatlan, Mexico earthquake was monitored by a network of portable seismographs of the Hawaii Institute of Geophysics from 6 weeks before to 4 weeks after the main shock. Prior to the main shock, the recorded local seismic activity was shallow and restricted within the continental plate above the Benioff zone. The relocated main shock hypocenter also lay above the Benioff zone, suggesting an initial failure within the continental lithosphere. Four zones can be recognized that showed relatively higher seismic activity than the background. Activity within these zones has followed a number of moderate earthquakes that occurred before or after the initial deployment of the network. Three of these moderate earthquakes were near the Mexican coastline and occurred sequentially from southeast to northwest during the three months before the Petatlan earthquake. The Petatlan event occurred along the northwestern extension of this trend. We infer a possible connection between this observed earthquake migration pattern and the subduction of a fracture zone because the 200-km segment that includes the aftershock zones of the Petatlan earthquake and the three preceding moderate earthquakes matches the intersection of the southeastern limb of the Orozco Fracture Zone and the Middle America Trench. The Petatlan earthquake source region includes the region of the last of the three near-coast seismic activities (zone A). Earthquakes of zone A migrated toward the Petatlan main shock epicenter and were separated from it by an aseismic zone about 10 km wide. We designate this group of earthquakes as the foreshocks of the Petatlan earthquake. These foreshocks occurred within the continental lithosphere and their observed characteristics are interpreted as due to the high-stress environment before the main shock. Pre-main shock seismicity of the Petatlan earthquake source region shows a good correlation with the

8. Potentially induced earthquakes during the early twentieth century in the Los Angeles Basin

Hough, Susan E.; Page, Morgan T.

2016-01-01

Recent studies have presented evidence that early to mid-twentieth-century earthquakes in Oklahoma and Texas were likely induced by fossil fuel production and/or injection of wastewater (Hough and Page, 2015; Frohlich et al., 2016). Considering seismicity from 1935 onward, Hauksson et al. (2015) concluded that there is no evidence for significant induced activity in the greater Los Angeles region between 1935 and the present. To explore a possible association between earthquakes prior to 1935 and oil and gas production, we first revisit the historical catalog and then review contemporary oil industry activities. Although early industry activities did not induce large numbers of earthquakes, we present evidence for an association between the initial oil boom in the greater Los Angeles area and earthquakes between 1915 and 1932, including the damaging 22 June 1920 Inglewood and 8 July 1929 Whittier earthquakes. We further consider whether the 1933 Mw 6.4 Long Beach earthquake might have been induced, and show some evidence that points to a causative relationship between the earthquake and activities in the Huntington Beach oil field. The hypothesis that the Long Beach earthquake was either induced or triggered by a foreshock cannot be ruled out. Our results suggest that significant earthquakes in southern California during the early twentieth century might have been associated with industry practices that are no longer employed (i.e., production without water reinjection), and do not necessarily imply a high likelihood of induced earthquakes at the present time.

9. On the feedback between forearc morphotectonics and megathrust earthquakes in subduction zones

NASA Astrophysics Data System (ADS)

Rosenau, M.; Oncken, O.

2008-12-01

An increasing number of observations suggest an intrinsic relationship between short- and long-term deformation processes in subduction zones. These include the global correlation of megathrust earthquake slip patterns with morphotectonic forearc features, the historical predominance of giant earthquakes (M > 9) along accretionary margins and the occurrence of (slow and shallow) tsunami earthquakes along erosive margins. To gain insight into the interplay between seismogenesis and tectonics in subduction settings we have developed a new modeling technique which joins analog and elastic dislocation approaches. Using elastoplastic wedges overlying a rate- and state-dependent interface, we demonstrate how analog earthquakes drive permanent wedge deformation consistent with the dynamic Coulomb wedge theory and how wedge deformation in turn controls basal "seismicity". During an experimental run, elastoplastic wedges evolve from those comparable to accretionary margins, characterized by plastic wedge shortening, to those mimicking erosive margins, characterized by minor plastic deformation. Permanent shortening localizes at the periphery of the "seismogenic" zone leading to a "morphotectonic" segmentation of the upper plate. Along with the evolving segmentation of the wedge, the magnitude-frequency relationship and recurrence distribution of analog earthquakes develop towards more periodic events of similar size (i.e. characteristic earthquakes). From the experiments we infer a positive feedback between short- and long-term deformation processes which tends to stabilize the spatiotemporal patterns of elastoplastic deformation in subduction settings. We suggest (1) that forearc anatomy reflects the distribution of seismic and aseismic slip at depth, (2) that morphotectonic segmentation assists the occurrence of more characteristic earthquakes, (3) that postseismic near-trench shortening relaxes coseismic compression by megathrust earthquakes and thus reduces

10. Intraslab Earthquakes: Dehydration of the Cascadia Slab

Preston, L.A.; Creager, K.C.; Crosson, R.S.; Brocher, T.M.; Trehu, A.M.

2003-01-01

We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation.

11. Earthquake predictions using seismic velocity ratios

Sherburne, R. W.

1979-01-01

Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses, and the value of long-range earthquake prediction to planning, is obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

12. Earthquakes-Rattling the Earth's Plumbing System

Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

2003-01-01

Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

13. Measuring the size of an earthquake

Spence, W.

1977-01-01

Earthquakes occur in a broad range of sizes. A rock burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Islands earthquake in the Aleutian arc involved a 650-kilometer length of Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths. These are either simply not felt or their felt pattern does not really indicate their true size.

14. Pre-earthquake magnetic pulses

NASA Astrophysics Data System (ADS)

Scoville, J.; Heraud, J.; Freund, F.

2015-08-01

A semiconductor model of rocks is shown to describe unipolar magnetic pulses, a phenomenon that has been observed prior to earthquakes. These pulses are suspected to be generated deep in the Earth's crust, in and around the hypocentral volume, days or even weeks before earthquakes. Their extremely long wavelength allows them to pass through kilometers of rock. Interestingly, when the sources of these pulses are triangulated, the locations coincide with the epicenters of future earthquakes. We couple a drift-diffusion semiconductor model to a magnetic field in order to describe the electromagnetic effects associated with electrical currents flowing within rocks. The resulting system of equations is solved numerically and it is seen that a volume of rock may act as a diode that produces transient currents when it switches bias. These unidirectional currents are expected to produce transient unipolar magnetic pulses similar in form, amplitude, and duration to those observed before earthquakes, and this suggests that the pulses could be the result of geophysical semiconductor processes.

15. Bam, Iran, Radar Interferometry -- Earthquake

2004-06-25

A magnitude 6.5 earthquake devastated the small city of Bam in southeast Iran on December 26, 2003. The two images from ESA Envisat show similar measures of the radar interferometric correlation in grayscale on the left and in false colors on the right.

16. Detecting Earthquakes--Part 2.

ERIC Educational Resources Information Center

Isenberg, C.; And Others

1983-01-01

Basic concepts associated with seismic wave propagation through the earth and the location of seismic events were explained in part 1 (appeared in January 1983 issue). This part focuses on the construction of a student seismometer for detecting earthquakes and underground nuclear explosions anywhere on the earth's surface. (Author/JN)

17. Earthquakes May-June 1980.

Person, W.J.

1981-01-01

The months were seismically active, although the only major earthquake (magnitude 7.0-7.9) occurred on an unpopulated Philippine island. Mexico was struck by a magnitude 6.3 quake on June 9, killing at least two people. The most significant earthquake in the United States was in the Mammoth Lakes area of California. -from Author

18. Cascadia Earthquake and Tsunami Scenario for California's North Coast

NASA Astrophysics Data System (ADS)

Dengler, L.

2006-12-01

In 1995 the California Division of Mines and Geology (now the California Geological Survey) released a planning scenario for an earthquake on the southern portion of the Cascadia subduction zone (CSZ). This scenario was the 8th and last of the Earthquake Planning Scenarios published by CDMG. It was the largest-magnitude CDMG scenario, an 8.4 earthquake rupturing the southern 200 km of the CSZ, and it was the only scenario to include tsunami impacts. This scenario event has not occurred in historic times and depicts impacts far more severe than any recent earthquake. The local tsunami hazard is new; there is no written record of significant local tsunami impact in the region. The north coast scenario received considerable attention in Humboldt and Del Norte Counties and contributed to a number of mitigation efforts. The Redwood Coast Tsunami Work Group (RCTWG), an organization of scientists, emergency managers, government agencies, and businesses from Humboldt, Mendocino, and Del Norte Counties, was formed in 1996 to assist local jurisdictions in understanding the implications of the scenario and to promote a coordinated, consistent mitigation program. The group has produced print and video materials and promoted response and evacuation planning. Since 1997 the RCTWG has sponsored an Earthquake Tsunami Education Room at county fairs featuring preparedness information, hands-on exhibits and regional tsunami hazard maps. Since the development of the TsunamiReady Program in 2001, the RCTWG facilitates community TsunamiReady certification. To assess the effectiveness of mitigation efforts, five telephone surveys between 1993 and 2001 were conducted by the Humboldt Earthquake Education Center. A sixth survey is planned for this fall. Each survey includes between 400 and 600 respondents. Over the nine-year period covered by the surveys, the percentage of respondents with houses secured to foundations increased from 58 to 80 percent, and the percentage aware of a local tsunami hazard increased

19. Frequency-area distribution of earthquake-induced landslides

NASA Astrophysics Data System (ADS)

Tanyas, H.; Allstadt, K.; Westen, C. J. V.

2016-12-01

Discovering the physical explanations behind the power-law distribution of landslides can provide valuable information for quantifying triggered landslide events and, as a consequence, for understanding the relation between landslide causes and impacts in terms of the environmental settings of the affected area. In previous studies, the probability distribution of landslide size was used for this quantification, and the derived parameter was called the landslide magnitude (mL). The frequency-area distributions (FADs) of several landslide inventories were modelled and theoretical curves were established to identify mL for any landslide inventory. In the observed landslide inventories, a divergence from the power-law distribution was recognized for small landslides, referred to as the rollover, and this feature was taken into account in the established model. However, these analyses are based on a relatively limited number of inventories, each with a different triggering mechanism. The existing definition of mL includes some subjectivity, since it is based on a visual comparison between the theoretical curves and the FAD of the medium and large landslides. Additionally, it introduces uncertainty due to the ambiguity in both the physical explanation of the rollover and its functional form. Here we focus on earthquake-induced landslides (EQIL) and aim to provide a rigorous method to estimate the mL and total landslide area of EQIL. We have gathered 36 EQIL inventories from around the globe. Using these inventories, we have evaluated existing explanations of the rollover and proposed an alternative explanation given the new data. Next, we propose a method to define the EQIL FAD curves and mL and to estimate the total landslide area. We use the total landslide areas obtained from the inventories to compare with our estimates and to validate our methodology. The results show that we estimate landslide magnitudes more accurately than previous methods.
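The power-law tail of a landslide FAD can be fit without binning. The sketch below, which is not the authors' method, applies a continuous maximum-likelihood (Hill-type) estimator to a synthetic inventory and computes a landslide-event magnitude as mL = log10 of the inventory size (the Malamud et al. convention); the cutoff `a_min`, the exponent, and the inventory size are illustrative assumptions, and the rollover is deliberately not modelled:

```python
import numpy as np

def powerlaw_exponent_mle(areas, a_min):
    """Continuous maximum-likelihood estimate of the power-law exponent
    beta for landslide areas >= a_min (Hill-type estimator)."""
    tail = np.asarray(areas, dtype=float)
    tail = tail[tail >= a_min]
    beta = 1.0 + tail.size / np.sum(np.log(tail / a_min))
    return beta, tail.size

# Synthetic inventory: a pure Pareto tail with beta = 2.4 above
# a_min = 1000 m^2 (no rollover in this sketch).
rng = np.random.default_rng(0)
a_min = 1.0e3
areas = a_min * rng.random(5000) ** (-1.0 / (2.4 - 1.0))
beta_hat, n = powerlaw_exponent_mle(areas, a_min)
mL = np.log10(n)  # landslide-event magnitude: log10 of inventory size
print(round(beta_hat, 2), round(mL, 2))
```

With 5000 events the estimator recovers the exponent to within a few percent; on real inventories the rollover below `a_min` is what makes the choice of cutoff, and hence mL, delicate.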

20. Earthquake precursors: activation or quiescence?

NASA Astrophysics Data System (ADS)

Rundle, John B.; Holliday, James R.; Yoder, Mark; Sachs, Michael K.; Donnellan, Andrea; Turcotte, Donald L.; Tiampo, Kristy F.; Klein, William; Kellogg, Louise H.

2011-10-01

We discuss the long-standing question of whether the probability for large earthquake occurrence (magnitudes m > 6.0) is highest during time periods of smaller event activation, or highest during time periods of smaller event quiescence. The physics of the activation model is based on an idea from the theory of nucleation, that a small-magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model is based on the idea that the occurrence of smaller earthquakes (here considered as magnitudes m > 3.5) may be due to a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, either via aftershock nucleation or triggering, but expresses no choice as to which mechanism is preferred. Both of these models are in fact a means of filtering the seismicity time-series to compute probabilities. Using 25 yr of data from the California-Nevada catalogue of earthquakes, we show that of the two models, activation and quiescence, the latter appears to be the better model, as judged by backtesting (by a slight but not significant margin). We then examine simulation data from a topologically realistic earthquake model for California seismicity, Virtual California. This model includes not only earthquakes produced from increases in stress on the fault system, but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion. Here, the activation forecast model is preferred to the quiescence model, presumably due to the fact that the BASS component of the model is essentially a model for activated seismicity. These

1. Predecessors of the giant 1960 Chile earthquake

Cisternas, M.; Atwater, B.F.; Torrejon, F.; Sawai, Y.; Machuca, G.; Lagos, M.; Eipert, A.; Youlton, C.; Salgado, I.; Kamataki, T.; Shishikura, M.; Rajendran, C.P.; Malik, J.K.; Rizal, Y.; Husni, M.

2005-01-01

It is commonly thought that the longer the time since last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended. © 2005 Nature Publishing Group.

2. Predecessors of the giant 1960 Chile earthquake.

PubMed

Cisternas, Marco; Atwater, Brian F; Torrejón, Fernando; Sawai, Yuki; Machuca, Gonzalo; Lagos, Marcelo; Eipert, Annaliese; Youlton, Cristián; Salgado, Ignacio; Kamataki, Takanobu; Shishikura, Masanobu; Rajendran, C P; Malik, Javed K; Rizal, Yan; Husni, Muhammad

2005-09-15

It is commonly thought that the longer the time since last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended.

3. A note on evaluating VAN earthquake predictions

NASA Astrophysics Data System (ADS)

Tselentis, G.-Akis; Melis, Nicos S.

The evaluation of the success level of an earthquake prediction method should not be based on approaches that apply generalized strict statistical laws and avoid the specific nature of the earthquake phenomenon. Fault rupture processes cannot be compared to gambling processes. The outcome of the present note is that even an ideal earthquake prediction method is still shown to be a matter of a “chancy” association between precursors and earthquakes if we apply the same procedure proposed by Mulargia and Gasperini [1992] in evaluating VAN earthquake predictions. Each individual VAN prediction has to be evaluated separately, always taking into account the specific circumstances and information available. The success level of epicenter prediction should depend on the earthquake magnitude, and magnitude and time predictions may depend on earthquake clustering and the tectonic regime, respectively.

4. Earthquake sources near Uturuncu Volcano

NASA Astrophysics Data System (ADS)

Keyson, L.; West, M. E.

2013-12-01

Uturuncu, located in southern Bolivia near the Chile and Argentina border, is a dacitic volcano that was last active 270 ka. It is part of the Altiplano-Puna Volcanic Complex, which spans 50,000 km2 and comprises a series of ignimbrite flare-ups since ~23 Ma. Two sets of evidence suggest that the region is underlain by a significant magma body. First, seismic velocities show a low-velocity layer consistent with a magmatic sill below depths of 15-20 km. This inference is corroborated by high electrical conductivity between 10 km and 30 km. This magma body, the so-called Altiplano-Puna Magma Body (APMB), is the likely source of volcanic activity in the region. InSAR studies show that during the 1990s, the volcano experienced an average uplift of about 1 to 2 cm per year. The deformation is consistent with an expanding source at depth. Though the Uturuncu region exhibits high rates of crustal seismicity, any connection between the inflation and the seismicity is unclear. We investigate the root causes of these earthquakes using a temporary network of 33 seismic stations - part of the PLUTONS project. Our primary approach is based on hypocenter locations and magnitudes paired with correlation-based relative relocation techniques. We find a strong tendency toward earthquake swarms that cluster in space and time. These swarms often last a few days and consist of numerous earthquakes with similar source mechanisms. Most seismicity occurs in the top 10 kilometers of the crust and is characterized by well-defined phase arrivals and significant high-frequency content. The frequency-magnitude relationship of this seismicity demonstrates b-values consistent with tectonic sources. There is a strong clustering of earthquakes around the Uturuncu edifice. Earthquakes elsewhere in the region align in bands striking northwest-southeast consistent with regional stresses.
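A b-value comparison of the kind mentioned above is conventionally made with Aki's maximum-likelihood estimator. A minimal sketch on a synthetic Gutenberg-Richter catalogue follows; the completeness magnitude, catalogue size, and true b are illustrative assumptions, not values from this study:

```python
import numpy as np

def b_value_mle(mags, m_c, dm=0.1):
    """Maximum-likelihood b-value (Aki 1965) with the standard half-bin
    correction for magnitudes binned at width dm; pass dm=0 for
    continuous magnitudes."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalogue with b = 1.0 above Mc = 1.5;
# the G-R law implies exponentially distributed magnitudes above Mc.
rng = np.random.default_rng(42)
b_true, m_c = 1.0, 1.5
mags = m_c + rng.exponential(scale=np.log10(np.e) / b_true, size=20000)
b_hat = b_value_mle(mags, m_c, dm=0.0)
print(round(b_hat, 2))  # close to 1.0
```

Tectonic seismicity typically yields b near 1, whereas b markedly above 1 is often read as a sign of fluid- or magma-driven swarms, which is why the estimate carries diagnostic weight here.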

5. Earthquake-triggered landslides along the Hyblean-Malta Escarpment (off Augusta, eastern Sicily, Italy) - assessment of the related tsunamigenic potential

NASA Astrophysics Data System (ADS)

Ausilia Paparo, Maria; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Tinti, Stefano

2017-02-01

Eastern Sicily is affected by earthquakes and tsunamis of local and remote origin, as is known through numerous historical chronicles. Recent studies have put emphasis on the role of submarine landslides as the direct cause of the main local tsunamis, envisaging that the earthquakes of 1693 and 1908 did produce a tsunami, but also that they triggered mass failures that were able to generate an even larger tsunami. The debate is still open, and though no general consensus has been reached among scientists so far, this research has had the merit of attracting attention to the possible generation of tsunamis by landslides off Sicily. In this paper we investigate the tsunami potential of mass failures along one sector of the Hyblean-Malta Escarpment (HME) facing Augusta. The HME is the main offshore geological structure of the region, running almost parallel to the coast off eastern Sicily. Here, bottom morphology and slope steepness favour soil failures. In our work we study slope stability under seismic load along a number of HME transects by using the Minimum Lithostatic Deviation (MLD) method, which is based on limit-equilibrium theory. The main goal is to identify sectors of the HME that could be unstable under the effect of realistic earthquakes. We estimate the possible landslide volume and use it as input for numerical codes to simulate the landslide motion and the consequent tsunami. This is an important step for the assessment of tsunami hazard in eastern Sicily and for local tsunami mitigation policies. It is also important in view of tsunami warning systems, since it can help to identify the minimum earthquake magnitude capable of triggering destructive landslide-induced tsunamis, and therefore to set up appropriate knowledge-based criteria for alerting the population.
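The MLD method itself is not reproduced here; as a stand-in, the textbook pseudo-static infinite-slope analysis below shows the limit-equilibrium idea of "slope stability under seismic load": a horizontal seismic coefficient k adds driving shear stress and reduces normal stress, lowering the factor of safety. All soil and slope parameters are illustrative assumptions:

```python
import math

def pseudo_static_fs(c, phi_deg, gamma, h, beta_deg, k):
    """Pseudo-static factor of safety for a dry infinite slope
    (textbook limit-equilibrium sketch, not the MLD method).
    c [kPa] cohesion, phi [deg] friction angle, gamma [kN/m^3] unit
    weight, h [m] failure-plane depth, beta [deg] slope angle,
    k horizontal seismic coefficient (peak acceleration / g)."""
    b = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # Normal and shear stress on the slip plane, including the
    # horizontal pseudo-static body force k * gamma * h.
    sigma_n = gamma * h * (math.cos(b) ** 2 - k * math.sin(b) * math.cos(b))
    tau = gamma * h * (math.sin(b) * math.cos(b) + k * math.cos(b) ** 2)
    return (c + sigma_n * math.tan(phi)) / tau

fs_static = pseudo_static_fs(10, 30, 19, 5, 25, 0.0)   # no shaking
fs_seismic = pseudo_static_fs(10, 30, 25 and 19, 5, 25, 0.15) if False else \
             pseudo_static_fs(10, 30, 19, 5, 25, 0.15)  # with k = 0.15
print(round(fs_static, 2), round(fs_seismic, 2))
```

The seismic load pushes the factor of safety toward 1 (incipient failure), which is the logic behind scanning transects for the minimum triggering magnitude.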

6. Earthquake prognosis: causes of failure and ways to solve the problem

NASA Astrophysics Data System (ADS)

Kondratiev, O.

2003-04-01

Despite more than 50 years of development, the problem of earthquake prognosis remains unresolved. This casts doubt on the correctness of the chosen approach: the retrospective search for diverse earthquake precursors. It is customary to speak of long-term, middle-term and short-term earthquake prognosis. All of these have a probabilistic character, and it would be more correct to regard them as forms of seismic hazard prognosis. In distinction from them, this report discusses the problem of operative prognosis. An operative prognosis should provide a timely seismic alarm giving the place, time and size of an earthquake, so that the necessary measures can be taken to minimize the catastrophic consequences of the event. This requires predicting the earthquake's location to within a few tens of kilometres, its time of occurrence to within a few days, and its size to within a magnitude unit. Formulated in this way, the problem cannot in principle be solved within the framework of indirect earthquake precursors. It is necessary to pass from the concept of a passive observatory network to the concept of an object-oriented search for potential source zones, obtaining direct information on changes in medium parameters within these zones as an earthquake prepares and develops. Formulated in this way, the problem becomes an integrated task for planetary and exploration geophysics. To detect the source zones, the method of converted waves of earthquakes can be used; for monitoring, seismic reflection and common-depth-point methods. Deployment of these and other geophysical methods should be provided by organizing a special integrated geophysical expedition for rapid response to strong earthquakes and for conducting purposeful investigation

7. Observing Triggered Earthquakes Across Iran with Calibrated Earthquake Locations

NASA Astrophysics Data System (ADS)

Karasozen, E.; Bergman, E.; Ghods, A.; Nissen, E.

2016-12-01

We investigate earthquake triggering phenomena in Iran by analyzing patterns of aftershock activity around mapped surface ruptures. Iran has an intense level of seismicity (> 40,000 events listed in the ISC Bulletin since 1960) due to it accommodating a significant portion of the continental collision between Arabia and Eurasia. There are nearly thirty mapped surface ruptures associated with earthquakes of M 6-7.5, mostly in eastern and northwestern Iran, offering a rich potential to study the kinematics of earthquake nucleation, rupture propagation, and subsequent triggering. However, catalog earthquake locations are subject to up to 50 km of location bias from the combination of unknown Earth structure and unbalanced station coverage, making it challenging to assess both the rupture directivity of larger events and the spatial patterns of their aftershocks. To overcome this limitation, we developed a new two-tiered multiple-event relocation approach to obtain hypocentral parameters that are minimally biased and have realistic uncertainties. In the first stage, locations of small clusters of well-recorded earthquakes at local spatial scales (100s of events across 100 km length scales) are calibrated either by using near-source arrival times or independent location constraints (e.g. local aftershock studies, InSAR solutions), using an implementation of the Hypocentroidal Decomposition relocation technique called MLOC. Epicentral uncertainties are typically less than 5 km. Then, these events are used as prior constraints in the code BayesLoc, a Bayesian relocation technique that can handle larger datasets, to yield region-wide calibrated hypocenters (1000s of events over 1000 km length scales). With locations and errors both calibrated, the pattern of aftershock activity can reveal the type of the earthquake triggering: dynamic stress changes promote an increase in the seismicity rate in the direction of unilateral propagation, whereas static stress changes should

8. Application of Earthquake Subspace Detectors at Kilauea and Mauna Loa Volcanoes, Hawai`i

NASA Astrophysics Data System (ADS)

Okubo, P.; Benz, H.; Yeck, W.

2016-12-01

Recent studies have demonstrated the capabilities of earthquake subspace detectors for detailed cataloging and tracking of seismicity in a number of regions and settings. We are exploring the application of subspace detectors at the United States Geological Survey's Hawaiian Volcano Observatory (HVO) to analyze seismicity at Kilauea and Mauna Loa volcanoes. Elevated levels of microseismicity and occasional swarms of earthquakes associated with active volcanism here present cataloging challenges due to the sheer numbers of earthquakes and an intrinsically low signal-to-noise environment featuring oceanic microseism and volcanic tremor in the ambient seismic background. With high-quality continuous recording of seismic data at HVO, we apply subspace detectors (Harris and Dodge, 2011, Bull. Seismol. Soc. Am., doi: 10.1785/0120100103) during intervals of noteworthy seismicity. Waveform templates are drawn from Magnitude 2 and larger earthquakes within clusters of earthquakes cataloged in the HVO seismic database. At Kilauea, we focus on seismic swarms in the summit caldera region where, despite continuing eruptions from vents in the summit region and in the east rift zone, geodetic measurements reflect a relatively inflated volcanic state. We also focus on seismicity beneath and adjacent to Mauna Loa's summit caldera that appears to be associated with geodetic expressions of gradual volcanic inflation, and where precursory seismicity clustered prior to both Mauna Loa's most recent eruptions in 1975 and 1984. We recover several times more earthquakes with the subspace detectors - down to roughly 2 magnitude units below the templates, based on relative amplitudes - compared to the numbers of cataloged earthquakes. The increased numbers of detected earthquakes in these clusters, and the ability to associate and locate them, allow us to infer details of the spatial and temporal distributions and possible variations in stresses within these key regions of the volcanoes.

9. The August 2011 Virginia and Colorado Earthquake Sequences: Does Stress Drop Depend on Strain Rate?

NASA Astrophysics Data System (ADS)

Abercrombie, R. E.; Viegas, G.

2011-12-01

Our preliminary analysis of the August 2011 Virginia earthquake sequence finds the earthquakes to have high stress drops, similar to those of recent earthquakes in the NE USA, while those of the August 2011 Trinidad, Colorado, earthquakes are moderate, in between values typical of interplate earthquakes (California) and those of the east coast. These earthquakes provide an unprecedented opportunity to study such source differences in detail, and hence improve our estimates of seismic hazard. Previously, the lack of well-recorded earthquakes in the eastern USA severely limited our resolution of the source processes and hence the expected ground accelerations. Our preliminary findings are consistent with the idea that earthquake faults strengthen during longer recurrence times and intraplate faults fail at higher stress (and produce higher ground accelerations) than their interplate counterparts. We use the empirical Green's function (EGF) method to calculate source parameters for the Virginia mainshock and three larger aftershocks, and for the Trinidad mainshock and two larger foreshocks using IRIS-available stations. We select time windows around the direct P and S waves at the closest stations and calculate spectral ratios and source time functions using the multi-taper spectral approach (eg. Viegas et al., JGR 2010). Our preliminary results show that the Virginia sequence has high stress drops (~100-200 MPa, using Madariaga (1976) model), and the Colorado sequence has moderate stress drops (~20 MPa). These numbers are consistent with previous work in the regions, for example the Au Sable Forks (2002) earthquake, and the 2010 Germantown (MD) earthquake. We also calculate the radiated seismic energy and find the energy/moment ratio to be high for the Virginia earthquakes, and moderate for the Colorado sequence. We observe no evidence of a breakdown in constant stress drop scaling in this limited number of earthquakes. We extend our analysis to a larger number of earthquakes and stations
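Once a corner frequency has been measured from spectral ratios, the stress drop follows from a simple source model. The sketch below uses the Brune model for transparency; the study itself uses the Madariaga (1976) model, whose smaller source radii give larger stress drops for the same corner frequency. The example moment and corner frequency are illustrative, not values from the sequence:

```python
import math

def brune_stress_drop(m0_nm, fc_hz, beta_ms=3500.0):
    """Stress drop under the Brune source model:
    r = 2.34 * beta / (2 * pi * fc),  dsigma = 7 * M0 / (16 * r^3).
    M0 in N·m, fc in Hz, shear velocity beta in m/s; returns MPa."""
    r = 2.34 * beta_ms / (2.0 * math.pi * fc_hz)   # source radius, m
    return 7.0 * m0_nm / (16.0 * r ** 3) / 1.0e6   # Pa -> MPa

# Illustrative inputs: M0 = 6e17 N·m (roughly Mw 5.8) with fc = 0.5 Hz
ds = brune_stress_drop(6.0e17, 0.5)
print(round(ds, 1))
```

The cube of the source radius in the denominator is why modest differences in measured fc, or in the model constant relating fc to r, translate into the large interplate-versus-intraplate stress-drop contrasts discussed above.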

10. Numerical Simulation of Stress evolution and earthquake sequence of the Tibetan Plateau

NASA Astrophysics Data System (ADS)

Dong, Peiyu; Hu, Caibo; Shi, Yaolin

2015-04-01

lower than a certain value. For locations where large earthquakes occurred during the 110 years, the initial stresses can be inverted if the strength is estimated and the tectonic loading is assumed constant. Therefore, although the initial stress state is unknown, we can estimate a range for it. In this study, we estimated a reasonable range of initial stress and then, based on the Coulomb-Mohr criterion, regenerated the earthquake sequence starting from the Daofu earthquake of 1904. We calculated the stress-field evolution of the sequence, considering both the tectonic loading and the interaction between the earthquakes. Ultimately we obtained a sketch of the present stress. Of course, a single model with a given initial stress is just one possible model, so a potential seismic-hazard distribution based on a single model is not convincing. We ran tests on hundreds of possible initial stress states, all of which reproduce the historical earthquake sequence, and summarized the resulting probabilities of future seismic activity. Although we cannot specify the exact future state, we can narrow the estimate of the regions at high probability of risk. Our primary results indicate that the Xianshuihe fault and adjacent area is one such zone, with higher risk than other regions in the future. During 2014, six earthquakes (M > 5.0) occurred in this region, which corresponds with our result to some degree. We emphasize the importance of the initial stress field for the earthquake sequence and provide a probabilistic assessment of future seismic hazards. This study may bring new insights into estimating the initial stress, earthquake triggering, and the evolution of the stress field.
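The Coulomb-Mohr trigger with constant tectonic loading can be reduced to a toy calculation that makes the role of the unknown initial stress explicit: the further below failure a fault starts, the longer it takes to rupture. All stress values, the friction coefficient, and the loading rate below are illustrative assumptions, not the paper's model:

```python
def coulomb_failure_stress(tau, sigma_n, mu=0.6, cohesion=0.0):
    """Coulomb failure function CFS = tau - mu*sigma_n - cohesion
    (stresses in MPa, compression positive); failure when CFS >= 0."""
    return tau - mu * sigma_n - cohesion

def years_to_failure(tau0, sigma_n, loading_rate, mu=0.6, cohesion=0.0):
    """Years until CFS reaches zero if the shear stress grows linearly
    at loading_rate (MPa/yr) from the assumed initial value tau0."""
    deficit = mu * sigma_n + cohesion - tau0
    return max(deficit, 0.0) / loading_rate

# A fault starting 5 MPa below failure, loaded at 0.005 MPa/yr:
cfs0 = coulomb_failure_stress(55.0, 100.0)      # -5.0 MPa: stable now
t_fail = years_to_failure(55.0, 100.0, 0.005)   # 1000.0 years to rupture
print(cfs0, t_fail)
```

Sweeping `tau0` over an admissible range and checking which choices reproduce a historical event sequence is, in miniature, the ensemble logic the study applies to hundreds of initial stress states.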

11. Liquefaction-induced lateral spreading in Oceano, California, during the 2003 San Simeon Earthquake

Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.; Di Alessandro, Carola; Boatwright, John; Tinsley, John C.; Sell, Russell W.; Rosenberg, Lewis I.

2004-01-01

the 2003 damage was caused by lateral spreading in two separate areas, one near Norswing Drive and the other near Juanita Avenue. These coincided with the areas of highest liquefaction potential found in Oceano. Areas with site-amplification conditions similar to those in Oceano are particularly vulnerable to earthquakes. Site amplification may cause shaking from distant earthquakes, which normally would not cause damage, to increase locally to damaging levels. The vulnerability in Oceano is compounded by the widespread distribution of highly liquefiable soils that will reliquefy when ground shaking is amplified as it was during the San Simeon earthquake. The experience in Oceano can be expected to repeat because the region has many active faults capable of generating large earthquakes. In addition, liquefaction and lateral spreading will be more extensive for moderate-size earthquakes that are closer to Oceano than was the 2003 San Simeon earthquake. Site amplification and liquefaction can be mitigated. Shaking is typically mitigated in California by adopting and enforcing up-to-date building codes. Although not a guarantee of safety, application of these codes ensures that best practice is used in construction. Building codes, however, do not always require the upgrading of older structures to new code requirements. Consequently, many older structures may not be as resistant to earthquake shaking as new ones, and retrofitting is required to bring them up to code. Seismic provisions in codes also generally do not apply to nonstructural elements such as drywall, heating systems, and shelving; frequently, nonstructural damage dominates the earthquake loss. Mitigation of potential liquefaction in Oceano presently is voluntary for existing buildings, but required by San Luis Obispo County for new construction. Multiple mitigation procedures are available to individual property owners. These procedures typically involve either

12. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

Coordinated by Bakun, William H.; Prescott, William H.

1993-01-01

Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were: * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, indicating that a potential for another damaging earthquake in the Santa Cruz Mountains in the near future may still exist. * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults. * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake. * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it. * Geological and geophysical mapping indicates that, although the San Andreas Fault can be mapped as a through-going fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

13. The livelihoods of Haitian health-care providers after the January 2010 earthquake: a pilot study of the economic and quality-of-life impact of emergency relief.

PubMed

Haar, Rohini J; Naderi, Sassan; Acerra, John R; Mathias, Maxwell; Alagappan, Kumar

2012-03-02

An effective international response to a disaster requires cooperation and coordination with the existing infrastructure. In some cases, however, international relief efforts can compete with the local work force and affect the balance of health-care systems already in place. This study seeks to evaluate the impact of the international humanitarian response to the 12 January 2010 earthquake on Haitian health-care providers (HHP). Fifty-nine HHPs were surveyed in August 2010 using a modified World Health Organization Quality of Life-Brief questionnaire (WHOQoL-B) that included questions on respondents' workload before the earthquake, immediately after, and presently. The study population consisted of physicians, nurses, and technicians at public hospitals, non-governmental organization (NGO) clinics, and private offices in Port-au-Prince, Haiti. Following the earthquake, public hospital and NGO providers reported a significant increase in their workload (15 of 17 and 22 of 26 respondents, respectively). Conversely, 12 of 16 private providers reported a significant decrease in workload (p < 0.0001). Although all groups reported working a similar number of hours prior to the earthquake (average 40 h/week), they reported working significantly different amounts following the earthquake: public hospital and NGO providers averaged more than 50 h/week, and private providers averaged just over 33 h/week (p < 0.001). Health-care providers working at public hospitals and NGOs, however, had significantly lower scores on the WHOQoL-B when answering questions about their environment (p < 0.001), and in open-ended responses often commented on the lack of potable water and poor access to toilets. Providers from all groups expressed dissatisfaction with the scope and quality of care provided at public hospitals and NGO clinics, as well as disappointment with the reduction in patient volume at private practices. The emergency medical response to the January 2010

14. Rapid estimate of earthquake source duration: application to tsunami warning.

NASA Astrophysics Data System (ADS)

Reymond, Dominique; Jamelot, Anthony; Hyvernaud, Olivier

2016-04-01

We present a method for estimating the source duration of the fault rupture, based on the high-frequency envelope of teleseismic P waves and inspired by the original work of Ni et al. (2005). The main interest of this seismic parameter is in detecting the abnormally slow ruptures that are characteristic of so-called 'tsunami earthquakes' (Kanamori, 1972). The source durations estimated by this method are validated against two independent methods: the duration obtained by W-phase inversion (Kanamori and Rivera, 2008; Duputel et al., 2012) and the duration calculated by the SCARDEC process, which determines the source time function (Vallée et al., 2011). The estimated source duration is also compared with the slowness discriminant defined by Newman and Okal (1998), which is calculated routinely for all earthquakes detected by our tsunami warning process (named PDFM2, Preliminary Determination of Focal Mechanism; Clément and Reymond, 2014). From the point of view of operational tsunami warning, numerical simulations of tsunamis depend strongly on the source estimate: the better the source estimate, the better the tsunami forecast. The source duration is not injected directly into the numerical tsunami simulations, because the kinematics of the source is presently ignored (Jamelot and Reymond, 2015). But in the case of a tsunami earthquake occurring in the shallower part of a subduction zone, we must consider a source in a medium of low rigidity modulus; consequently, for a given seismic moment, the source dimensions decrease while the slip increases, like a 'compact' source (Okal and Hébert, 2007). Conversely, a rapid 'snappy' earthquake with poor tsunami excitation power will be characterized by a higher rigidity modulus and will produce weaker displacement and smaller source dimensions than a 'normal' earthquake. References: Clément, J
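The slowness discriminant mentioned above, Theta = log10(E/M0), compares radiated energy E with seismic moment M0. A minimal sketch, assuming the commonly cited reference levels (ordinary earthquakes cluster near Theta ~ -4.9, and values near -5.5 or below flag energy-deficient, tsunami-earthquake-like ruptures; the cutoff and sample values below are illustrative):

```python
import math

# Sketch: the Newman & Okal (1998) slowness discriminant
# Theta = log10(E / M0). Reference levels and sample values are
# illustrative, not taken from the PDFM2 system.

def theta(energy, moment):
    """Slowness discriminant; energy and moment both in N*m (~J)."""
    return math.log10(energy / moment)

def is_slow(energy, moment, cutoff=-5.5):
    """Flag an energy-deficient ('slow') rupture, assumed cutoff."""
    return theta(energy, moment) <= cutoff

m0 = 4.0e20                  # moment of a large (Mw ~ 7.7) event
print(theta(5.0e15, m0))     # ~ -4.9: ordinary rupture speed
print(is_slow(1.0e15, m0))   # True: energy-deficient, 'slow' rupture
```

Because both E and M0 are measured teleseismically within minutes, the discriminant is well suited to the real-time warning context the abstract describes.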

15. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

NASA Astrophysics Data System (ADS)

Jordan, Thomas H.

2013-04-01

Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should
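The "low-probability environment" point is simple arithmetic: even a large probability gain multiplies a tiny baseline. A worked example with assumed numbers (the baseline rate and gain below are illustrative, not from any specific forecast):

```python
# Sketch: why large probability gains still leave small absolute
# probabilities. Baseline rate and gain are illustrative values.

baseline_weekly_p = 1e-5   # assumed long-term weekly probability of a large event
gain = 100                 # assumed short-term probability gain during clustering

short_term_p = baseline_weekly_p * gain
print(short_term_p)        # ~0.001: still only ~0.1% for the week
```

This is why, as the abstract argues, forecast value hinges on how users act on small but elevated probabilities, not on the probabilities alone.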

16. Development a heuristic method to locate and allocate the medical centers to minimize the earthquake relief operation time.

PubMed

Aghamohammadi, Hossein; Saadi Mesgari, Mohammad; Molaei, Damoon; Aghamohammadi, Hasan

2013-01-01

Location-allocation is a combinatorial optimization problem and is NP-hard (non-deterministic polynomial-time hard); because of this complexity, solution methods must shift from exact to heuristic or metaheuristic approaches. Locating medical centers and allocating the injured of an earthquake to them is highly important in earthquake disaster management, so a proper method will reduce the time of the relief operation and consequently decrease the number of fatalities. This paper presents the development of a heuristic method based on two nested genetic algorithms to optimize this location-allocation problem using the capabilities of a Geographic Information System (GIS). In the proposed method, the outer genetic algorithm is applied to the location part of the problem and the inner genetic algorithm is used to optimize the resource allocation. The final outcome of the implemented method i
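The nested structure can be illustrated with a deliberately simplified stand-in: an outer search over which candidate sites to open, and an inner step that allocates casualty clusters to the opened centers. The paper uses two nested genetic algorithms inside a GIS; here the outer loop is exhaustive and the inner allocation is greedy nearest-center assignment, purely to show the decomposition (all data below are toy values):

```python
import itertools
import math

# Highly simplified sketch of nested location-allocation: the paper's
# outer/inner genetic algorithms are replaced by exhaustive search and
# greedy allocation. All coordinates and weights are toy values.

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def inner_allocate(centers, demands):
    """Greedy inner step: each casualty cluster goes to its nearest center;
    returns total injured-weighted travel distance."""
    return sum(w * min(dist(d, c) for c in centers) for d, w in demands)

def outer_locate(candidates, demands, k):
    """Outer step: choose the k sites minimizing the inner objective."""
    return min(itertools.combinations(candidates, k),
               key=lambda centers: inner_allocate(centers, demands))

candidates = [(0, 0), (5, 0), (10, 0), (5, 5)]        # candidate center sites
demands = [((1, 0), 30), ((9, 1), 20), ((5, 4), 10)]  # (cluster location, injured)
best = outer_locate(candidates, demands, k=2)
print(best)  # -> ((0, 0), (10, 0)) for these toy data
```

In the real problem both levels are too large to enumerate, which is exactly why the paper nests a genetic algorithm inside another: the outer GA explores site subsets while the inner GA optimizes the allocation for each one.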